Acta Univ. Agric. Silvic. Mendelianae Brun. 2012, 60(2), 175-180 | DOI: 10.11118/actaun201260020175
Usage of Microsoft Kinect for augmented prototyping speed-up
- Department of Informatics, Mendel University in Brno, 613 00 Brno, Czech Republic
A physical model is a common tool for testing product features during the design process. This model is usually made of clay or plastic because these materials are easy to modify, so the designer can readily adjust the model shape to enhance the look or ergonomics of the product. Nowadays, some companies use augmented reality to enhance their design process; this concept is called augmented prototyping. A common approach uses artificial markers to augment the product prototype with digital 3D models. The 3D models shown at the marker positions can represent, for example, car spare parts such as different lights, wheels, or a spoiler. This allows the designer to interactively change the look of the physical model.
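As an illustration of the marker-based approach, the following minimal C++ sketch detects a fiducial marker in a camera frame and recovers its 6-DOF pose, which is the place where a virtual spare part would be rendered. The paper does not name its marker library; OpenCV's contrib ArUco module (pre-4.7 API) is used here as a stand-in, and the camera intrinsics, marker size, and marker dictionary are assumed values.

```cpp
// Hypothetical sketch of marker-based augmented prototyping:
// find a marker, estimate its pose, and draw the pose axes where
// a virtual part (light, wheel, spoiler) would be rendered.
#include <opencv2/opencv.hpp>
#include <opencv2/aruco.hpp>

int main() {
    cv::VideoCapture cap(0);                       // camera viewing the clay model
    if (!cap.isOpened()) return 1;

    // Assumed intrinsics; in practice these come from camera calibration.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 600, 0, 320,
                                           0, 600, 240,
                                           0,   0,   1);
    cv::Mat dist = cv::Mat::zeros(5, 1, CV_64F);

    cv::Ptr<cv::aruco::Dictionary> dict =
        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_4X4_50);

    cv::Mat frame;
    while (cap.read(frame)) {
        std::vector<int> ids;
        std::vector<std::vector<cv::Point2f>> corners;
        cv::aruco::detectMarkers(frame, dict, corners, ids);

        if (!ids.empty()) {
            // Assumed 5 cm marker side; each rvec/tvec is the pose at
            // which a digital 3D model would be overlaid.
            std::vector<cv::Vec3d> rvecs, tvecs;
            cv::aruco::estimatePoseSingleMarkers(corners, 0.05f, K, dist,
                                                 rvecs, tvecs);
            for (size_t i = 0; i < ids.size(); ++i)
                cv::aruco::drawAxis(frame, K, dist, rvecs[i], tvecs[i], 0.03f);
        }
        cv::imshow("augmented prototype", frame);
        if (cv::waitKey(1) == 27) break;           // Esc quits
    }
    return 0;
}
```

In a full augmented prototyping pipeline, the pose returned for each marker would drive the rendering of the corresponding 3D model instead of the debug axes drawn here.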
Further, it is also necessary to transfer physical adjustments made on the model surface back to the digital model in the computer. A well-known tool for this purpose is a professional 3D scanner; nevertheless, the cost of such a scanner is substantial. Therefore, we focused on a different solution: the Microsoft Kinect, a motion-capture device designed for computer games. This article outlines a new augmented prototyping approach that directly updates the digital model during the design process using the Kinect depth camera. This solution is a cost-effective alternative to professional 3D scanners. Our article describes in particular how depth data can be obtained from the Kinect and provides an evaluation of its depth measurement precision.
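For concreteness, the following hypothetical C++ sketch shows one common way depth data was read from the Kinect in this period, using the open-source libfreenect driver (OpenNI was another option; the paper does not prescribe either). The raw-to-metres conversion uses Nicolas Burrus' widely circulated approximation and is an assumption, not the paper's own calibration.

```cpp
// Hypothetical sketch: grab one 640x480 depth frame from a Kinect via
// libfreenect and report the depth at the image centre.
#include <libfreenect.h>   // header path may vary by install
#include <cstdio>
#include <cstdint>

static volatile int got_frame = 0;

// Invoked by libfreenect whenever a new depth frame arrives.
static void depth_cb(freenect_device *dev, void *depth, uint32_t timestamp) {
    const uint16_t *raw = static_cast<const uint16_t *>(depth);
    uint16_t centre = raw[240 * 640 + 320];        // centre pixel, 640x480 frame
    // Raw 11-bit disparity -> metres (Burrus' approximation; an assumption).
    double metres = 1.0 / (centre * -0.0030711016 + 3.3309495161);
    std::printf("centre depth: raw=%u, ~%.3f m\n", centre, metres);
    got_frame = 1;
}

int main() {
    freenect_context *ctx;
    freenect_device *dev;
    if (freenect_init(&ctx, nullptr) < 0) return 1;
    if (freenect_open_device(ctx, &dev, 0) < 0) return 1;

    freenect_set_depth_callback(dev, depth_cb);
    freenect_set_depth_mode(dev, freenect_find_depth_mode(
        FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_11BIT));
    freenect_start_depth(dev);

    // Pump USB events until the first depth frame has been delivered.
    while (!got_frame && freenect_process_events(ctx) >= 0)
        ;

    freenect_stop_depth(dev);
    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}
```

Streaming such frames continuously, and registering them against the digital model, is what would allow physical edits to the clay surface to update the computer model during the design session.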
Keywords: augmented reality, augmented prototyping, Kinect, image-based rendering, prototype
Grants and funding:
This paper was written as part of the IGA FBE MENDELU project 31/2011 and the FBE MENDELU research plan MSM 6215648904.
Received: November 30, 2011; Published: October 3, 2013
This is an open access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0), which permits non-commercial use, distribution, and reproduction in any medium, provided the original publication is properly cited. No use, distribution or reproduction is permitted which does not comply with these terms.