2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (2014)
Munich, Germany
Sept. 10, 2014 to Sept. 12, 2014
ISBN: 978-1-4799-6184-9
pp: 165-170
Ibai Leizea , CEIT and Tecnun (University of Navarra), Spain
Hugo Alvarez , CEIT and Tecnun (University of Navarra), Spain
Iker Aguinaga , CEIT and Tecnun (University of Navarra), Spain
Diego Borro , CEIT and Tecnun (University of Navarra), Spain
This paper proposes a novel approach to registering the deformations of 3D non-rigid objects for Augmented Reality applications. With the support of an RGB-D camera, our prototype handles different types of objects in real time regardless of their geometry and appearance (with or without texture). During an automatic offline stage, the model is processed to extract the data that serves as input for a physics-based simulation. Using the simulation output, the deformations of the model are estimated by treating the simulated behaviour as a constraint. Furthermore, our framework incorporates a template-based tracking method that detects the object in the scene and continuously updates the camera pose without any user intervention. It is therefore a complete solution, extending from tracking to deformation estimation for both textured and untextured objects regardless of their geometrical shape. Our proposal focuses on providing visually correct results at a low computational cost. Experiments with real and synthetic data demonstrate the visual accuracy and the performance of our approach.
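The abstract describes estimating deformations by treating the simulated behaviour as a constraint on the observations. As a rough illustration only (not the authors' actual formulation), this idea can be sketched as a per-vertex soft constraint: blend the observed vertex positions toward the physically simulated ones by minimising a weighted least-squares objective, whose closed form is a weighted mean. The function name and the weight `lam` are hypothetical.

```python
# Toy sketch, NOT the paper's method: the physics simulation output acts
# as a soft constraint on the observed deformation. Per vertex we minimise
#   ||x - observed||^2 + lam * ||x - simulated||^2
# whose closed-form solution is the weighted mean of the two positions.

def constrain_deformation(observed, simulated, lam=1.0):
    """Blend observed vertex positions toward simulated ones.

    observed, simulated: lists of (x, y, z) tuples of equal length.
    lam (hypothetical parameter): strength of the simulation constraint;
    lam=0 trusts the observation alone, large lam trusts the simulation.
    """
    return [
        tuple((o + lam * s) / (1.0 + lam) for o, s in zip(ov, sv))
        for ov, sv in zip(observed, simulated)
    ]

# Example: a vertex observed at (1, 0, 0) while the simulation predicts
# (3, 0, 0); with equal weighting the estimate lands halfway between.
obs = [(1.0, 0.0, 0.0)]
sim = [(3.0, 0.0, 0.0)]
print(constrain_deformation(obs, sim))  # -> [(2.0, 0.0, 0.0)]
```

The closed form keeps the per-frame cost linear in the number of vertices, which is consistent with the abstract's emphasis on low computational cost.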
Keywords: Three-dimensional displays, Shape, Cameras, Computational modeling, Deformable models, Visualization, Real-time systems
Index Terms: I.6.8 [SIMULATION AND MODELING]: Types of Simulation; I.2.10 [ARTIFICIAL INTELLIGENCE]: Vision and Scene Understanding; I.4.8 [IMAGE PROCESSING AND COMPUTER VISION]: Scene Analysis
Ibai Leizea, Hugo Alvarez, Iker Aguinaga, Diego Borro, "Real-time deformation, registration and tracking of solids based on physical simulation", 2014 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 165-170, 2014, doi:10.1109/ISMAR.2014.6948423