2009 8th IEEE International Symposium on Mixed and Augmented Reality
Robust pose estimation in untextured environments for augmented reality applications
Orlando, FL, USA
October 19-October 22
ISBN: 978-1-4244-5390-0
Wei Guan, Computer Graphics and Immersive Technologies Laboratory, University of Southern California, USA
Lu Wang, Computer Graphics and Immersive Technologies Laboratory, University of Southern California, USA
Jonathan Mooser, Computer Graphics and Immersive Technologies Laboratory, University of Southern California, USA
Suya You, Computer Graphics and Immersive Technologies Laboratory, University of Southern California, USA
Ulrich Neumann, Computer Graphics and Immersive Technologies Laboratory, University of Southern California, USA
We present a robust camera pose estimation approach for stereo images captured in untextured environments. Unlike most existing registration algorithms, which are point-based and rely on pixel intensities in local neighborhoods, our approach incorporates line segments into the registration process. With line segments as primitives, the proposed algorithm can handle untextured images, such as scenes captured in man-made environments, as well as cases involving large viewpoint or illumination changes. Furthermore, because the algorithm is robust to wide-baseline stereo pairs, it improves the accuracy of 3D point reconstruction. With accurately computed camera poses and object positions in 3D space, virtual objects can be embedded into an existing scene with higher accuracy for realistic effects. In our experiments, 2D labels are embedded in the 3D scene space to achieve annotation effects as in AR.
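As context for the 3D reconstruction step the abstract mentions, the sketch below shows linear triangulation (the standard DLT method) of a point from a calibrated stereo pair. This is a generic illustration, not the paper's line-segment algorithm; the camera intrinsics, baseline, and point below are made-up values for demonstration.

```python
import numpy as np

# Illustrative intrinsics and a wide-baseline stereo rig (assumed values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])        # camera 1 at origin
t = np.array([[-0.5], [0.0], [0.0]])                      # camera 2: 0.5 m baseline
P2 = K @ np.hstack([np.eye(3), t])

# Ground-truth 3D point (homogeneous) and its projections into both views.
X = np.array([0.2, -0.1, 4.0, 1.0])
x1 = P1 @ X; x1 /= x1[2]
x2 = P2 @ X; x2 /= x2[2]

def triangulate(P1, P2, u1, u2):
    """Linear triangulation: each view contributes two rows of A X = 0,
    solved in the least-squares sense via SVD."""
    A = np.array([
        u1[0] * P1[2] - P1[0],
        u1[1] * P1[2] - P1[1],
        u2[0] * P2[2] - P2[0],
        u2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    Xh = Vt[-1]
    return Xh / Xh[3]           # de-homogenize

X_rec = triangulate(P1, P2, x1, x2)
print(X_rec[:3])  # recovers the ground-truth point [0.2, -0.1, 4.0]
```

A wider baseline improves the conditioning of this linear system, which is why robustness to wide-baseline stereo (as the paper claims for its line-based matching) translates into more accurate reconstructed points.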
Citation:
Wei Guan, Lu Wang, Jonathan Mooser, Suya You, Ulrich Neumann, "Robust pose estimation in untextured environments for augmented reality applications," ismar, pp.191-192, 2009 8th IEEE International Symposium on Mixed and Augmented Reality, 2009