Registration Using Natural Features for Augmented Reality Systems
July/August 2006 (vol. 12 no. 4)
pp. 569-580

Abstract—Registration is one of the most difficult problems in augmented reality (AR) systems. In this paper, a simple registration method using natural features based on the projective reconstruction technique is proposed. This method consists of two steps: embedding and rendering. Embedding involves specifying four points to build the world coordinate system on which a virtual object will be superimposed. In rendering, the Kanade-Lucas-Tomasi (KLT) feature tracker is used to track natural feature correspondences in the live video. The tracked natural features are used to estimate the corresponding projective matrix in the image sequence. Next, the projective reconstruction technique is used to transfer the four specified points to compute the registration matrix for augmentation. This paper also proposes a robust method for estimating the projective matrix, in which the tracked natural features are normalized (by translation and scaling) and used as the input data. The estimated projective matrix serves as the initial estimate for a nonlinear optimization that minimizes the actual residual errors using the Levenberg-Marquardt (LM) method, making the results more robust and stable. The proposed registration method has three major advantages: 1) It is simple, as no predefined fiducials or markers are needed for registration in either indoor or outdoor AR applications. 2) It is robust, because it remains effective as long as at least six natural features are tracked throughout the augmentation, which guarantees the existence of the corresponding projective matrices in the live video. Moreover, the robust projective matrix estimation method yields stable results even when some outliers arise during tracking. 3) Virtual objects can still be superimposed on the specified areas even if parts of these areas are occluded during the process.
Indoor and outdoor experiments have been conducted to validate the performance of the proposed method.
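The estimation pipeline the abstract outlines, normalizing the tracked features by translation and scaling and then solving linearly for the projective matrix, can be sketched for the planar (homography) case roughly as follows. This is an illustrative reconstruction under stated assumptions, not the authors' implementation; the function names are my own.

```python
import numpy as np

def normalize_points(pts):
    """Hartley-style normalization: translate the points so their centroid is
    at the origin, then scale so the mean distance from the origin is sqrt(2).
    Returns the normalized points and the 3x3 similarity T that applies the
    same transform in homogeneous coordinates."""
    pts = np.asarray(pts, dtype=float)
    centroid = pts.mean(axis=0)
    centered = pts - centroid
    scale = np.sqrt(2.0) / np.mean(np.linalg.norm(centered, axis=1))
    T = np.array([[scale, 0.0, -scale * centroid[0]],
                  [0.0, scale, -scale * centroid[1]],
                  [0.0, 0.0, 1.0]])
    return centered * scale, T

def estimate_projective_matrix(src, dst):
    """Direct linear transform (DLT) estimate of the 3x3 projective matrix H
    with dst ~ H @ src (homogeneous), computed from normalized coordinates and
    then denormalized. Needs at least 4 correspondences in general position;
    the paper tracks at least six natural features."""
    src_n, Ts = normalize_points(src)
    dst_n, Td = normalize_points(dst)
    A = []
    for (x, y), (u, v) in zip(src_n, dst_n):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of A with the smallest
    # singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    Hn = Vt[-1].reshape(3, 3)
    H = np.linalg.inv(Td) @ Hn @ Ts  # undo the normalization
    return H / H[2, 2]
```

In the paper this linear estimate then seeds a Levenberg-Marquardt refinement of the actual reprojection residuals; in practice that step could be carried out with, for example, `scipy.optimize.least_squares(..., method='lm')`.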

[1] R.T. Azuma, “A Survey of Augmented Reality,” Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355-385, 1997.
[2] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent Advances in Augmented Reality,” IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34-47, Nov.-Dec. 2001.
[3] M. Rosenthal, A. State, H. Lee, G. Hirota, J. Ackerman, K. Keller, E.D. Pisano, M. Jiroutek, K. Muller, and H. Fuchs, “Augmented Reality Guidance for Needle Biopsies: An Initial Randomized, Controlled Trial in Phantoms,” Medical Image Analysis, vol. 6, no. 3, pp. 313-320, 2002.
[4] S. Julier, M. Lanzagorta, Y. Baillot, L. Rosenblum, and S. Feiner, “Information Filtering for Mobile Augmented Reality,” Proc. IEEE Int'l Symp. Augmented Reality, pp. 3-11, 2000.
[5] M. Utsumi, “Development for Teleoperation Underwater Grasping System in Unclear Environment,” Proc. 2002 Int'l Symp. Underwater Technology, pp. 349-353, Apr. 2002.
[6] B. MacIntyre, M. Lohse, J.D. Bolter, and E. Moreno, “Integrating 2-D Video Actors into 3-D Augmented-Reality Systems,” Presence, vol. 11, no. 2, pp. 189-202, 2002.
[7] W. Barfield, K. Baird, J. Shewchuk, and G. Ioannou, “Applications of Wearable Computers and Augmented Reality to Manufacturing,” Fundamentals of Wearable Computers and Augmented Reality, pp. 695-724, 2001.
[8] W.A. Hoff and K. Nguyen, “Computer Vision-Based Registration Techniques for Augmented Reality,” Proc. Intelligent Robots and Computer Vision XV, SPIE, vol. 2904, pp. 538-548, 1996.
[9] R. Behringer, “Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors,” Proc. Virtual Reality Ann. Int'l Symp., pp. 244-251, 1999.
[10] S. You, U. Neumann, and R. Azuma, “Hybrid Inertial and Vision Tracking for Augmented Reality Registration,” Proc. IEEE Symp. Virtual Reality (VR '99), pp. 260-267, 1999.
[11] R. Behringer, “Registration for Outdoor Augmented Reality Applications Using Computer Vision Techniques and Hybrid Sensors,” Proc. Virtual Reality Ann. Int'l Symp., pp. 244-251, 1999.
[12] K. Satoh, M. Anabuki, H. Yamamoto, and H. Tamura, “A Hybrid Registration Method for Outdoor Augmented Reality,” Proc. IEEE and ACM Int'l Symp. Augmented Reality, pp. 67-76, 2001.
[13] L. Chai, W.A. Hoff, and T. Vincent, “Three-Dimensional Motion and Structure Estimation Using Inertial Sensors and Computer Vision for Augmented Reality,” Presence: Teleoperators and Virtual Environments, vol. 11, no. 5, pp. 474-492, 2002.
[14] H. Kato and M. Billinghurst, “Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System,” Proc. Second IEEE and ACM Int'l Workshop Augmented Reality, pp. 85-94, Oct. 1999.
[15] P. Sinclair, “Integrating Hypermedia Techniques with Augmented Reality Environments,” PhD thesis, Univ. of Southampton, 2004.
[16] Y.D. Seo and K.S. Hong, “Calibration-Free Augmented Reality in Perspective,” IEEE Trans. Visualization and Computer Graphics, vol. 6, no. 4, pp. 346-359, Nov.-Dec. 2000.
[17] V. Raghavan, J. Molineros, and R. Sharma, “Interactive Evaluation of Assembly Sequences Using Augmented Reality,” IEEE Trans. Robotics and Automation, vol. 15, no. 3, pp. 435-449, 1999.
[18] S.J.D. Prince, K. Xu, and A.D. Cheok, “Augmented Reality Camera Tracking with Homographies,” IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 39-45, 2002.
[19] G. Simon and M.O. Berger, “Pose Estimation for Planar Structures,” IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 46-53, 2002.
[20] U. Neumann and S. You, “Natural Feature Tracking for Augmented Reality,” IEEE Trans. Multimedia, vol. 1, no. 1, pp. 53-64, 1999.
[21] A.I. Comport, E. Marchand, and F. Chaumette, “A Real-Time Tracker for Markerless Augmented Reality,” Proc. IEEE and ACM Int'l Symp. Mixed and Augmented Reality, pp. 36-45, 2003.
[22] V. Ferrari, T. Tuytelaars, and L. Van Gool, “Markerless Augmented Reality with a Real-Time Affine Region Tracker,” Proc. IEEE and ACM Int'l Symp. Augmented Reality, pp. 87-96, 2001.
[23] J. Shi and C. Tomasi, “Good Features to Track,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 593-600, 1994.
[24] M.L. Yuan, S.K. Ong, and A.Y.C. Nee, “Registration Using Projective Reconstruction Technique for Augmented Reality Systems,” IEEE Trans. Visualization and Computer Graphics, vol. 11, no. 3, pp. 254-264, May-June 2005.
[25] K.N. Kutulakos and J.R. Vallino, “Calibration-Free Augmented Reality,” IEEE Trans. Visualization and Computer Graphics, vol. 4, no. 1, pp. 1-20, Jan.-Mar. 1998.
[26] C. Tomasi and T. Kanade, “Detection and Tracking of Point Features,” Technical Report CMU-CS-91-132, Carnegie Mellon Univ., 1991.
[27] G.D. Hager and P.N. Belhumeur, “Real-Time Tracking of Image Regions with Changes in Geometry and Illumination,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 403-410, 1996.
[28] B. Georgescu and P. Meer, “Point Matching under Larger Image Deformations and Illumination Changes,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 26, pp. 674-689, 2004.
[29] T. Tommasini, A. Fusiello, E. Trucco, and V. Roberto, “Making Good Features Track Better,” Proc. IEEE Int'l Conf. Computer Vision and Pattern Recognition, pp. 178-183, 1998.
[30] K. Shafique and M. Shah, “A Noniterative Greedy Algorithm for Multiframe Point Correspondence,” Proc. Int'l Conf. Computer Vision, pp. 110-115, 2003.
[31] H. Jin, P. Favaro, and S. Soatto, “Real-Time Feature Tracking and Outlier Rejection with Changes in Illumination,” Proc. Int'l Conf. Computer Vision, pp. 684-689, 2001.
[32] Y. Genc, S. Riedel, F. Souvannavong, C. Akinlar, and N. Navab, “Marker-Less Tracking for AR: A Learning-Based Approach,” Proc. IEEE and ACM Int'l Symp. Augmented Reality, pp. 295-305, 2002.
[33] R.I. Hartley, “In Defense of the Eight-Point Algorithm,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 19, no. 6, pp. 580-593, June 1997.
[34] Z. Zhang, R. Deriche, O. Faugeras, and Q.T. Luong, “A Robust Technique for Matching Two Uncalibrated Images through the Recovery of the Unknown Epipolar Geometry,” Artificial Intelligence J., vol. 78, pp. 87-119, 1995.
[35] G.H. Golub and C.F. Van Loan, Matrix Computations. The Johns Hopkins Univ. Press, 1989.
[36] B. Jiang, S. You, and U. Neumann, “A Robust Tracking System for Outdoor Augmented Reality,” Proc. IEEE Virtual Reality Conf., pp. 3-10, 2004.
[37] D. Koller, G. Klinker, E. Rose, D. Breen, R. Whitaker, and M. Tuceryan, “Real-Time Vision-Based Camera Tracking for Augmented Reality Applications,” Proc. ACM Virtual Reality Software and Technology Conf., pp. 87-94, 1997.
[38] M. Ribo, P. Lang, H. Ganster, M. Brandner, C. Stock, and A. Pinz, “Hybrid Tracking for Outdoor Augmented Reality Applications,” IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 54-63, 2002.

Index Terms:
Augmented reality, registration, projective reconstruction, natural feature tracking.
Citation:
M.L. Yuan, S.K. Ong, A.Y.C. Nee, "Registration Using Natural Features for Augmented Reality Systems," IEEE Transactions on Visualization and Computer Graphics, vol. 12, no. 4, pp. 569-580, July-Aug. 2006, doi:10.1109/TVCG.2006.79