Issue No. 3 - May/June 2008 (vol. 14)
pp. 500-512
ABSTRACT
Anywhere Augmentation pursues the goal of lowering the initial investment of time and money necessary to participate in mixed reality work, bridging the gap between researchers in the field and regular computer users. Our paper contributes to this goal by introducing the GroundCam, a cheap tracking modality that requires no significant setup. By itself, the GroundCam provides high-frequency, high-resolution relative position information similar to an inertial navigation system, but with significantly less drift. We present the design and implementation of the GroundCam, analyze the impact of several design and run-time factors on tracking accuracy, and consider the implications of extending the GroundCam to different hardware configurations. Motivated by the performance analysis, we developed a hybrid tracker that couples the GroundCam with a wide-area tracking modality via a complementary Kalman filter, resulting in a powerful base for indoor and outdoor mobile mixed reality work. To conclude, the performance of the hybrid tracker and its utility within mixed reality applications are discussed.
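As a concrete illustration of the coupling described above, the following is a minimal sketch, not the paper's implementation, of a complementary Kalman filter that fuses a high-rate relative tracker with sparse absolute fixes. The filter estimates the relative tracker's slowly growing drift rather than the pose itself; the class name, the noise parameters, the random-walk drift model, and the demo data are all illustrative assumptions.

import numpy as np

class ComplementaryKF:
    """Illustrative 2-D complementary Kalman filter: estimates the slowly
    drifting error of a high-rate relative tracker (GroundCam-like visual
    odometry) from sparse absolute fixes (e.g. GPS), then subtracts it."""

    def __init__(self, q=1e-4, r=1.0):
        self.x = np.zeros(2)      # estimated drift error (m), the filter state
        self.P = np.eye(2)        # error covariance
        self.Q = q * np.eye(2)    # process noise: how fast drift can grow
        self.R = r * np.eye(2)    # measurement noise of the absolute fix

    def predict(self):
        # Drift is modeled as a random walk: the estimate is unchanged,
        # only its uncertainty grows each step.
        self.P += self.Q

    def update(self, rel_pos, abs_pos):
        # Measurement: observed drift = relative estimate - absolute fix.
        z = rel_pos - abs_pos
        K = self.P @ np.linalg.inv(self.P + self.R)   # Kalman gain
        self.x = self.x + K @ (z - self.x)            # correct drift estimate
        self.P = (np.eye(2) - K) @ self.P

    def corrected(self, rel_pos):
        # High-rate output: relative estimate minus estimated drift.
        return rel_pos - self.x

if __name__ == "__main__":
    # Hypothetical data: a walker moves steadily while odometry drift accrues.
    rng = np.random.default_rng(0)
    true_pos, drift = np.zeros(2), np.zeros(2)
    kf = ComplementaryKF()
    for step in range(200):
        true_pos = true_pos + np.array([0.05, 0.02])  # ground-truth motion
        drift = drift + rng.normal(0.0, 0.005, 2)     # odometry drift grows
        rel = true_pos + drift                        # relative-tracker output
        kf.predict()
        if step % 20 == 0:                            # sparse absolute fix
            abs_fix = true_pos + rng.normal(0.0, 0.5, 2)
            kf.update(rel, abs_fix)
        pose = kf.corrected(rel)                      # fused high-rate pose
    print("final error (m):", np.linalg.norm(pose - true_pos))

In this configuration, predict() and corrected() would run at the relative tracker's frame rate, while update() runs only when a wide-area fix arrives, so the fused output keeps the relative modality's frequency and resolution while the absolute modality bounds its drift.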
INDEX TERMS
Virtual reality, Motion, Tracking
CITATION
Stephen DiVerdi and Tobias Höllerer, "Heads Up and Camera Down: A Vision-Based Tracking Modality for Mobile Mixed Reality", IEEE Transactions on Visualization & Computer Graphics, vol. 14, no. 3, pp. 500-512, May/June 2008, doi:10.1109/TVCG.2008.26