Issue No. 01 - January (2011, vol. 17)
pp: 3-13
Malcolm Hutson, University of Louisiana at Lafayette, Lafayette
ABSTRACT
Several critical limitations exist in the tracking technologies currently available for fully enclosed virtual reality (VR) systems. Although 6DOF tracking projects such as Hedgehog have demonstrated excellent accuracy, precision, and robustness within moderate budgets, they still place hardware elements where they can interfere with the user's visual experience. The objective of this project is to design a tracking solution for fully enclosed VR displays that matches the performance of available commercial solutions without introducing any artifacts that obscure the user's view. JanusVF is a tracking solution in which the hardware sensors and the software rendering system cooperate. A small, high-resolution camera is worn on the user's head but faces backward (rotated 180 degrees about the vertical axis from the user's perspective). After the initial state is acquired, the VR rendering software draws fiducial markers of known size and absolute position inside the VR scene. These virtual markers are drawn only behind the user, within view of the camera. The fiducials are tracked by ARToolKitPlus, and the resulting observations are integrated by a single-constraint-at-a-time (SCAAT) filter to update the head pose. Experiments analyzing accuracy, precision, and latency in a six-sided CAVE-like system show performance comparable to alternative commercial technologies.
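The abstract describes folding each detected virtual fiducial into the head-pose estimate one constraint at a time. The following minimal Python sketch illustrates that SCAAT-style update for a position-only state with a constant-position motion model; the class name, noise values, and measurement rows are illustrative assumptions, not the paper's implementation, which estimates full 6DOF pose per Welch and Bishop [8].

import numpy as np

class ScaatFilter:
    def __init__(self, process_noise=1e-3, measurement_noise=1e-2):
        self.x = np.zeros(3)                # estimated head position (metres)
        self.P = np.eye(3)                  # estimate covariance
        self.Q = process_noise * np.eye(3)  # random-walk process noise
        self.R = measurement_noise          # scalar noise of one constraint

    def update(self, h_row, z):
        # Fold in ONE scalar constraint z ~= h_row . x (a single fiducial axis).
        self.P = self.P + self.Q                   # time update (constant-position model)
        h = np.asarray(h_row, dtype=float)
        s = h @ self.P @ h + self.R                # innovation variance
        k = (self.P @ h) / s                       # Kalman gain
        self.x = self.x + k * (z - h @ self.x)     # state correction
        self.P = self.P - np.outer(k, h @ self.P)  # covariance correction
        return self.x

# Each visible virtual fiducial contributes its constraints one at a time,
# so the estimate keeps converging even when only one marker is in view.
filt = ScaatFilter()
for h_row, z in [([1, 0, 0], 0.12), ([0, 1, 0], 1.57), ([0, 0, 1], 0.95)]:
    estimate = filt.update(h_row, z)
print(estimate)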
INDEX TERMS
Virtual reality, input devices and strategies, stereo, tracking.
CITATION
Malcolm Hutson, "JanusVF: Accurate Navigation Using SCAAT and Virtual Fiducials", IEEE Transactions on Visualization & Computer Graphics, vol. 17, no. 1, pp. 3-13, January 2011, doi:10.1109/TVCG.2010.91
REFERENCES
[1] B. Jiang, S. You, and U. Neumann, "Camera Tracking for Augmented Reality Media," Proc. IEEE Int'l Conf. Multimedia and Expo (ICME '00), vol. 3, pp. 1637-1640, 2000.
[2] N.T. Rasmussen, M. Störring, T.B. Moeslund, and E. Granum, "Real-Time Tracking for Virtual Environments Using SCAAT Kalman Filtering and Unsynchronized Cameras," Proc. Int'l Conf. Computer Vision Theory and Applications (VISAPP), 2006.
[3] M. Foursa, "Real-Time Infrared Tracking System for Virtual Environments," Proc. ACM SIGGRAPH Int'l Conf. Virtual Reality Continuum and Its Applications in Industry (VRCAI '04), pp. 427-430, 2004.
[4] G. Welch, G. Bishop, L. Vicci, S. Brumback, K. Keller, and D. Colucci, "High-Performance Wide-Area Optical Tracking: The HiBall Tracking System," Presence: Teleoperators and Virtual Environments, vol. 10, no. 1, pp. 1-21, 2001.
[5] A. Vorozcovs, A. Hogue, and W. Stuerzlinger, "The Hedgehog: A Novel Optical Tracking Method for Spatially Immersive Displays," Proc. IEEE Conf. Virtual Reality (VR '05), pp. 83-89, Mar. 2005.
[6] C. Cruz-Neira, D.J. Sandin, T.A. DeFanti, R.V. Kenyon, and J.C. Hart, "The CAVE: Audio Visual Experience Automatic Virtual Environment," Comm. ACM, vol. 35, no. 6, pp. 64-72, 1992.
[7] M. Robinson, J. Laurence, J. Zacher, A. Hogue, R. Allison, L. Harris, M. Jenkin, and W. Stuerzlinger, "IVY: The Immersive Visual Environment at York," citeseer.ist.psu.edu/robinson02ivy.html, 2002.
[8] G. Welch and G. Bishop, "SCAAT: Incremental Tracking with Incomplete Information," Proc. ACM SIGGRAPH, pp. 333-344, 1997.
[9] Am. Nat'l Standards Inst. Accredited Standards Committee (ASC Z136), "ANSI Z136.1 and Z136.3," http://z136.org/, Aug. 2008.
[10] W. Piekarski and B. Thomas, "Using ARToolKit for 3D Hand Position Tracking in Mobile Outdoor Environments," Proc. First IEEE Int'l Workshop Augmented Reality Toolkit, 2002.
[11] S. You, U. Neumann, and R. Azuma, "Hybrid Inertial and Vision Tracking for Augmented Reality Registration," Proc. IEEE Conf. Virtual Reality, pp. 260-267, Mar. 1999.
[12] S. You and U. Neumann, "Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration," Proc. IEEE Conf. Virtual Reality, pp. 71-78, Mar. 2001.
[13] E. Foxlin and L. Naimark, "VIS-Tracker: A Wearable Vision-Inertial Self-Tracker," Proc. IEEE Conf. Virtual Reality, pp. 199-206, Mar. 2003.
[14] A. Grundhöfer, M. Seeger, F. Hantsch, and O. Bimber, "Dynamic Adaptation of Projected Imperceptible Codes," Proc. Sixth IEEE and ACM Int'l Symp. Mixed and Augmented Reality (ISMAR '07), pp. 181-190, Nov. 2007.
[15] Point Grey Research, Inc., "FlyCapture SDK," http://www.ptgrey.com/products/pgrflycapture/index.asp, Aug. 2008.
[16] D. Wagner and D. Schmalstieg, "ARToolKitPlus for Pose Tracking on Mobile Devices," Proc. 12th Computer Vision Winter Workshop (CVWW), 2007.
[17] R.M. Taylor II, T.C. Hudson, A. Seeger, H. Weber, J. Juliano, and A.T. Helser, "VRPN: A Device-Independent, Network-Transparent VR Peripheral System," Proc. ACM Symp. Virtual Reality Software and Technology (VRST '01), pp. 55-61, 2001.
[18] A. Bierbaum, C. Just, P. Hartling, K. Meinert, A. Baker, and C. Cruz-Neira, "VR Juggler: A Virtual Platform for Virtual Reality Application Development," Proc. IEEE Conf. Virtual Reality (VR), pp. 89-96, 2001.
[19] Point Grey Research, Inc., "Flea2," http://www.ptgrey.com/products/flea2/index.asp, Aug. 2008.
[20] G. Schweighofer and A. Pinz, "Robust Pose Estimation from a Planar Target," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 12, pp. 2024-2030, Dec. 2006.
[21] J.-Y. Bouguet, "Camera Calibration Toolbox for Matlab," http://www.vision.caltech.edu/bouguetj/calib_doc/, May 2008.
[22] H. Kato and M. Billinghurst, "Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System," Proc. Second IEEE and ACM Int'l Workshop Augmented Reality (IWAR '99), p. 85, 1999.
[23] C. Owen, F. Xiao, and P. Middlin, "What Is the Best Fiducial?" Proc. First IEEE Int'l Workshop Augmented Reality Toolkit, 2002.
[24] R.E. Kalman, "A New Approach to Linear Filtering and Prediction Problems," Trans. ASME J. Basic Eng., vol. 82, pp. 35-45, http://www.cs.unc.edu/~welch/kalman/media/pdf/Kalman1960.pdf, 1960.
[25] S.J. Julier and J.K. Uhlmann, "A New Extension of the Kalman Filter to Nonlinear Systems," Proc. Int'l Symp. Aerospace/Defense Sensing, Simulation, and Controls, pp. 182-193, 1997.
[26] F. Ababsa, J. Didier, M. Mallem, and D. Roussel, "Head Motion Prediction in Augmented Reality Systems Using Monte Carlo Particle Filters," Proc. 13th Int'l Conf. Artificial Reality and Telexistence (ICAT '03), pp. 83-88, Dec. 2003.
[27] E. Kraft, "A Quaternion-Based Unscented Kalman Filter for Orientation Tracking," Proc. Sixth Int'l Conf. Information Fusion, vol. 1, pp. 47-54, 2003.
[28] J.J. LaViola Jr., "A Comparison of Unscented and Extended Kalman Filtering for Estimating Quaternion Motion," Proc. Am. Control Conf., vol. 3, pp. 2435-2440, June 2003.
[29] D. Kragic and V. Kyrki, "Initialization and System Modeling in 3D Pose Tracking," Proc. Int'l Conf. Pattern Recognition (ICPR), vol. 4, pp. 643-646, 2006.