Issue No. 1, January/February 2010 (vol. 16)
pp. 17-27
Frank Steinicke , University of Münster, Münster
Gerd Bruder , University of Münster, Münster
Jason Jerald , University of North Carolina at Chapel Hill, Chapel Hill
Harald Frenz , University of Münster, Münster
Markus Lappe , University of Münster, Münster
In immersive virtual environments (IVEs), users can control their virtual viewpoint by moving their tracked head and walking through the real world. Usually, movements in the real world are mapped one-to-one to virtual camera motions. With redirection techniques, the virtual camera is manipulated by applying gains to user motion so that the virtual world moves differently than the real world. Thus, users can walk through large-scale IVEs while physically remaining in a reasonably small workspace. In psychophysical experiments with a two-alternative forced-choice task, we have quantified how much humans can unknowingly be redirected on physical paths that differ from the visually perceived paths. We tested 12 subjects in three different experiments: (E1) discrimination between virtual and physical rotations, (E2) discrimination between virtual and physical straightforward movements, and (E3) discrimination of path curvature. In experiment E1, subjects performed rotations with different gains and then had to choose whether the visually perceived rotation was smaller or greater than the physical rotation. In experiment E2, subjects chose whether the physical walk was shorter or longer than the visually perceived scaled travel distance. In experiment E3, subjects estimated the path curvature while walking a curved path in the real world as the visual display showed a straight path in the virtual world. Our results show that users can be turned physically about 49 percent more or 20 percent less than the perceived virtual rotation, distances can be downscaled by 14 percent and upscaled by 26 percent, and users can be redirected on a circular arc with a radius greater than 22 m while they believe that they are walking straight.
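The gain-based redirection described above can be sketched in a few lines: a rotation gain scales head turns, a translation gain scales forward motion, and a curvature gain injects a small amount of virtual yaw per meter walked so the user unknowingly compensates by bending the physical path onto an arc. This is a minimal illustrative sketch, not the authors' implementation; the function name, signature, and units are hypothetical.

```python
import math

# Detection thresholds reported in the abstract (for context only):
# rotation: physical turns up to 49% more or 20% less than virtual,
# translation: distances downscaled by 14% or upscaled by 26%,
# curvature: arc radius greater than ~22 m stays unnoticed.

def apply_gains(real_yaw_delta, real_step_dist,
                rotation_gain=1.0, translation_gain=1.0,
                curvature_radius=float("inf")):
    """Map one tracked update (yaw in radians, distance in meters)
    to a virtual camera update. Hypothetical helper: the paper does
    not prescribe a particular API."""
    # Rotation gain: scale the user's physical head turn.
    virtual_yaw = real_yaw_delta * rotation_gain
    # Translation gain: scale the distance walked.
    virtual_dist = real_step_dist * translation_gain
    # Curvature gain: inject 1/radius radians of yaw per meter walked,
    # so the user physically walks an arc while seeing a straight path.
    virtual_yaw += real_step_dist / curvature_radius
    return virtual_yaw, virtual_dist
```

For example, walking a quarter circle of a 22 m radius arc (about 34.6 m) injects 90 degrees of virtual yaw, which, per the thresholds above, users cannot reliably detect.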
Keywords: Virtual reality, virtual locomotion, redirected walking.
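The two-alternative forced-choice procedure mentioned in the abstract yields, for each tested gain, the proportion of trials on which subjects judged the virtual motion greater than the physical one. Fitting a sigmoid to these proportions gives the point of subjective equality (50% point) and detection thresholds (conventionally the 25% and 75% points). The sketch below uses a plain grid search over a logistic function; it is an illustrative reconstruction of the general method, not the authors' analysis code.

```python
import math

def logistic(x, pse, slope):
    """Psychometric function: probability of answering 'virtual greater'."""
    return 1.0 / (1.0 + math.exp(-slope * (x - pse)))

def fit_2afc(gains, p_greater):
    """Fit a logistic psychometric function to 2AFC data by grid search
    (illustrative; any sigmoid fitter would do). Returns the PSE and the
    gains at which responses cross 25% and 75%."""
    best = None
    for pse in [x / 100 for x in range(50, 151)]:    # candidate PSEs 0.50..1.50
        for slope in [s / 2 for s in range(2, 41)]:  # candidate slopes 1.0..20.0
            err = sum((logistic(g, pse, slope) - p) ** 2
                      for g, p in zip(gains, p_greater))
            if best is None or err < best[0]:
                best = (err, pse, slope)
    _, pse, slope = best
    # Invert the logistic at the 25% and 75% response levels.
    lower = pse + math.log(0.25 / 0.75) / slope
    upper = pse + math.log(0.75 / 0.25) / slope
    return pse, lower, upper
```

The interval between the lower and upper thresholds is the range of gains subjects cannot reliably discriminate, which is exactly the quantity the abstract summarizes (e.g., translation gains between 0.86 and 1.26).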
Frank Steinicke, Gerd Bruder, Jason Jerald, Harald Frenz, Markus Lappe, "Estimation of Detection Thresholds for Redirected Walking Techniques", IEEE Transactions on Visualization & Computer Graphics, vol.16, no. 1, pp. 17-27, January/February 2010, doi:10.1109/TVCG.2009.62