Tuning Self-Motion Perception in Virtual Reality with Visual Illusions
July 2012 (vol. 18 no. 7)
pp. 1068-1078
F. Steinicke, Depts. of Human-Comput. Media & Comput. Sci., Univ. of Würzburg, Würzburg, Germany
G. Bruder, Depts. of Human-Comput. Media & Comput. Sci., Univ. of Würzburg, Würzburg, Germany
P. Wieland, Inst. of Psychol., Univ. of Münster, Münster, Germany
M. Lappe, Inst. of Psychol., Univ. of Münster, Münster, Germany
Motion perception in immersive virtual environments differs significantly from the real world. For example, previous work has shown that users tend to underestimate travel distances in virtual environments (VEs). As a solution to this problem, researchers have proposed scaling the mapped virtual camera motion relative to the tracked real-world movement of a user until real and virtual motion are perceived as equal; that is, real-world movements are mapped to the VE with a larger gain in order to compensate for the underestimation. However, introducing discrepancies between real and virtual motion can itself become a problem, in particular because of misalignments between the two worlds and distorted spatial cognition. In this paper, we describe a different approach that induces apparent self-motion illusions by manipulating optic flow fields during movements in VEs. These manipulations can affect self-motion perception in VEs while avoiding any quantitative discrepancy between real and virtual motion. In particular, we consider to which regions of the virtual view such apparent self-motion illusions can be applied, i.e., the ground plane or the visual periphery. We introduce four illusions and show in experiments that optic flow manipulation can significantly affect users' self-motion judgments. Furthermore, we show that such manipulations of optic flow fields can compensate for the underestimation of travel distances.
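The gain-based mapping that the abstract contrasts with can be sketched as a per-frame position update. This is a minimal illustration, not the paper's implementation: the function and variable names are hypothetical, and the gain value 1.4 is an arbitrary example of a gain larger than one.

```python
import numpy as np

def apply_translation_gain(prev_real, curr_real, prev_virtual, gain=1.4):
    """Map one frame of tracked real-world movement to virtual camera motion.

    The real-world displacement since the last frame is scaled by a
    translation gain; gain > 1 amplifies virtual travel to compensate
    for distance underestimation. Names and gain are illustrative only.
    """
    delta = np.asarray(curr_real, dtype=float) - np.asarray(prev_real, dtype=float)
    return np.asarray(prev_virtual, dtype=float) + gain * delta

# A 1 m real step forward yields 1.4 m of virtual travel.
pos = apply_translation_gain([0.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                             [0.0, 0.0, 0.0], gain=1.4)
```

With gain = 1 the mapping is one-to-one; any other value introduces exactly the quantitative real-to-virtual discrepancy that the optic-flow-illusion approach described above is designed to avoid.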


Index Terms:
self-motion perception, virtual reality, virtual environments, visual illusions, visual perception, optic flow, peripheral vision, motion estimation, travel distance estimation
Citation:
F. Steinicke, G. Bruder, P. Wieland, M. Lappe, "Tuning Self-Motion Perception in Virtual Reality with Visual Illusions," IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 7, pp. 1068-1078, July 2012, doi:10.1109/TVCG.2011.274