Issue No. 4 - April 2012 (vol. 18)
pp. 581-588
B. Pollock , Comput. Eng. Dept., Iowa State Univ., Ames, IA, USA
M. Burton , Human Comput. Interaction Program & Virtual Reality Applic. Center, Iowa State Univ., Ames, IA, USA
J. W. Kelly , Dept. of Psychol., Iowa State Univ., Ames, IA, USA
S. Gilbert , Human Comput. Interaction Program & Virtual Reality Applic. Center, Iowa State Univ., Ames, IA, USA
E. Winer , Mech. Eng. Dept., Iowa State Univ., Ames, IA, USA
ABSTRACT
Stereoscopic depth cues improve depth perception and increase immersion within virtual environments (VEs). However, improper display of these cues can distort perceived distances and directions. Consider a multi-user VE, where all users view identical stereoscopic images regardless of physical location. In this scenario, cues are typically customized for one "leader" equipped with a head-tracking device. This user stands at the center of projection (CoP) and all other users ("followers") view the scene from other locations and receive improper depth cues. This paper examines perceived depth distortion when viewing stereoscopic VEs from follower perspectives and the impact of these distortions on collaborative spatial judgments. Pairs of participants made collaborative depth judgments of virtual shapes viewed from the CoP or after displacement forward or backward. Forward and backward displacement caused perceived depth compression and expansion, respectively, with greater compression than expansion. Furthermore, distortion was less than predicted by a ray-intersection model of stereo geometry. Collaboration times were significantly longer when participants stood at different locations compared to the same location, and increased with greater perceived depth discrepancy between the two viewing locations. These findings advance our understanding of spatial distortions in multi-user VEs, and suggest a strategy for reducing distortion.
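To make the ray-intersection prediction concrete, the sketch below reconstructs it under simple assumptions: a flat projection screen in the plane z = 0, symmetric left/right eye offsets, and a virtual point rendered in stereo for the leader at the center of projection. The displaced viewer's eyes cast rays through the same on-screen image points, and the predicted percept is where those rays (approximately) intersect. All names, coordinates, and numeric values (predicted_percept, closest_point_between_rays, the 0.5 m displacement) are illustrative assumptions, not taken from the paper's implementation; this is a minimal geometric sketch, not the authors' code.

```python
import numpy as np

def project_to_screen(eye, point, screen_z=0.0):
    """Intersect the ray from an eye through a virtual point with the screen plane z = screen_z."""
    d = point - eye
    t = (screen_z - eye[2]) / d[2]
    return eye + t * d

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two (possibly skew) rays."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = np.dot(d1, d2)
    w = p1 - p2
    denom = 1.0 - b * b
    if abs(denom) < 1e-12:            # near-parallel rays: no stable intersection
        return None
    t1 = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
    t2 = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

def predicted_percept(virtual_point, cop_left, cop_right, viewer_left, viewer_right):
    """Ray-intersection prediction: where a displaced viewer should perceive a point
    that was rendered in stereo for a viewer standing at the center of projection."""
    # On-screen image points as rendered for the leader at the CoP.
    img_left = project_to_screen(cop_left, virtual_point)
    img_right = project_to_screen(cop_right, virtual_point)
    # The follower's eyes cast rays through those same on-screen points.
    return closest_point_between_rays(viewer_left, img_left - viewer_left,
                                      viewer_right, img_right - viewer_right)

# Illustrative numbers only: screen at z = 0, leader 1.5 m in front of it,
# follower displaced 0.5 m further back, virtual point 1 m behind the screen.
ipd = 0.065                                   # interpupillary distance (m)
offset = np.array([ipd / 2, 0.0, 0.0])
cop = np.array([0.0, 1.6, -1.5])              # leader's eye midpoint (center of projection)
follower = cop + np.array([0.0, 0.0, -0.5])   # follower stands behind the leader
point = np.array([0.0, 1.6, 1.0])             # virtual point behind the screen plane

percept = predicted_percept(point, cop - offset, cop + offset,
                            follower - offset, follower + offset)
print(percept)  # predicted (distorted) 3D location of the point for the follower
```

With these example numbers, stepping 0.5 m behind the CoP pushes the predicted location of a point 1 m behind the screen out to roughly 1.33 m, illustrating the depth expansion for backward displacement that, per the abstract, was observed to be smaller in practice than this geometric model predicts.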
INDEX TERMS
virtual reality, computer displays, stereo image processing, user interfaces, stereoscopic displays, depth perception, stereoscopic multiuser virtual environment, stereoscopic depth cue, immersion, stereoscopic image, head-tracking device, center of projection, follower perspective, leader perspective, collaborative spatial judgment, virtual shape, forward displacement, backward displacement, perceived depth compression, perceived depth expansion, ray-intersection model, stereo geometry, distortion reduction strategy, Virtual environments, Predictive models, Shape, Collaboration, Educational institutions, collaborative interaction, Perception, stereoscopy
CITATION
B. Pollock, M. Burton, J. W. Kelly, S. Gilbert, and E. Winer, "The Right View from the Wrong Location: Depth Perception in Stereoscopic Multi-User Virtual Environments," IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 4, pp. 581-588, April 2012, doi:10.1109/TVCG.2012.58
REFERENCES
[1] M. Banks, R. Held, and A. Girshick, "Perception of 3-D Layout in Stereo Displays," Information Display, vol. 25, no. 1, pp. 12-16, 2009.
[2] R. T. Held and M. S. Banks, "Misperceptions in Stereoscopic Displays: A Vision Science Perspective," Proc. of the 5th Symp. on Applied Perception in Graphics and Visualization (APGV'08), ACM, 2008.
[3] A. Woods, T. Docherty, and R. Koch, "Image Distortions in Stereoscopic Video Systems," SPIE Proc. Stereoscopic Displays and Applications IV, vol. 1915, pp. 36-48, 1993.
[4] M. Burton, B. Pollock, J. W. Kelly, S. Gilbert, and E. Winer, "Diagnosing perceptual distortion present in group stereoscope viewing," Proc. SPIE, Human Vision and Electronic Imaging XVII, 2012.
[5] D. Vishwanath, A. Girshick, and M. Banks, "Why Pictures Look Right When Viewed from the Wrong Place," Nature Neuroscience, vol. 8, no. 10, pp. 1401-1410, 2005.
[6] M. H. Pirenne, Optics, painting, and photography. Cambridge, England: Cambridge University Press, 1970.
[7] M. Polanyi, "What is a painting?" British Journal of Aesthetics, vol. 10, pp. 225-236, 1970.
[8] D. W. Eby and M. L. Braunstein, "The perceptual flattening of three-dimensional scenes enclosed by a frame," Perception, vol. 24, pp. 981-993, 1995.
[9] J. Loomis and J. Knapp, "Visual Perception of Egocentric Distance in Real and Virtual Environments," Virtual and Adaptive Environments: Applications, Implications, and Human Performance Issues, 2003.
[10] J. Marbach, "Image Blending and View Clustering for Multi-Viewer Immersive Projection Environments," Proc. of IEEE Virtual Reality Conference, 2009.
[11] B. Bodenheimer, J. Meng, H. Wu, G. Narasimham, B. Rump, T. P. McNamara, T. H. Carr, and J. J. Rieser, "Distance Estimation in Virtual and Real Environments Using Bisection," Proc. of the 4th Symp. on Applied Perception in Graphics and Visualization, 2007.
[12] J. M. Knapp and J. M. Loomis, "Limited Field of View of Head-Mounted Displays Is Not the Cause of Distance Underestimation in Virtual Environments," Presence: Teleoperators and Virtual Environments, vol. 13, no. 5, pp. 572-577, 2004.
[13] S. A. Kuhl, W. B. Thompson, and S. H. Creem-Regehr, "HMD Calibration and its Effects on Distance Judgments," ACM Trans. Appl. Percept., vol. 6, no. 3, art. 19, pp. 1-20, 2009.
[14] R. Messing and F. H. Durgin, "Distance Perception and the Visual Horizon in Head-Mounted Displays," ACM Trans. Appl. Percept., vol. 2, no. 3, pp. 234-250, 2005.
[15] F. Steinicke, G. Bruder, B. Ries, K. H. Hinrichs, M. Lappe, and V. Interrante, "Transitional Environments Enhance Distance Perception in Immersive Virtual Reality Systems," Proc. of Symp. on Applied Perception in Graphics and Visualization, 2009.
[16] W. B. Thompson, P. Willemsen, A. A. Gooch, S. H. Creem-Regehr, J. M. Loomis, and A. C. Beall, "Does the Quality of the Computer Graphics Matter when Judging Distances in Visually Immersive Environments?," Presence: Teleoperators and Virtual Environments, vol. 13, no. 5, pp. 560-571, 2004.
[17] A. A. Gooch and P. Willemsen, "Evaluating Space Perception in NPR Immersive Environments," Proc. of the 2nd Int. Symp. on Non-Photorealistic Animation and Rendering (NPAR'02), ACM, 2002.
[18] B. G. Witmer and W. J. Sadowski, "Nonvisually Guided Locomotion to a Previously Viewed Target in Real and Virtual Environments," Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 50, no. 3, pp. 478-488, 1998.
[19] J. M. Plumert, J. K. Kearney, and J. F. Cremer, "Children's Perception of Gap Affordances: Bicycling Across Traffic-Filled Intersections in an Immersive Virtual Environment," Child Development, vol. 75, no. 4, pp. 1243-1253, 2004.
[20] C. Ziemer, J. Plumert, J. Cremer, and J. Kearney, "Making Distance Judgments in Real and Virtual Environments: Does Order Make a Difference?," Proc. of the 3rd Symp. on Applied Perception in Graphics and Visualization (APGV'06), ACM, 2006.
[21] B. E. Riecke, P. A. Behbahani, and C. D. Shaw, "Display Size Does not Affect Egocentric Distance Perception of Naturalistic Stimuli," Proc. of the 6th Symp. on Applied Perception in Graphics and Visualization (APGV'09), ACM, 2009.
[22] H. H. Clark and C. E. Marshall, "Definite reference and mutual knowledge," in A. K. Joshi, B. L. Weber, and I. A. Sag (Eds.), Elements of Discourse Understanding, Cambridge, UK: Cambridge University Press, pp. 10-63, 1981.
[23] H. H. Clark and D. Wilkes-Gibbs, "Referring as a collaborative process," Cognition, vol. 22, no. 1, pp. 1-39, 1986.
[24] J. W. Kelly, A. C. Beall, and J. M. Loomis, "Perception of shared visual space: Establishing common ground in real and virtual environments," Presence: Teleoperators and Virtual Environments, vol. 13, no. 4, pp. 442-450, 2004.
[25] R. E. Kraut, S. R. Fussell, and J. Siegel, "Visual information as a conversational resource in collaborative physical tasks," Human-Computer Interaction, vol. 18, no. 1, pp. 13-49, 2003.
[26] G. Olson and J. Olson, "Distance Matters," Human Computer Interaction, vol. 15, no. 2/3, pp. 139-179, 2000.
[27] C. Cruz-Neira, A. Bierbaum, P. Hartling, C. Just, and K. Meinert, "VR Juggler - An Open Source platform for virtual reality applications," 40th AIAA Aerospace Sciences Meeting & Exhibit, 2002.
[28] G. Keppel and T. D. Wickens, Design and Analysis: A Researcher's Handbook, 4th Edition. Upper Saddle River, NJ: Pearson Prentice Hall, 2004.
[29] J. R. Levin and E. Neumann, "Testing for Predicted Patterns: When Interest in the Whole is Greater Than in Some of its Parts," Psychological Methods, vol. 4, no. 1, pp. 44-57, 1999.
[30] B. Wu, T. L. Ooi, and Z. J. He, "Perceiving distance accurately by a directional process of integrating ground information," Nature, vol. 428, pp. 73-77, 2004.
[31] T. L. Ooi, B. Wu, and Z. J. He, "Distance determined by the angular declination below the horizon," Nature, vol. 414, pp. 197-200, 2001.
[32] J. Campos, P. Freitas, E. Turner, M. Wong, and H.-J. Sun, "The effect of optical magnification/minimization on distance estimation by stationary and walking observers," Journal of Vision, vol. 7, no. 9, 2007.
[33] G. Bruder, F. Steinicke, C. Walter, and M. Moehring, "Evaluation of Field of View Calibration Techniques for Head-mounted Displays," ACM Symposium on Applied Perception in Graphics and Visualization. ACM Press, 2011.
[34] J. M. Hillis, S. J. Watt, M. A. Landy, and M. S. Banks, "Slant from texture and disparity cues: Optimal cue combination," Journal of Vision, vol. 4, pp. 967-992, 2004.
[35] D. C. Knill and J. Saunders, "Do humans optimally integrate stereo and texture information for judgments of surface slant?" Vision Research, vol. 43, no. 24, pp. 2539-58, 2003.