Orlando, Florida
Mar. 24, 2002 to Mar. 28, 2002
ISBN: 0-7695-1492-8
p. 149
R.S. Allison , York University
L.R. Harris , York University
U.T. Jasiobedzka , York University
H.L. Jenkin , York University
M.R. Jenkin , York University
J.E. Zacher , York University
D.C. Zikovitz , York University
Virtual reality displays introduce spatial distortions that are very hard to correct because of the difficulty of precisely modelling the camera from the nodal point of each eye. How significant are these distortions for spatial perception in virtual reality? In this study we used a helmet-mounted display and a mechanical head tracker to investigate tolerance to errors between head motions and the resulting visual display. Subjects adjusted the relationship between head movement and the associated updating of the visual display until the image was judged as stable relative to the world. Both rotational and translational movements were tested, and the relationship between the movements and the direction of gravity was varied systematically. Typically, for the display to be judged as stable, subjects needed the visual world to be moved in the direction opposite to the head movement by an amount greater than the head movement itself, during both rotational and translational head movements, although a large range of movement was tolerated and judged as appearing stable. These results suggest that it is not necessary to model the visual geometry accurately, and they identify circumstances in which tracker drift can be corrected by jumps in the display that will pass unnoticed by the user.
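The manipulated variable here is the visual gain: the ratio between the compensating scene motion and the measured head motion. The sketch below is a minimal illustration of that relationship, not the authors' implementation; the function and parameter names are hypothetical. A gain of 1.0 counter-rotates the scene exactly opposite the head movement (a geometrically stable world), while the reported finding is that gains somewhat above 1.0 were typically judged as stable.

```python
def display_update(head_delta_deg: float, gain: float = 1.0) -> float:
    """Return the scene rotation (degrees) compensating a head rotation.

    head_delta_deg: measured change in head orientation (degrees).
    gain: ratio of visual compensation to head movement (hypothetical
          parameter; values > 1.0 over-compensate, moving the world
          opposite the head by more than the head moved).
    """
    # The scene moves opposite to the head, scaled by the gain.
    return -gain * head_delta_deg

# Unity gain: a 10 deg head turn yields a -10 deg scene rotation,
# leaving the scene world-fixed; gain 1.2 over-rotates the scene.
print(display_update(10.0, 1.0))
print(display_update(10.0, 1.2))
```

The same scaling applies to translational movements; the paper's point is that the perceptually acceptable band of gains is wide enough that small corrective jumps (e.g. for tracker drift) can be hidden inside it.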
R.S. Allison, L.R. Harris, U.T. Jasiobedzka, H.L. Jenkin, M.R. Jenkin, J.E. Zacher, D.C. Zikovitz, "Perceptual Stability During Head Movement in Virtual Reality", Proceedings of IEEE Virtual Reality 2002 (VR 2002), p. 149, doi:10.1109/VR.2002.996517