<p><b>Abstract</b>—A fundamental problem in optical, see-through augmented reality (AR) is characterizing how it affects the perception of spatial layout and depth. This problem is important because AR system developers need both to place graphics in arbitrary spatial relationships with real-world objects and to know that users will perceive them in those same relationships. Furthermore, AR makes possible enhanced perceptual techniques that have no real-world equivalent, such as <it>x-ray vision</it>, where AR users are supposed to perceive graphics as being located behind opaque surfaces. This paper reviews and discusses protocols for measuring egocentric depth judgments in both virtual and augmented environments, and discusses the well-known problem of depth underestimation in virtual environments. It then describes two experiments that measured egocentric depth judgments in AR. Experiment I used a perceptual matching protocol to measure AR depth judgments at medium- and far-field distances of 5 to 45 meters. The experiment studied the effects of upper versus lower visual field location, the x-ray vision condition, and practice on the task. The experimental findings include evidence for a switch in bias, from underestimating to overestimating the distance of AR-presented graphics, at <tmath>\sim</tmath>23 meters, as well as a quantification of how much more difficult the x-ray vision condition makes the task. Experiment II used blind walking and verbal report protocols to measure AR depth judgments at distances of 3 to 7 meters. The experiment examined real-world objects, real-world objects seen through the AR display, virtual objects, and combined real and virtual objects. The results give evidence that the egocentric depth of AR objects is underestimated at these distances, but to a lesser degree than has previously been found for most virtual reality environments.
The results are consistent with previous studies that have implicated a restricted field of view, combined with observers' inability to scan the ground plane in a near-to-far direction, as an explanation for the observed depth underestimation.</p>
Artificial, augmented, and virtual realities, ergonomics, evaluation/methodology, screen design, experimentation, measurement, performance, depth perception, optical see-through augmented reality.

J. E. Swan II, E. Kolstad, H. S. Smallman, A. Jones and M. A. Livingston, "Egocentric Depth Judgments in Optical, See-Through Augmented Reality," in IEEE Transactions on Visualization & Computer Graphics, vol. 13, pp. 429-442, 2007.