Ninth IEEE International Symposium on Wearable Computers (ISWC'05) (2005)
Osaka, Japan
Oct. 18, 2005 to Oct. 21, 2005
ISBN: 0-7695-2419-2
pp: 92-99
Jason Wither , University of California, Santa Barbara
Tobias Höllerer , University of California, Santa Barbara
<p>This paper presents and evaluates a set of pictorial depth cues for far-field outdoor mobile augmented reality (AR). We examine the problem of accurately placing virtual annotations at physical target points from a static point of view. While it is easy to line up annotations with a target point's projection in the view plane, finding the correct distance for the annotation is difficult if the target point is not represented in an environment model. We have found that AR depth cues, such as vertical and horizontal shadow planes, a small top-down map, or color encodings of relative depth, have a positive impact on a user's ability to align a 3D cursor with physical objects at various distances. These cues aid the user's depth perception and estimation by providing information about the 3D cursor's distance and its relationship in 3-space to any features that may already have been annotated. We conducted a user study that measures the effects of different depth cues for both absolute 3D cursor placement as well as placement relative to a small number of marked reference points, whose distances are known. Our study provides insight about mobile AR users' ability to judge distances both absolutely and relatively, and we identify techniques that successfully enhance their performance.</p>

J. Wither and T. Höllerer, "Pictorial Depth Cues for Outdoor Augmented Reality," Ninth IEEE International Symposium on Wearable Computers (ISWC'05), Osaka, Japan, 2005, pp. 92-99.