Issue No. 03 - May/June (2006 vol. 26)
ISSN: 0272-1716
pp: 62-71
Matthew Turk , University of California, Santa Barbara
Mathias Koelsch , Naval Postgraduate School
Tobias Hoellerer , University of California, Santa Barbara
Ryan Bane , Microsoft Corporation
ABSTRACT
Wearable computers and their novel applications demand more context-specific user interfaces than traditional desktop paradigms can offer. This article describes a multimodal interface and explains how it enhances a mobile user's situational awareness and provides new functionality. This mobile, augmented-reality system visualizes otherwise invisible information encountered in urban environments. A versatile filtering tool allows interactive display of occluded infrastructure and of dense data distributions, such as room temperature or wireless network strength, with applications in building maintenance, emergency response, and reconnaissance missions. To control this complex application functionality in the real world, the authors combine multiple input modalities (vision-based hand gesture recognition, a 1D tool, and speech recognition) with three late integration styles to provide intuitive and effective means of input. The system is demonstrated in a realistic indoor and outdoor task environment, and preliminary user experiences are described. The authors postulate that novel interaction metaphors must be developed together with user interfaces capable of controlling them.
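As an illustration only (the abstract does not specify the article's three integration styles), a minimal sketch of late integration might combine independently recognized hypotheses from each modality by confidence-weighted voting. All modality names, weights, and the Hypothesis type below are hypothetical, not taken from the article.

    # Illustrative sketch: generic late fusion of multimodal input hypotheses.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Hypothesis:
        command: str       # e.g. "select", "filter", "zoom" (hypothetical labels)
        confidence: float   # recognizer confidence in [0, 1]

    def late_fusion(per_modality: Dict[str, List[Hypothesis]],
                    weights: Dict[str, float]) -> str:
        """Combine hypotheses recognized separately per modality
        (e.g. gesture, speech, 1D tool) by confidence-weighted voting."""
        scores: Dict[str, float] = {}
        for modality, hypotheses in per_modality.items():
            w = weights.get(modality, 1.0)
            for h in hypotheses:
                scores[h.command] = scores.get(h.command, 0.0) + w * h.confidence
        # The command with the highest combined score wins.
        return max(scores, key=scores.get)

    # Hypothetical usage: speech and gesture agree on "filter", so it wins.
    decision = late_fusion(
        {"speech":  [Hypothesis("filter", 0.8), Hypothesis("zoom", 0.1)],
         "gesture": [Hypothesis("filter", 0.6)],
         "tool":    [Hypothesis("zoom", 0.5)]},
        weights={"speech": 1.0, "gesture": 1.0, "tool": 0.5},
    )
    print(decision)  # -> "filter"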
INDEX TERMS
augmented reality, wearable computing, multimodal interface, hand-gesture recognition, information visualization
CITATION
Matthew Turk, Mathias Koelsch, Tobias Hoellerer, Ryan Bane, "Multimodal Interaction with a Wearable Augmented Reality System", IEEE Computer Graphics and Applications, vol. 26, no. 3, pp. 62-71, May/June 2006, doi:10.1109/MCG.2006.66