Multimodal Interaction with a Wearable Augmented Reality System
May/June 2006 (vol. 26, no. 3)
pp. 62-71
Mathias Kölsch, Naval Postgraduate School
Ryan Bane, Microsoft Corporation
Tobias Höllerer, University of California, Santa Barbara
Matthew Turk, University of California, Santa Barbara
Wearable computers and their novel applications demand more context-specific user interfaces than traditional desktop paradigms can offer. This article describes a multimodal interface, explains how it enhances a mobile user's situational awareness, and shows how it provides new functionality. The mobile augmented reality system visualizes otherwise invisible information encountered in urban environments. A versatile filtering tool allows interactive display of occluded infrastructure and of dense data distributions, such as room temperature or wireless network strength, with applications in building maintenance, emergency response, and reconnaissance missions. To control this complex application functionality in the real world, the authors combine multiple input modalities (vision-based hand gesture recognition, a 1D tool, and speech recognition) with three late integration styles to provide intuitive and effective means of input. The system is demonstrated in realistic indoor and outdoor task environments, and preliminary user experiences are described. The authors postulate that novel interaction metaphors must be developed together with user interfaces capable of controlling them.
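The late-integration approach mentioned in the abstract keeps each recognizer (hand gestures, the 1D tool, speech) independent and fuses only their outputs, after recognition. The following Python sketch is purely illustrative and is not the authors' implementation: the ModalityEvent fields, the 1.5-second grouping window, and the winner-take-all rule are all assumptions made for this example. It shows only the basic shape of late integration, where scored unimodal tokens are combined into commands downstream of the individual recognizers.

"""Illustrative sketch of late integration for multimodal input.

Hypothetical example, not from the paper: each modality is recognized
independently, and fusion happens afterward over the scored unimodal
outputs, which is the defining property of late integration.
"""

from dataclasses import dataclass


@dataclass
class ModalityEvent:
    """One recognizer's output: which modality fired, what token it
    decoded, how confident it is, and when it occurred (seconds)."""
    modality: str      # e.g. "gesture", "speech", "1d_tool"
    token: str         # decoded command symbol, e.g. "zoom_in"
    confidence: float  # recognizer score in [0, 1]
    timestamp: float


def fuse_events(events, window=1.5):
    """Late integration by temporal grouping: events from different
    modalities within `window` seconds of a group's start are treated
    as one multimodal utterance; the highest-confidence token in each
    group wins. The window length is an assumption for illustration.
    """
    events = sorted(events, key=lambda e: e.timestamp)
    groups, current = [], []
    for ev in events:
        if current and ev.timestamp - current[0].timestamp > window:
            groups.append(current)
            current = []
        current.append(ev)
    if current:
        groups.append(current)
    # Pick the most confident interpretation in each temporal group.
    return [max(g, key=lambda e: e.confidence) for g in groups]


if __name__ == "__main__":
    stream = [
        ModalityEvent("speech", "zoom_in", 0.72, 10.0),
        ModalityEvent("gesture", "zoom_in", 0.91, 10.4),   # redundant, higher score
        ModalityEvent("1d_tool", "depth_plus", 0.88, 14.2),
    ]
    for cmd in fuse_events(stream):
        print(cmd.modality, "->", cmd.token)

A real system would go further, for example resolving conflicts between disagreeing modalities and exploiting redundancy (speech and gesture issuing the same command) to raise overall confidence; this sketch simply takes the best-scoring interpretation per time window.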


Index Terms:
augmented reality, wearable computing, multimodal interface, hand-gesture recognition, information visualization
Citation:
Mathias Kölsch, Ryan Bane, Tobias Höllerer, Matthew Turk, "Multimodal Interaction with a Wearable Augmented Reality System," IEEE Computer Graphics and Applications, vol. 26, no. 3, pp. 62-71, May/June 2006, doi:10.1109/MCG.2006.66