Issue No. 3 - May/June 2006 (vol. 26)
pp. 62-71
Mathias Kölsch , Naval Postgraduate School
Ryan Bane , Microsoft Corporation
Tobias Höllerer , University of California, Santa Barbara
Matthew Turk , University of California, Santa Barbara
ABSTRACT
Wearable computers and their novel applications demand more context-specific user interfaces than traditional desktop paradigms can offer. This article describes a multimodal interface, explaining how it enhances a mobile user's situational awareness and provides new functionality. The mobile augmented-reality system visualizes otherwise invisible information encountered in urban environments. A versatile filtering tool interactively displays occluded infrastructure and dense data distributions, such as room temperature or wireless network strength, with applications in building maintenance, emergency response, and reconnaissance missions. To control this complex functionality in the real world, the authors combine multiple input modalities (vision-based hand gesture recognition, a 1D tool, and speech recognition) with three late integration styles, providing intuitive and effective input. The system is demonstrated in realistic indoor and outdoor task environments, and preliminary user experiences are reported. The authors postulate that novel interaction metaphors must be developed together with user interfaces capable of controlling them.
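The late-integration idea described above can be made concrete with a short sketch: each recognizer (gesture, speech, 1D tool) runs independently and emits symbolic command hypotheses with confidence scores, and a fusion step combines only these outputs. Everything below is an illustrative assumption, not the authors' implementation; the event fields, the 0.5 s window, and the confidence-summing rule are invented for the example, and the paper's three integration styles are richer than this single voting scheme.

from dataclasses import dataclass
import time

# Hypothetical event type: each recognizer independently emits a
# command hypothesis with its own confidence score.
@dataclass
class ModalityEvent:
    modality: str      # e.g., "gesture", "speech", "1d-tool"
    command: str       # e.g., "select", "zoom", "filter"
    confidence: float  # recognizer's score in [0, 1]
    timestamp: float   # seconds since the epoch

def fuse_late(events, window=0.5):
    """Late-integration sketch: keep events within a short temporal
    window of the newest event, then pick the command whose summed
    confidence across modalities is highest."""
    if not events:
        return None
    newest = max(e.timestamp for e in events)
    recent = [e for e in events if newest - e.timestamp <= window]
    scores = {}
    for e in recent:
        scores[e.command] = scores.get(e.command, 0.0) + e.confidence
    return max(scores, key=scores.get)

# Usage: speech and gesture agree on "filter", so it outvotes "select".
now = time.time()
events = [
    ModalityEvent("speech",  "filter", 0.7, now),
    ModalityEvent("gesture", "filter", 0.6, now - 0.2),
    ModalityEvent("gesture", "select", 0.8, now - 0.1),
]
print(fuse_late(events))  # -> "filter"

Because only symbolic recognizer outputs are fused, any modality can be swapped out without retraining the others; that decoupling is the usual argument for late over early integration.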
INDEX TERMS
augmented reality, wearable computing, multimodal interface, hand-gesture recognition, information visualization
CITATION
Mathias Kölsch, Ryan Bane, Tobias Höllerer, Matthew Turk, "Multimodal Interaction with a Wearable Augmented Reality System," IEEE Computer Graphics and Applications, vol. 26, no. 3, pp. 62-71, May/June 2006, doi:10.1109/MCG.2006.66