Issue No. 1, Jan.-March 2012 (vol. 5)
ISSN: 1939-1412
pp: 33-38
Brian T. Gleeson , University of British Columbia, Vancouver
Rebecca L. Koslover , University of Utah, Salt Lake City
William R. Provancher , University of Utah, Salt Lake City
Joshua T. de Bever , University of Utah, Salt Lake City
This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device intended to facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized and exhibit little variance in stimulus onset timing. A series of user experiments using navigational cues validates the function of the device and investigates user response to all stimulus modes. Results show that users can interpret all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks, and accuracy and response-time patterns were similar in both conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.
Keywords: Consumer products, haptic device design, haptic I/O, real-time and embedded systems, multimodal feedback.
Brian T. Gleeson, Rebecca L. Koslover, William R. Provancher, Joshua T. de Bever, "Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform", IEEE Transactions on Haptics, vol. 5, no. 1, pp. 33-38, Jan.-March 2012, doi:10.1109/TOH.2011.58