Issue No. 01, Jan.-March 2012 (vol. 5)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TOH.2011.58
Rebecca L. Koslover , University of Utah, Salt Lake City
Brian T. Gleeson , University of British Columbia, Vancouver
Joshua T. de Bever , University of Utah, Salt Lake City
William R. Provancher , University of Utah, Salt Lake City
This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.
Index Terms: Consumer products, haptic device design, haptic I/O, real-time and embedded systems, multimodal feedback.
B. T. Gleeson, R. L. Koslover, W. R. Provancher and J. T. de Bever, "Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform," in IEEE Transactions on Haptics, vol. 5, no. 1, pp. 33-38, Jan.-March 2012.