IEEE Transactions on Haptics, vol. 5, no. 1, Jan.-March 2012, pp. 33-38
Rebecca L. Koslover , University of Utah, Salt Lake City
Brian T. Gleeson , University of British Columbia, Vancouver
Joshua T. de Bever , University of Utah, Salt Lake City
William R. Provancher , University of Utah, Salt Lake City
This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.
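The abstract describes rendering directional navigation cues on a handheld device. As a minimal illustrative sketch (not the authors' implementation), a navigation layer might quantize a desired travel bearing into one of four discrete cue directions before rendering it as a skin-stretch, vibrotactile, audio, or visual stimulus; the cue set and mapping below are assumptions for illustration only.

```python
# Hypothetical sketch: map a desired travel bearing to one of four
# discrete direction cues, as a multimodal handheld platform might do
# before rendering the cue in its chosen stimulus mode.
# The four-cue set and the 0-degrees-is-forward convention are assumptions.
CUES = ["forward", "right", "back", "left"]  # assumed cue vocabulary


def bearing_to_cue(bearing_deg: float) -> str:
    """Quantize a bearing in degrees (0 = forward, clockwise positive)
    to the nearest of the four cue directions."""
    index = round((bearing_deg % 360) / 90) % 4
    return CUES[index]
```

For example, a bearing of 10 degrees would map to the "forward" cue and 95 degrees to "right"; boundary bearings such as exactly 45 degrees would need a tie-breaking convention in a real system.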
Index Terms: Consumer products, haptic device design, haptic I/O, real-time and embedded systems, multimodal feedback.
Rebecca L. Koslover, Brian T. Gleeson, Joshua T. de Bever, and William R. Provancher, "Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform," IEEE Transactions on Haptics, vol. 5, no. 1, pp. 33-38, Jan.-March 2012, doi:10.1109/TOH.2011.58
