Issue No. 2, April-June 2009 (vol. 2), pp. 61-72
Thomas Pietrzak, University Paul Verlaine - Metz, Metz
Andrew Crossan, University of Glasgow, Glasgow
Stephen A. Brewster, University of Glasgow, Glasgow
Benoît Martin, University Paul Verlaine - Metz, Metz
Isabelle Pecci, University Paul Verlaine - Metz, Metz
Spatial information can be difficult to present to a visually impaired computer user. In this paper, we examine a new kind of tactile cuing for nonvisual interaction as a potential solution, building on earlier work on vibrotactile Tactons. Unlike vibrotactile Tactons, however, ours use a pin array to stimulate the fingertip. We first describe how to design static and dynamic Tactons by defining their basic components. We then present user tests examining how easily different forms of pin array Tacton can be distinguished, identifying Tacton sets that accurately represent direction. These experiments demonstrate usable static, wave, and blinking pin array Tacton sets for guiding a user in one of eight directions. A further study shows the benefit of structuring Tactons to convey information through multiple independent parameters of the signal: participants perceived more information from a single Tacton when parameters were combined in this way. Finally, we present two applications that use these Tactons, a maze exploration application and an electric circuit exploration application, designed for and tested with visually impaired users.
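To make the idea concrete, the following is an illustrative sketch (not the authors' implementation) of how a static directional Tacton might be encoded as a bitmap for a small pin array. The 4x4 grid size, the choice of centre cell, and the line-tracing scheme are all assumptions for illustration; the paper's actual Tacton designs are defined in its experiments.

```python
"""Hypothetical encoding of static pin array Tactons for eight directions
on an assumed 4x4 pin display. 1 = pin raised, 0 = pin lowered."""

# Unit steps for the eight compass directions (row index grows downward).
DIRECTIONS = {
    "N": (-1, 0), "NE": (-1, 1), "E": (0, 1), "SE": (1, 1),
    "S": (1, 0),  "SW": (1, -1), "W": (0, -1), "NW": (-1, -1),
}

SIZE = 4  # assumed 4x4 pin matrix

def static_tacton(direction):
    """Return a SIZE x SIZE bitmap tracing a line of raised pins from a
    centre cell of the array towards the given direction."""
    dr, dc = DIRECTIONS[direction]
    grid = [[0] * SIZE for _ in range(SIZE)]
    r, c = 1, 1  # a centre cell of the even-sized array (an assumption)
    while 0 <= r < SIZE and 0 <= c < SIZE:
        grid[r][c] = 1
        r, c = r + dr, c + dc
    return grid

def render(grid):
    """ASCII preview: '#' marks a raised pin, '.' a lowered one."""
    return "\n".join("".join("#" if p else "." for p in row) for row in grid)

if __name__ == "__main__":
    print(render(static_tacton("NE")))
```

A dynamic Tacton (the paper's wave or blinking variants) could then be represented as a timed sequence of such frames, and a second independent parameter (for example, blinking versus static presentation) layered on top to carry additional information in the same Tacton.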
User interfaces, human factors, haptic I/O, computer-assisted instruction.
Thomas Pietrzak, Andrew Crossan, Stephen A. Brewster, Benoît Martin, Isabelle Pecci, "Creating Usable Pin Array Tactons for Nonvisual Information," IEEE Transactions on Haptics, vol. 2, no. 2, pp. 61-72, April-June 2009, doi:10.1109/TOH.2009.6