Issue No. 3, July-September 2009 (vol. 2)
pp. 123-135
Karon E. MacLean, University of British Columbia, Vancouver
ABSTRACT
In an attentionally overloaded world, relief will come only from interfaces between humans and computation that are able to provide information in the background of our sensory and cognitive processes. Haptic displays may have a special role to play in this emerging movement toward ambient interfaces, because the touch sense is well suited to present many types of information in a way that treads lightly on our mental resources. This paper offers an introduction to the notion of ambient information display, and explores why and how the haptic channel could contribute. It begins with a discussion of the attentional problems posed by contemporary interface technology, and a broad overview of ambient interfaces themselves: their purpose, specification, features, and some general examples. Sense is made of the haptic ambient design space through a morphology of the functionality and social configurations exhibited by existing and envisioned examples. Finally, reflections on design principles and challenges for ambient haptic interfaces are aimed at inspiring, shaping, and informing future development in this area.
INDEX TERMS
Haptic I/O, human computer interaction (HCI), human information processing, ambient interfaces.
CITATION
Karon E. MacLean, "Putting Haptics into the Ambience," IEEE Trans. Haptics, vol. 2, no. 3, pp. 123-135, July-Sept. 2009, doi:10.1109/TOH.2009.33
REFERENCES
[1] E. Horvitz, C. Kadie, T. Paek, and D. Hovel, “Models of Attention in Computing and Communication: From Principles to Applications,” Comm. ACM, vol. 46, no. 3, pp. 52-59, 2003.
[2] E. Cutrell, M. Czerwinski, and E. Horvitz, “Notification, Disruption and Memory: Effects of Messaging Interruptions on Memory and Performance,” Proc. Conf. Human Computer Interaction (INTERACT '01), M. Hirose, ed., pp. 263-269, 2001.
[3] J. Fogarty, S.E. Hudson, C.G. Atkeson, D. Avrahami, J. Forlizzi, S. Kiesler, J.C. Lee, and J. Yang, “Predicting Human Interruptibility with Sensors,” ACM Trans. Computer-Human Interaction, vol. 12, no. 1, pp. 119-146, 2005.
[4] M. Weiser, “The Computer for the 21st Century,” Scientific Am., vol. 265, no. 3, pp. 94-110, 1991.
[5] M. Weiser and J. Brown, “Designing Calm Technology,” PowerGrid J., vol. 1.01, 1996.
[6] S. Hudson, J. Fogarty, C. Atkeson, D. Avrahami, J. Forlizzi, S. Kiesler, J. Lee, and J. Yang, “Predicting Human Interruptibility with Sensors: A Wizard of Oz Feasibility Study,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI '03), 2003.
[7] A. Oulasvirta, S. Tamminen, V. Roto, and J. Kuorelahti, “Interaction in 4-Second Bursts: The Fragmented Nature of Attentional Resources in Mobile HCI,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '05), CHI Letters, vol. 7, no. 1, pp. 919-928, 2005.
[8] H. Menzies, No Time: Stress and the Crisis of Modern Life. Douglas and McIntyre, 2005.
[9] M. Mazmanian, J. Yates, and W. Orlikowski, “Ubiquitous Email: Individual Experiences and Organizational Consequences of Blackberry Use,” Proc. 65th Ann. Meeting of the Academy of Management, 2006.
[10] R. Strong and B. Gaver, “Feather, Scent, and Shaker: Supporting Simple Intimacy,” Proc. ACM Conf. Computer Supported Cooperative Work (CSCW '96), 1996.
[11] H. Ishii, C. Wisneski, S. Brave, A. Dahley, M. Gorbet, B. Ullmer, and P. Yarin, “ambientROOM: Integrating Ambient Media with Architectural Space,” Proc. CHI 1998 Conf. Summary on Human Factors in Computing Systems, pp. 173-174, 1998.
[12] B. MacIntyre, E.D. Mynatt, S. Voida, K.M. Hansen, J. Tullio, and G.M. Corso, “Support for Multitasking and Background Awareness Using Interactive Peripheral Displays,” Proc. 14th Ann. ACM Symp. User Interface Software and Technology (UIST '01), pp. 41-50, 2001.
[13] D. Hindus, S.D. Mainwaring, N. Leduc, A.E. Hagström, and O. Bayley, “Casablanca: Designing Social Communication Devices for the Home,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI '01), pp. 325-332, 2001.
[14] A.F. Monk, “Simple, Social, Ethical and Beautiful: Requirements for UIs in the Home,” Proc. Ninth Conf. Australasian User Interface, vol. 76, pp. 3-9, 2008.
[15] J.M. Heiner, S.E. Hudson, and K. Tanaka, “The Information Percolator: Ambient Information Display in a Decorative Object,” Proc. 12th Ann. ACM Symp. User Interface Software and Technology (UIST '99), pp. 141-148, 1999.
[16] P. Wright, J. Wallace, and J. McCarthy, “Aesthetics and Experience-Centred Design,” ACM Trans. Computer-Human Interaction (TOCHI), special issue on aesthetics of interaction, vol. 15, no. 4, pp. 337-346, 2008.
[17] J. Dewey, Art as Experience. Perigee, 1934.
[18] J. Wallace, D. Jackson, C. Ladha, P. Olivier, A. Monk, M. Blythe, and P. Wright, “Digital Jewellery and Family Relationships,” Proc. Family and Comm. Technologies Workshop (FACT '07), 2007.
[19] P. Olivier and J. Wallace, “Digital Technologies and the Emotional Family,” Int'l J. Human-Computer Studies, vol. 67, no. 2, pp. 204-214, 2009.
[20] Panasonic, The Emo Bracelet, 2007.
[21] B.J. Fogg, Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann, 2002.
[22] J.E. Froehlich, T. Dillahunt, P. Klasnja, J. Mankoff, S. Consolvo, B. Harrison, and J.A. Landay, “Ubigreen: Investigating a Mobile Tool for Tracking and Supporting Green Transportation Habits,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 1043-1052, 2009.
[23] C. Smith, “Wilting Flower,” http://www.coroflot.com/, 2008.
[24] A.M. Clarke, “Ambient and Pervasive Technology: Designing Safeguards for Vulnerable Users,” ACM Interactions, vol. 5, pp. 26-28, 2007.
[25] L.E. Holmquist, “Focus + Context Visualization with Flip Zooming and the Zoom Browser,” CHI '97 Extended Abstracts on Human Factors in Computing Systems, pp. 263-264, ACM Press, 1997.
[26] W. Gaver, “Auditory Icons: Using Sound in Computer Interfaces,” Human-Computer Interaction, vol. 2, pp. 167-177, 1986.
[27] R. Axelrod, The Evolution of Cooperation. Basic Books, 1984.
[28] BodyMedia Inc., “SenseWear WMS Components,” http://www.sensewear.com/, 2009.
[29] HowStuffWorks.com, “How the Nike+Ipod Works,” http://electronics.howstuffworks.com/nike-ipod1.htm, 2008.
[30] Apple Inc., “Nike+Ipod,” http://www.apple.com/ipod/nikerun.html, 2008.
[31] Immersion Corporation, “The Immersion I-Feel and Feel It Mouse, I-Force Game Controllers,” http://www.immersion.com/, 2000.
[32] Immersion Corporation, “Logitech Wingman Force Feedback Mouse,” http://www.immersion.com/, 1999.
[33] J. Dennerlein, D. Martin, and C. Hasser, “Force-Feedback Improves Performance for Steering and Combined Steering-Targeting Tasks,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI '00), pp. 423-429, 2000.
[34] T.N. Smyth and A.E. Kirkpatrick, “A New Approach to Haptic Augmentation of the GUI,” Proc. Eighth Int'l Conf. Multimodal Interfaces (ICMI '06), pp. 372-379, 2006.
[35] S.S. Snibbe, K.E. MacLean, R. Shaw, J.B. Roderick, W. Verplank, and M. Scheeff, “Haptic Metaphors for Digital Media,” Proc. ACM Symp. User Interface Software and Technology (UIST '01), pp. 199-208, 2001.
[36] A. Hoffman, D. Spelmezan, and J. Borchers, “Typeright: A Keyboard with Tactile Error Prevention,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 2265-2268, 2009.
[37] K.E. MacLean and M. Enriquez, “Perceptual Design of Haptic Icons,” Proc. EuroHaptics Conf., pp. 351-363, 2003.
[38] L.M. Brown, S.A. Brewster, and H.C. Purchase, “Multidimensional Tactons for Non-Visual Information Presentation in Mobile Devices,” Proc. Eighth Conf. Human-Computer Interaction with Mobile Devices and Services, A. Press, ed., pp. 231-238, 2006.
[39] K.E. MacLean, “Foundations of Transparency in Tactile Information Design,” IEEE Trans. Haptics, vol. 1, no. 2, pp. 84-95, July-Dec. 2008.
[40] J. Luk, J. Pasquero, S. Little, K. MacLean, V. Levesque, and V. Hayward, “A Role for Haptics in Mobile Interaction: Initial Design Using a Handheld Tactile Display Prototype,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '06), vol. 8, no. 1, pp. 171-180, 2006.
[41] R. Leung, K.E. MacLean, M.B. Bertelsen, and M. Saubhasik, “Evaluation of Haptically Augmented Touchscreen GUI Elements under Cognitive Load,” Proc. Ninth Int'l Conf. Multimodal Interfaces (ICMI '07), pp. 374-381, 2007.
[42] I. Poupyrev, S. Maruyama, and J. Rekimoto, “Ambient Touch: Designing Tactile Interfaces for Handheld Devices,” Proc. 15th Ann. ACM Symp. User Interface Software and Technology (UIST '02), pp. 51-60, 2002.
[43] E. Rukzio, M. Muller, and R. Hardy, “Design, Implementation and Evaluation of a Novel Public Display for Pedestrian Navigation: The Rotating Compass,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 113-122, 2009.
[44] S. Brewster, F. Chohan, and L. Brown, “Tactile Feedback for Mobile Interactions,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI '07), pp. 159-162, 2007.
[45] S. Töyssy, J. Raisamo, and R. Raisamo, “Telling Time by Vibration,” Proc. Conf. Eurohaptics, M. Ferre, ed., pp. 924-929, 2008.
[46] J. Williamson, R. Murray-Smith, and S. Hughes, “Devices as Interactive Physical Containers: The Shoogle System,” CHI '07 Extended Abstracts on Human Factors in Computing Systems, pp. 2013-2018, ACM Press, 2007.
[47] M. Enriquez, O. Afonin, B. Yager, and K.E. MacLean, “A Pneumatic Tactile Notification System for the Driving Environment,” Proc. Workshop Perceptive User Interfaces (PUI '01), pp. 1-7, 2001.
[48] P. Griffiths and R.B. Gillespie, “Shared Control between Human and Machine: Haptic Display of Automation during Manual Control of Vehicle Heading,” Proc. 12th Int'l Symp. Haptic Interfaces for Virtual Environment and Teleoperator Systems (HAPTICS '04), pp. 358-366, 2004.
[49] B. Forsyth and K.E. MacLean, “Predictive Haptic Guidance: Intelligent User Assistance for the Control of Dynamic Tasks,” IEEE Trans. Visualization and Computer Graphics, vol. 12, no. 1, pp. 103-113, Jan./Feb. 2006.
[50] M. Enriquez and K.E. MacLean, “Impact of Haptic Warning Signal Reliability in a Time-and-Safety-Critical Task,” Proc. 12th Ann. Symp. Haptic Interfaces for Virtual Environments and Teleoperator Systems (IEEE-VR '04), pp. 407-415, 2004.
[51] Immersion Corporation, “Immersion Automotive Interface Design,” http://www.immersion.com, 2001.
[52] G. Michelitsch, J. Williams, M. Osen, B. Jimenez, and S. Rapp, “Haptic Chameleon: A New Concept of Shape-Changing User Interface Controls with Force Feedback,” Proc. ACM Conf. Extended Abstracts on Human Factors in Computing Systems (CHI '04), pp. 1305-1308, 2004.
[53] H.Z. Tan, R. Gray, J.J. Young, and R. Traylor, “A Haptic Back Display for Attentional and Directional Cueing,” Haptics-e: The Electronic J. Haptics Research, vol. 3, no. 1, pp. 1-20, 2003.
[54] C. Ho, H.Z. Tan, and C. Spence, “Using Spatial Vibrotactile Cues to Direct Visual Attention in Driving Scenes,” Transportation Research Part F: Traffic Psychology and Behavior, vol. 8, pp. 397-412, 2005.
[55] A. Gallace, H.Z. Tan, and C. Spence, “The Body Surface as a Communication System: The State of the Art After 50 Years,” Presence: Teleoperators and Virtual Environments, vol. 16, no. 6, pp. 655-676, 2007.
[56] H.Z. Tan and A. Pentland, “Tactual Displays for Sensory Substitution and Wearable Computers,” Fundamentals of Wearable Computers and Augmented Reality, W. Barfield and T. Caudell, eds., pp. 579-598, Lawrence Erlbaum Assoc., 2001.
[57] J.B. van Erp, H.A. van Veen, C. Jansen, and T. Dobbins, “Waypoint Navigation with a Vibrotactile Waist Belt,” ACM Trans. Applied Perception, vol. 2, no. 2, pp. 106-117, 2005.
[58] C. Jansen, A. Wennemers, W. Vos, and E. Groen, “Flytact: A Tactile Display Improves a Helicopter Pilot's Landing Performance in Degraded Visual Environments,” Proc. Conf. Eurohaptics, M. Ferre, ed., pp. 867-875, 2008.
[59] E. Gunther and S. O'Modhrain, “Cutaneous Grooves: Composing for the Sense of Touch,” J. New Music Research, vol. 32, pp. 369-381, 2003.
[60] T. Shibata, T. Mitsui, K. Wada, A. Touda, T. Kumasaka, K. Tagami, and K. Tanie, “Mental Commit Robot and Its Application to Therapy of Children,” Proc. IEEE/ASME Int'l Conf. Advanced Intelligent Mechatronics (AIM '01), vol. 2, pp. 1053-1058, 2001.
[61] W.D. Stiehl, J. Lieberman, C. Breazeal, L. Basel, L. Lalla, and M. Wolf, “Design of a Therapeutic Robotic Companion for Relational, Affective Touch,” Proc. IEEE Int'l Workshop Robot and Human Interactive Comm. (ROMAN '05), pp. 408-415, 2005.
[62] S. Yohanan and K.E. MacLean, “The Haptic Creature Project: Social Human-Robot Interaction through Affective Touch,” Proc. Reign of Katz and Dogz, Second AISB Symp. Role of Virtual Creatures in a Computerised Soc. (AISB '08), pp. 7-11, 2008.
[63] S. Yohanan and K. MacLean, “A Tool to Study Affective Touch: Goals and Design of the Haptic Creature,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), work in progress, pp. 4153-4158, 2009.
[64] K.E. MacLean and J.B. Roderick, “Smart Tangible Displays in the Everyday World: A Haptic Door Knob,” Proc. IEEE/ASME Int'l Conf. Advanced Intelligent Mechatronics (AIM '99), pp. 203-208, 1999.
[65] A. Chan, K.E. MacLean, and J. McGrenere, “Designing Haptic Icons to Support Collaborative Turn-Taking,” Int'l J. Human Computer Studies, vol. 66, pp. 333-355, 2008.
[66] B. Fogg, L.D. Cutler, P. Arnold, and C. Eisbach, “Handjive: A Device for Interpersonal Haptic Entertainment,” Proc. Conf. Human Factors in Computing Systems, pp. 57-64, 1998.
[67] S. Brave and A. Dahley, “inTouch: A Medium for Haptic Interpersonal Communication,” Proc. Conf. Human Factors in Computing Systems (CHI '97), pp. 363-364, 1997.
[68] A. Chang, S. O'Modhrain, R. Jacob, E. Gunther, and H. Ishii, “Comtouch: Design of a Vibrotactile Communication Device,” Proc. Conf. Designing Interactive Systems (DIS '02), pp. 312-320, 2002.
[69] J. Smith and K.E. MacLean, “Communicating Emotion through a Haptic Link: Design Space and Methodology,” Int'l J. Human Computer Studies (IJHCS), special issue on affective evaluation-innovative approaches, vol. 65, no. 4, pp. 376-387, 2007.
[70] D. Vogel and R. Balakrishnan, “Interactive Public Ambient Displays: Transitioning from Implicit to Explicit, Public to Personal, Interaction with Multiple Users,” Proc. 17th Ann. ACM Symp. User Interface Software and Technology (UIST '04), pp. 137-146, 2004.
[71] Apollo Interactive, “Honda Creates Musical Roadway,” http://www.apollointeractive.com/blog/2008/09/22/honda-creates-musical-roadway, 2008.
[72] Y. Visell, J.R. Cooperstock, B.L. Giordano, K. Franinovic, A. Law, S. McAdams, K. Jathal, and F. Fontana, “A Vibrotactile Device for Display of Virtual Ground Materials in Walking,” Proc. Conf. EuroHaptics, M. Ferre, ed., pp. 420-426, 2008.
[73] K. Chung, C. Chiu, X. Xiao, and P.-Y. Chi, “Stress Outsourced: A Haptic Social Network via Crowdsourcing,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 2439-2448, 2009.
[74] D. DiFranco, G. Beauregard, and M. Srinivasan, “The Effect of Auditory Cues on the Haptic Perception of Stiffness in Virtual Environments,” Proc. Sixth Ann. Symp. Haptic Interfaces for Virtual Environments and Teleoperator Systems, ASME/IMECE, vol. DSC-61, pp. 17-22, 1997.
[75] P. Baudisch and G. Chu, “Back-of-Device Interaction Allows Creating Very Small Touch Devices,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 1923-1932, 2009.
[76] N. Bardill and S. Hutchinson, “Animal-Assisted Therapy with Hospitalized Adolescents,” J. Child and Adolescent Psychiatry Nursing, vol. 10, no. 1, pp. 17-24, 1997.
[77] K.E. MacLean, “Designing with Haptic Feedback,” Proc. IEEE Int'l Conf. Robotics and Automation (ICRA '00), vol. 1, pp. 783-788, IEEE, 2000.
[78] N. Stedman, “The Blanket Project,” http://plaza.bunka.go.jp/english/festival/2005recommend/, 2005.
[79] K. MacLean, “Haptic Interaction Design for Everyday Interfaces,” Reviews of Human Factors and Ergonomics, M. Carswell, ed., pp. 149-194, Human Factors and Ergonomics Soc., 2008.
[80] D. Katz, The World of Touch. Erlbaum, 1925/1989.
[81] H.-Y. Yao and V. Hayward, “An Experiment on Length Perception with a Virtual Rolling Stone,” Proc. Conf. Eurohaptics, pp. 325-330, 2006.
[82] A. El Saddik, M. Orozco, Y. Asfaw, S. Shirmohammadi, and A. Adler, “A Novel Biometric System for Identification and Verification of Haptic Users,” IEEE Trans. Instrumentation and Measurement, vol. 56, no. 3, pp. 895-906, June 2007.
[83] P.D. Adamczyk and B.P. Bailey, “If Not Now, When? The Effects of Interruption at Different Moments within Task Execution,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI '04), pp. 271-278, 2004.
[84] N. Baron, “Adjusting the Volume: Technology and Multitasking in Discourse Control,” Handbook of Mobile Communication Studies, J.E. Katz, ed., pp. 177-193, MIT Press, 2008.
[85] C. Swindells, K.E. MacLean, and K.S. Booth, “Designing for Feel: Contrasts between Human and Automated Parametric Capture of Knob Physics,” IEEE Trans. Haptics, preprint, 21 May 2009, doi:10.1109/TOH.2009.23.
[86] N. Forrest, S. Baillie, and H. Tan, “Haptic Stiffness Identification by Veterinarians and Novices: A Comparison,” Proc. Third Worldhaptics Conf. (WHC '09), pp. 1-6, 2009.
[87] C. Ware, Information Visualization: Perception for Design. Morgan Kaufmann, 2004.
[88] C. Johnson, R. Moorhead, T. Munzner, H. Pfister, P. Rheingans, and T.S. Yoo, “NIH/NSF Visualization Research Challenges Report,” IEEE CS Press, 2006.
[89] J.J. Thomas and K.A. Cook, Illuminating the Path: The Research and Development Agenda for Visual Analytics. Nat'l Visualization and Analytics Center, 2005.
[90] C.D. Wickens, “Multiple Resources and Performance Prediction,” Theoretical Issues in Ergonomics Science, vol. 3, no. 2, pp. 159-177, 2002.
[91] T.D. Wickens, Elementary Signal Detection Theory. Oxford Univ. Press, 2002.
[92] H. Tan, N. Durlach, C. Reed, and W. Rabinowitz, “Information Transmission with a Multifinger Tactual Display,” Perception and Psychophysics, vol. 61, no. 6, pp. 993-1008, 1999.
[93] R. Rensink, “Visual Sensing without Seeing,” Psychological Science, vol. 15, no. 1, pp. 27-32, 2004.
[94] A. Chan, K.E. MacLean, and J. McGrenere, “Learning and Identifying Haptic Icons under Workload,” Proc. First WorldHaptics Conf. (WHC '05), pp. 432-439, 2005.
[95] A. Tang, P. McLachlan, K. Lowe, R.S. Chalapati, and K.E. MacLean, “Perceiving Ordinal Data Haptically under Workload,” Proc. Seventh Int'l Conf. Multimodal Interfaces (ICMI '05), pp. 244-251, 2005.
[96] N. Sarter, “Multiple-Resource Theory as a Basis for Multimodal Interface Design: Success Stories, Qualifications, and Research Needs,” Attention: From Theory to Practice, A. Kramer, D. Wiegmann, and A. Kirlik, eds., pp. 186-195, Oxford Univ. Press, 2007.
[97] E. Hoggan and S. Brewster, “Designing Audio and Tactile Crossmodal Icons for Mobile Devices,” Proc. Int'l Conf. Multimodal Interfaces (ICMI '07), pp. 162-169, 2007.
[98] C. Harrison and S. Hudson, “Providing Dynamically Changeable Physical Buttons on a Visual Display,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '09), pp. 299-308, 2009.
[99] J.D. Lee, J. Hoffman, and E. Hayes, “Collision Warning Design to Mitigate Driver Distraction,” Proc. ACM Conf. Human Factors in Computing Systems (CHI '04), pp. 65-72, 2004.
[100] M.H. Jones, A. Arcelus, R. Goubran, and F. Knoefel, “A Pressure Sensitive Home Environment,” Proc. IEEE Int'l Workshop Haptic Audio Visual Environments and Their Applications (HAVE '06), 2006.