JULY-SEPTEMBER 2006 (Vol. 13, No. 3) pp. 22-23
1070-986X/06/$31.00 © 2006 IEEE
Published by the IEEE Computer Society
Guest Editors' Introduction: Haptic User Interfaces for Multimedia Systems
A fundamental goal of human-computer interfaces (HCI) is to facilitate veridical interaction between humans and computers. Touch-based interfaces are known as haptic interfaces, where haptic denotes the sense of touch. Haptic devices such as haptic gloves, joysticks, and tactile arrays generate a range of force and tactile feedback, and haptic cueing, messaging, and spatial-representation systems can be integrated with existing multimodal applications. Applications of haptic user interfaces include telerobotics, surgical simulation and medical training, assistive and rehabilitative devices, museum displays, and augmented-reality interfaces for gaming.
This special issue provides insight into the design, development, and usability testing of haptic user interfaces. We received many submissions, and choosing a select few was difficult. The final, published articles span a variety of application domains and research foci.
In this issue
Spatial perception is a basic human sensory ability. Our sensory systems facilitate complex interactions with objects both within and beyond the reach of physical contact, and the haptic, visual, auditory, and olfactory senses all contribute to spatial perception. The significance of touch in spatial perception has long been debated: some researchers have argued that the haptic modality plays a supporting role at best, reinforcing perception from the visual modality and, in limited cases, providing augmentative cues about the physical world.
The first article in this special issue, by Robles-De-La-Torre, justifies the development of haptic user interfaces with examples of the importance of haptics in everyday activities. The author emphasizes that sensory environments lacking haptic interaction cannot provide realistic feedback and therefore cannot support holistic spatial perception.
The second article presents a haptic system for telerehabilitation of patients suffering from upper limb dysfunction. Neurologists, rehabilitation specialists, and psychologists have developed numerous schemes, based on haptic stimulation and touch therapy, for individuals with sensory, neural, and physical impairments. With the advent of computational haptics and haptic interfaces, it's now possible to build systems that deliver haptic stimulation targeting specific symptoms. The system proposed by Jadhav et al. enables personalized rehabilitation for patients by incorporating telehaptics into the exercise/movement protocol.
The next three articles pertain to surgical simulation and learning systems. Surgeons use multimodal sensory-motor cues to perform complex procedures in both open surgery and minimally invasive surgery. The use of multimodal systems with haptics provides surgeons with effective tools to train, perform, and evaluate surgical procedures.
Tsagarakis et al. present a multimodal system for preoperative planning of total hip arthroplasty. Hip arthroplasty, a multistage procedure that relies heavily on haptic perception, involves replacing a diseased hip joint with artificial components. The article outlines a force-feedback system that aids in planning the surgery.
Nakao et al. focus on developing multimodal systems that promote learning in surgical environments. The touch and feel of surgical tools, organs, and bioelasticity are central to teaching surgical procedures. The authors propose a learning system for bioelasticity in which experts set the elastic parameters of a virtual aorta palpation system that novice surgeons then experience and learn from. The study focuses on the haptic components of multimodal systems and on imparting haptic skills to learners.
Wongwirat et al. present a system for telesurgery and telehaptics that targets synchronization control. Telesurgery systems let surgeons operate on patients at remote locations through a robotic system controlled over a network. While telesurgery offers significant advantages, network delay, jitter, and synchronization pose serious operational constraints: inefficient network handling can compromise patient safety and is therefore unacceptable. The article addresses a number of issues pertaining to the safety, efficiency, and realism of distributed haptic systems.
The article by Raisamo et al. presents a methodology to evaluate assistive multimodal systems for individuals who are blind. Usability testing of assistive human-computer interfaces poses a special challenge for designers and testers. The authors present a comprehensive methodology consisting of questionnaires, interviews, and a variety of observation methods focusing on iterative evaluations of assistive systems. This article also examines the role of objective and subjective evaluations of haptic assistive systems.
One last touch
As a final remark, we would like to note that haptics is indeed a varied and dynamic field, one with much promise not only for the research community but also for the people whose lives haptics could physically improve. We look forward to seeing further developments in the near future.
We would like to thank the authors for their contributions and the reviewers for their valuable feedback. We are grateful to former Editor in Chief Forouzan Golshani for his support. We also thank the editorial staff at IEEE MultiMedia for their efficient work with the authors.
Sethuraman Panchanathan is the director of the School of Computing and Informatics at Arizona State University (ASU). He also serves as the chair of the Departments of Computer Science and Engineering and Biomedical Informatics, and is the director of the Center for Cognitive Ubiquitous Computing. His research interests include multimedia processing and multimedia information systems, assistive devices, human movement analysis, and haptic user interfaces. He is a fellow of the IEEE and SPIE. He is currently the Editor in Chief of IEEE MultiMedia.
Kanav Kahol is an assistant research professor at ASU. He is also a research scientist at the SimET Center at the Banner Good Samaritan Hospital in Phoenix, Arizona. He is the co-director of the Haptics Group at the Center for Cognitive Ubiquitous Computing at ASU. His primary research interests include haptic user interfaces, human movement analysis, human-computer interfaces, multimedia information systems, and virtual reality and rehabilitation.