This special section is devoted to selected extended versions of papers that were presented at the World Haptics conference in Salt Lake City, Utah, in March 2009. This biennial conference is a joint venture of the IEEE Haptics Symposium and the Eurohaptics conference, bringing together haptics researchers from around the world. The World Haptics conference is a single-track conference and thus provides a valuable opportunity for scientists and engineers to explore the breadth of research activities in this highly interdisciplinary field.
An open call for extended manuscripts based on conference papers was issued at World Haptics. The papers in this special section were selected from the submissions made in response to the call and cover the broad range of topics addressed at the conference. We would like to express our thanks to the reviewers who examined the papers and the associate editors who handled the reviewing process. Our job was greatly facilitated by their support as well as by the enthusiasm and dedication of the editor-in-chief, Ed Colgate, to whom we express our gratitude. Acknowledgements are also due to the Associate Editors-in-Chief Susan Lederman and Domenico Prattichizzo.
In the opening article by Takayuki Hoshi, Masafumi Takahashi, Takayuki Iwamoto, and Hiroyuki Shinoda, a noncontact tactile display is described that is based on an array of ultrasound transducers. By controlling the phase delays of the individual transducers, the acoustic radiation pressure can be focused at a specific point in the air. The prototype device consists of 324 transducers and achieves a total force of 16 mN at a focal point 20 mm in diameter. With a bandwidth of 1 kHz, the device can produce a variety of tactile sensations on the hand of a user. The proposed device is well suited for providing tactile feedback on floating images; accordingly, a trial of a multimodal system combining the tactile display, a floating image display, and a hand tracking device is presented.
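The focusing principle mentioned above is the standard phased-array computation: each transducer is fired with a delay that compensates for its distance to the desired focal point, so all wavefronts arrive there in phase. The following minimal Python sketch illustrates this idea only; the function and variable names are our own and do not reflect the authors' implementation.

```python
import math

SPEED_OF_SOUND = 340.0  # m/s in air (approximate, at room temperature)

def focusing_delays(transducer_positions, focal_point):
    """Return per-transducer trigger delays (in seconds) so that the
    waves emitted by all transducers arrive at the focal point
    simultaneously, producing a pressure maximum there."""
    # Propagation time from each transducer to the focal point.
    times = [
        math.dist(p, focal_point) / SPEED_OF_SOUND
        for p in transducer_positions
    ]
    # Fire the farthest transducer first; every other element waits
    # out the difference in travel time.
    t_max = max(times)
    return [t_max - t for t in times]

# Example: a 3x3 patch of a planar array with 10 mm pitch,
# focusing at a point 0.2 m above the array center.
positions = [(0.01 * i, 0.01 * j, 0.0)
             for i in range(-1, 2) for j in range(-1, 2)]
delays = focusing_delays(positions, (0.0, 0.0, 0.2))
```

Because the center element is closest to the focal point, it receives the largest delay, while the farthest (corner) elements fire immediately.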
The second paper by Karlin Bark, Jason Wheeler, Pete Shull, Joan Savall, and Mark Cutkosky focuses on the physical and psychophysical characterization of a wearable skin stretch device employing two rotating end-effectors. Users were able to discriminate rotational displacements with high accuracy in a controlled experimental environment. Performance in a more realistic setting was slightly reduced, but could be improved with training. The results indicate that rotational feedback could be an appropriate means for providing tactile stimuli to convey abstract information, for example in rehabilitation settings.
The work of Brian T. Gleeson, Scott K. Horschel, and William R. Provancher follows a similar line of research. The authors performed experiments with a tactile display that provided tangential skin displacement. They showed that participants could identify the direction of skin displacement with high accuracy: at a displacement of 0.2 mm, 95 percent of responses were correct. The first experiment also indicated that a priming effect of direction repetition might have increased identification accuracy; however, this hypothesis could not be confirmed in a follow-up study. Nevertheless, the results indicate that such small stimuli could be useful in hand-held miniature display devices, such as mobile phones.
The paper by Erik C. Chubb, J. Edward Colgate, and Michael A. Peshkin describes a new haptic surface called ShiverPAD, which is essentially a variable friction device based on the squeeze film effect, oscillating (or "shivering") in-plane. By modulating the squeeze film effect in synchrony with the direction of the horizontal vibration, the device can generate shear forces on a bare finger regardless of the direction of finger motion. Finger position is measured using an optical sensor, which enables the device to render arbitrary force fields. Two interesting applications of the device—a toggle switch and edge-like contours that human subjects can easily follow—are presented in the paper. The concept of ShiverPAD could be used for haptic interactions with touchscreen-type interfaces in the future.
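The shear-force mechanism described above is a form of friction rectification: friction is high while the plate moves in one direction and low (via the squeeze film) while it moves back, so the friction force does not average to zero over a cycle. The toy Python sketch below illustrates only this averaging argument; the parameters (`mu_on`, `mu_off`, `normal_force`) are our own illustrative names, not quantities from the paper.

```python
import math

def net_shear_force(mu_on, mu_off, normal_force, steps=1000):
    """Average shear force on the finger over one in-plane oscillation
    cycle when friction is switched between a high value (mu_on) and a
    low value (mu_off) in sync with the plate's direction of motion."""
    total = 0.0
    for k in range(steps):
        phase = 2.0 * math.pi * k / steps
        velocity = math.cos(phase)  # plate velocity, arbitrary units
        # High friction during the positive half-cycle, low during the
        # negative one; the friction force acts along the plate motion.
        mu = mu_on if velocity > 0 else mu_off
        total += mu * normal_force * math.copysign(1.0, velocity)
    return total / steps

# Asymmetric friction yields a nonzero average (directed) shear force.
force = net_shear_force(mu_on=1.0, mu_off=0.2, normal_force=1.0)
```

With equal friction in both half-cycles the average force vanishes; the asymmetry introduced by the squeeze film is what produces the net shear.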
In the final article by Thomas K. Ferris and Nadine Sarter, the effect of processing code interference is examined in two experiments using a driving simulator. In these studies, participants decoded vibrotactile icons that were either spatial or nonspatial patterns of tactile stimulation while they interpreted spatial and nonspatial visual stimuli associated with the driving task. Performance in interpreting tactile icons was significantly affected by the concurrent performance of the visual task, independent of whether the task was a spatial or nonspatial one. However, the performance decrement was greater when the visual and tactile tasks drew on the same processing code, that is, when both were spatial or both were nonspatial. These findings suggest that competition for processing code resources needs to be considered in the development of multisensory displays.
L. Jones is with the Department of Mechanical Engineering, MIT, Room 3-137, 77 Massachusetts Avenue, Cambridge, MA 02139. E-mail: email@example.com.
M. Harders is with the Virtual Reality in Medicine Group, Computer Vision Laboratory, Sternwartenstrasse 7, Room ETF C107, ETH Zurich, CH-8092 Zurich, Switzerland. E-mail: firstname.lastname@example.org
Y. Yokokohji is with the Department of Mechanical Engineering, Graduate School of Engineering, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501, Japan. E-mail: email@example.com.
For information on obtaining reprints of this article, please send e-mail to: firstname.lastname@example.org.
L. Jones received the PhD degree from McGill University and, after completing a postdoctoral fellowship at the Montreal Neurological Institute, joined the faculty at McGill University, where she taught for eight years. In 1994, she moved to the Massachusetts Institute of Technology, where she is now a senior research scientist in the Department of Mechanical Engineering. Her research is focused on the development of wearable wireless-controlled vibrotactile displays that can be used for navigation and communication in real and simulated environments. An additional research area is the development of thermal displays that can be used to facilitate object identification in virtual environments. She is a member of the Society for Neuroscience and the American Association for the Advancement of Science and a senior member of the IEEE.
M. Harders studied computer science with a focus on medical informatics at the University of Hildesheim, Germany, the Technical University of Braunschweig, Germany, and the University of Houston, Texas. He finished his doctoral thesis and his habilitation at ETH Zurich, Switzerland, in 2002 and 2007, respectively. Currently, he is a lecturer and senior researcher in the Computer Vision Lab at ETH Zurich as well as leader of the Virtual Reality in Medicine Group. His research focuses on haptic interaction, surgical simulation, and human-computer interfaces in medicine. He is a cofounder of the EuroHaptics conference and Society, the IEEE RAS/CS Technical Committee on Haptics, the IEEE Transactions on Haptics, and the spin-off company VirtaMed. He is a member of the IEEE.
Y. Yokokohji received the BS and MS degrees in precision engineering in 1984 and 1986, respectively, and the PhD degree in mechanical engineering in 1991, all from Kyoto University, Kyoto, Japan. From 1988 to 1992, he was a research associate at Kyoto University. From 1992 to 2005, he was an associate professor in the Department of Mechanical Engineering, Kyoto University, and from 2005 to 2009, an associate professor in the Department of Mechanical Engineering and Science, Kyoto University. From 1994 to 1996, he was a visiting research scholar at the Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. In 2009, he moved to Kobe University, where he is currently a professor in the Department of Mechanical Engineering, Graduate School of Engineering. His current research interests are robotics and virtual reality, including teleoperation systems and haptic interfaces. Dr. Yokokohji is a member of the Institute of Systems, Control, and Information Engineers (Japan), the Robotics Society of Japan, the Society of Instrument and Control Engineers (Japan), the Japan Society of Mechanical Engineers, the Society of Biomechanisms of Japan, the Virtual Reality Society of Japan, and the IEEE.