pp. 153-154
This special section is devoted to extended versions of selected papers that were presented at the World Haptics conference in Salt Lake City, Utah, in March 2009. This biennial conference is a joint venture of the IEEE Haptics Symposium and the Eurohaptics conference, bringing together haptics researchers from around the world. World Haptics is a single-track conference and thus provides a valuable opportunity for scientists and engineers to explore the breadth of research activities in this highly interdisciplinary field.
An open call for extended manuscripts based on conference papers was issued at World Haptics. The papers in this special section were selected from the submissions made in response to that call and cover the broad range of topics addressed at the conference. We would like to express our thanks to the reviewers who examined the papers and to the associate editors who handled the reviewing process. Our job was greatly facilitated by their support, as well as by the enthusiasm and dedication of the editor-in-chief, Ed Colgate, to whom we express our gratitude. Acknowledgements are also due to the Associate Editors-in-Chief, Susan Lederman and Domenico Prattichizzo.
In the opening article by Takayuki Hoshi, Masafumi Takahashi, Takayuki Iwamoto, and Hiroyuki Shinoda, a noncontact tactile display is described that is based on an array of ultrasound transducers. By controlling the phase delays of multiple transducers, the acoustic radiation pressure can be focused on a specific point in the air. The prototype device consists of 324 transducers; a total force of 16 mN was achieved at the focal point, which had a diameter of 20 mm. With a bandwidth of 1 kHz, the device can produce a variety of tactile sensations on a user's hand. The proposed device is well suited for combination with floating-image displays. A trial of a multimodal system involving the tactile display, a floating-image display, and a hand-tracking device is presented.
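The focusing principle is the standard phased-array calculation: each transducer is driven with a phase offset that compensates for its propagation distance to the focal point, so that all waves arrive there in phase. The following sketch illustrates that calculation only; it is not the authors' implementation, and the carrier frequency, speed of sound, array layout, and the `focusing_phases` helper are illustrative assumptions.

```python
import math

SPEED_OF_SOUND = 340.0  # m/s in air (approximate, assumed)
FREQ = 40e3             # Hz; a common airborne ultrasound frequency (assumed)

def focusing_phases(transducer_positions, focal_point):
    """Phase offset for each transducer so its wave arrives in phase at the focus.

    A wave launched from a transducer accumulates phase 2*pi*d/lambda over the
    distance d to the focal point; driving the transducer with that same phase
    (modulo one period) makes all contributions add constructively at the focus.
    """
    wavelength = SPEED_OF_SOUND / FREQ
    fx, fy, fz = focal_point
    phases = []
    for (x, y, z) in transducer_positions:
        distance = math.sqrt((fx - x) ** 2 + (fy - y) ** 2 + (fz - z) ** 2)
        # Phase compensating the propagation delay over `distance`
        phase = (2 * math.pi * distance / wavelength) % (2 * math.pi)
        phases.append(phase)
    return phases

# Example: a 3x3 patch of transducers spaced 10 mm apart in the z=0 plane,
# focused on a point 200 mm above the array center.
grid = [(i * 0.01, j * 0.01, 0.0) for i in range(-1, 2) for j in range(-1, 2)]
phases = focusing_phases(grid, (0.0, 0.0, 0.2))
```

Steering the focal point then amounts to recomputing these phases for a new target position, which is what allows such an array to move the pressure point across the hand.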
The second paper by Karlin Bark, Jason Wheeler, Pete Shull, Joan Savall, and Mark Cutkosky focuses on the physical and psychophysical characterization of a wearable skin stretch device employing two rotating end-effectors. Users were able to discriminate rotational displacements with high accuracy in a controlled experimental environment. Performance in a more realistic setting was slightly reduced, but could be improved with training. The results indicate that rotational feedback could be an appropriate means for providing tactile stimuli to convey abstract information, for example in rehabilitation settings.
The work of Brian T. Gleeson, Scott K. Horschel, and William R. Provancher follows a similar line of research. The authors performed experiments with a tactile display that provided tangential skin displacement. They showed that participants could identify the direction of skin displacement with high accuracy: at a displacement of 0.2 mm, responses were 95 percent correct. The first experiment also suggested that a priming effect of direction repetition might have increased detection accuracy; however, this hypothesis could not be confirmed in a follow-up study. Nevertheless, the results indicate that such small stimuli could be useful in handheld miniature display devices, such as mobile phones.
The paper by Erik C. Chubb, J. Edward Colgate, and Michael A. Peshkin describes a new haptic surface called ShiverPAD, which is essentially a variable friction device based on the squeeze film effect, oscillating (or “shivering”) in-plane. By modulating the squeeze film effect according to the direction of horizontal vibration, the device can generate shear forces on a bare finger regardless of the direction of finger motion. Finger position is measured using an optical sensor, which enables the device to generate arbitrary force fields. Two interesting applications of the device—a toggle switch and edge-like contours that human subjects can easily follow—are presented in the paper. The concept of ShiverPAD could be used for haptic interactions with touchscreen-type interfaces in the future.
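The force-generation principle can be illustrated with a simple time-averaging argument: kinetic friction drags a stationary finger along the plate's instantaneous velocity, so switching to high friction during the half-cycle in which the plate moves in the desired direction, and to low friction (squeeze film active) during the return half-cycle, leaves a nonzero net force after averaging. The sketch below is only a numerical illustration of this argument, not the device's controller; the friction coefficients and normal force are assumed values chosen for demonstration.

```python
import math

def average_shear_force(mu_high, mu_low, normal_force, n=10000):
    """Average in-plane force on a stationary finger over one vibration cycle.

    The plate oscillates sinusoidally along x. Friction is set high while the
    plate moves in +x and low while it moves in -x, so the +x drag outweighs
    the -x drag; the average works out to (mu_high - mu_low) / 2 * normal_force.
    """
    total = 0.0
    for k in range(n):
        t = k / n  # one normalized vibration period
        plate_velocity = math.sin(2 * math.pi * t)
        # Squeeze film off (high friction) on the forward half-cycle only
        mu = mu_high if plate_velocity > 0 else mu_low
        # Kinetic friction on the finger acts along the plate's velocity
        total += mu * normal_force * math.copysign(1.0, plate_velocity)
    return total / n
```

For example, with an assumed friction contrast of 0.5 versus 0.05 and a 1 N normal force, the average shear force comes out near 0.225 N, even though the instantaneous force reverses direction every half-cycle.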
In the final article by Thomas K. Ferris and Nadine Sarter, the effect of processing code interference is examined in two experiments using a driving simulator. In these studies, participants decoded vibrotactile icons that were either spatial or nonspatial patterns of tactile stimulation while they interpreted spatial and nonspatial visual stimuli associated with the driving task. Performance in interpreting tactile icons was significantly affected by the concurrent performance of the visual task, independent of whether the task was a spatial or nonspatial one. However, the performance decrement associated with concurrent task performance was greater when the visual and tactile tasks drew on the same processing code, that is, when both were spatial or both were nonspatial. These findings suggest that competition for processing code resources needs to be considered in the development of multisensory displays.