World Haptics 2009 - Third Joint EuroHaptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems
Tactile displays for multitask environments: The role of concurrent task processing code
Salt Lake City, UT, USA
March 18–20, 2009
ISBN: 978-1-4244-3858-7
Thomas Ferris, Center for Ergonomics, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, USA
Shameem Hameed, Center for Ergonomics, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, USA
Nadine Sarter, Center for Ergonomics, Department of Industrial and Operations Engineering, University of Michigan, Ann Arbor, USA
The distribution of tasks and stimuli across multiple modalities has been proposed as a means to support multitasking in data-rich environments. In particular, the tactile channel and, more specifically, communication via the use of tactile/haptic icons have received considerable interest in recent years. Past research has examined primarily the impact of concurrent task modality on the effectiveness of tactile information presentation. In contrast, the present study investigates to what extent the interpretation of complex tactile patterns - “tactons” - is affected by another attribute of information: the processing code of concurrent tasks. Participants decoded tactons composed of temporal patterns of vibrations (categorical data) - and concurrently interpreted one of two types of visual task stimuli - requiring either spatial or categorical processing - in a driving simulation. Compared to single-task performance, both dual-task conditions showed a performance decrement. As predicted by Multiple Resource Theory, this decrement was significantly larger when the tacton task was paired with the visual task requiring categorical (as compared to spatial) processing. The findings from this study can serve as input to multidimensional quantitative models of timesharing performance. From an applied perspective, the results suggest that nonspatially-encoded tactons would be preferable in environments which rely heavily on spatial processing, such as car cockpits or flight decks.
Citation:
Thomas Ferris, Shameem Hameed, Nadine Sarter, "Tactile displays for multitask environments: The role of concurrent task processing code," in Proc. World Haptics 2009 - Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 160-165, 2009.