Sixth International Conference on Information Visualisation (IV 2002)
July 10–12, 2002
R. Dan Jacobson, Florida State University
The research discussed here is part of a larger study exploring the accessibility and usability of spatial data presented through multiple sensory modalities, including haptic, auditory, and visual interfaces. Geographical Information Systems (GIS) and other computer-based tools for spatial display rely predominantly on vision to communicate information to the user, as sight is the spatial sense par excellence. Ongoing research is exploring the fundamental concepts and techniques needed to navigate multimodal interfaces, which are user-, task-, domain-, and interface-specific. This highlights the need both for a conceptual/theoretical schema and for extensive usability studies. Preliminary results presented here, exploring feature recognition and shape tracing in non-visual environments, indicate that multimodal interfaces have great potential for facilitating access to spatial data for blind and visually impaired persons. The research is undertaken with the wider goals of increasing information accessibility and promoting "universal access".
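As an aside, the non-visual access to spatial data described above is often realised through sonification, i.e. encoding a spatial value as an auditory cue. The abstract does not specify the paper's actual mapping; the sketch below is a hypothetical illustration of one common approach, mapping the data value under a cursor to tone pitch as a user traces a grid non-visually. All names (`value_to_pitch`, `probe`, the sample grid) are invented for this example.

```python
# Illustrative sketch only: the paper does not describe its mapping.
# This shows one common sonification idea -- spatial value -> audio pitch.

def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Linearly map a spatial data value to a tone frequency in Hz."""
    if vmax == vmin:
        return f_low  # degenerate range: emit the base tone
    t = (value - vmin) / (vmax - vmin)
    return f_low + t * (f_high - f_low)

# Hypothetical 3x3 elevation grid a user might trace with a haptic cursor.
grid = [
    [0.0, 0.5, 1.0],
    [0.2, 0.7, 0.9],
    [0.1, 0.3, 0.4],
]

def probe(row, col):
    """Pitch heard when the cursor rests on cell (row, col)."""
    flat = [v for r in grid for v in r]
    return value_to_pitch(grid[row][col], min(flat), max(flat))

print(probe(0, 0))  # lowest value  -> 220.0 Hz
print(probe(0, 2))  # highest value -> 880.0 Hz
```

In a real interface the returned frequency would drive a synthesiser as the cursor moves, so rising terrain is heard as rising pitch; a linear map is the simplest choice, and perceptual scales (e.g. logarithmic pitch) are a common refinement.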
R. Dan Jacobson, "Representing Spatial Information through Multimodal Interfaces," Proc. Sixth International Conference on Information Visualisation (IV 2002), p. 730, 2002, doi:10.1109/IV.2002.1028858.