Issue No.04 - October-December (2010 vol.3)

pp: 234-244

Massimiliano Di Luca , Max Planck Institute for Biological Cybernetics , Tübingen

Martin Buss , Technische Universität München, München

Roberta L. Klatzky , Carnegie Mellon University, Pittsburgh

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TOH.2010.9

ABSTRACT

The compliance of a material can be conveyed through mechanical interactions in a virtual environment and perceived through both visual and haptic cues. We investigated this basic aspect of perception. In two experiments, subjects performed compliance discriminations, and the mean perceptual estimate (PSE) and the perceptual standard deviation (proportional to the JND) were derived from psychophysical functions. Experiment 1 supported a model in which each modality acts independently to produce a compliance estimate, and the two estimates are then integrated to produce an overall value. Experiment 2 tested three mathematical models of the integration process. The data ruled out exclusive reliance on the more reliable modality and stochastic selection of one modality. Instead, the results supported an integration process that constitutes a weighted summation of two random variables, defined by the single-modality estimates. The model subsumes optimal fusion but also provides valid predictions when the weights are not optimal. Weights were optimal (i.e., minimized variance) when visual and haptic inputs were congruent, but not when they were incongruent.
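The weighted-summation model described above can be illustrated with a minimal sketch. Assuming each modality yields an independent Gaussian estimate, the variance-minimizing (optimal-fusion) weights are inversely proportional to the single-modality variances; the function names and example values below are purely illustrative, not from the paper:

```python
def fuse_estimates(est_v, var_v, est_h, var_h):
    """Weighted summation of a visual and a haptic compliance estimate.

    With these particular weights (each cue weighted by the relative
    reliability of the other cue's variance), the sum is the
    variance-minimizing, i.e. optimal, fusion of the two estimates.
    Any other weights that sum to 1 still fit the weighted-summation
    model, just not optimally.
    """
    w_v = var_h / (var_v + var_h)   # weight on the visual estimate
    w_h = var_v / (var_v + var_h)   # weight on the haptic estimate
    fused = w_v * est_v + w_h * est_h
    # Variance of the optimally fused estimate: (1/var_v + 1/var_h)^-1
    fused_var = (var_v * var_h) / (var_v + var_h)
    return fused, fused_var


# Hypothetical example: a less reliable visual cue and a more reliable
# haptic cue. The fused estimate is pulled toward the haptic value, and
# the fused variance is smaller than either single-modality variance.
fused, fused_var = fuse_estimates(est_v=1.0, var_v=0.04, est_h=1.2, var_h=0.01)
```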

INDEX TERMS

Perception, haptic, vision, integration, optimal fusion, combination, human system interface, virtual reality.

CITATION

Massimiliano Di Luca, Martin Buss, Roberta L. Klatzky, "Combination and Integration in the Perception of Visual-Haptic Compliance Information", *IEEE Transactions on Haptics*, vol. 3, no. 4, pp. 234-244, October-December 2010, doi:10.1109/TOH.2010.9