Issue No. 04, April 2013 (vol. 19)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TVCG.2013.36
K. Ponto , Dept. of Comput. Sci., Univ. of Wisconsin, Madison, WI, USA
M. Gleicher , Dept. of Comput. Sci., Univ. of Wisconsin, Madison, WI, USA
R. G. Radwin , Dept. of Biomed. Eng., Univ. of Wisconsin, Madison, WI, USA
Hyun Joon Shin , Div. of Digital Media, Ajou Univ., Suwon, South Korea
The perception of objects, depth, and distance has repeatedly been shown to differ between virtual and physical environments. We hypothesize that many of these discrepancies stem from incorrect geometric viewing parameters; specifically, physical measurements of eye position are insufficiently precise to provide proper viewing parameters. In this paper, we introduce a perceptual calibration procedure derived from geometric models. While most research has used geometric models to predict perceptual errors, we instead use these models inversely to determine perceptually correct viewing parameters. In an experiment with 20 subjects, we compare these psychophysically determined viewing parameters to the commonly used measured viewing parameters. The perceptually calibrated parameters generally produced virtual eye positions that were wider apart and deeper than standard practice would estimate. Our study shows that perceptually calibrated viewing parameters can significantly improve depth acuity, distance estimation, and the perception of shape.
Calibration, Solid modeling, Estimation, Shape, Virtual environments, Cameras
K. Ponto, M. Gleicher, R. G. Radwin and Hyun Joon Shin, "Perceptual Calibration for Immersive Display Environments," in IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 4, pp. 691-700, 2013.