The Community for Technology Leaders
2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (2012)
Zurich, Switzerland
Oct. 13, 2012 to Oct. 15, 2012
ISBN: 978-1-4673-4470-8
pp: 184-191
ABSTRACT
We report on progress in the development of an online system to allow the estimation of eye-tracked 3D gaze depth in a stereoscopic environment. The efficacy of our method is demonstrated in two distinctly different configurations: a custom Wheatstone stereoscope, and a commodity active stereo graphics display. We employ a 3D calibration process that determines the parameters of an affine mapping from a depth estimate, based on triangulation using onscreen horizontal disparity, to a refined depth estimate. Triangulation accounts for most of the non-linearity in the transform, and our calibration accounts for individual differences in eye separation and vergence behavior. We demonstrate that, although different binocular eye trackers are used with each display, our approach to online estimation of gaze depth performs similarly on both systems. Importantly, results show that depth estimation error is biased toward the display screen, underestimating target distance from the screen both behind and in front of the screen.
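The two-stage pipeline the abstract describes — triangulating a raw depth from onscreen horizontal disparity, then refining it with a calibrated affine mapping — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the similar-triangles geometry, and the default eye separation (0.065 m) and viewing distance (0.70 m) are all assumptions for the sake of the example.

```python
import numpy as np

def triangulate_depth(x_left, x_right, eye_sep=0.065, screen_dist=0.70):
    """Raw gaze depth (metres from the eye baseline) by intersecting the
    two gaze rays through the onscreen gaze points.

    With eyes at (-e/2, 0) and (e/2, 0) and the screen at z = screen_dist,
    similar triangles give z = screen_dist * e / (e - d), where
    d = x_right - x_left is the onscreen horizontal disparity.
    Crossed disparity (d < 0) yields z < screen_dist, i.e. in front of
    the screen; d = 0 yields the screen plane itself.
    """
    d = x_right - x_left
    return screen_dist * eye_sep / (eye_sep - d)

def fit_affine(z_est, z_true):
    """Least-squares fit of z_true ~ a * z_est + b, standing in for the
    per-user affine calibration step (which absorbs individual differences
    in eye separation and vergence behavior)."""
    A = np.vstack([z_est, np.ones_like(z_est)]).T
    a, b = np.linalg.lstsq(A, z_true, rcond=None)[0]
    return a, b
```

For example, zero disparity (`triangulate_depth(0.0, 0.0)`) recovers the screen distance, while `fit_affine` applied to raw estimates at calibration targets of known depth yields the two parameters used to refine subsequent online estimates.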
INDEX TERMS
vergence, eye tracking, stereoscopy
CITATION

R. I. Wang, B. Pelfrey, A. T. Duchowski and D. H. House, "Online Gaze Disparity via Binocular Eye Tracking on Stereoscopic Displays," 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (3DIMPVT), Zurich, Switzerland, 2012, pp. 184-191.
doi:10.1109/3DIMPVT.2012.37