2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Remote and head-motion-free gaze tracking for real environments with automated head-eye model calibrations
Anchorage, AK, USA
June 23–28
ISBN: 978-1-4244-2339-2
Hirotake Yamazoe, ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
Akira Utsumi, ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
Tomoko Yonezawa, ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
Shinji Abe, ATR Intelligent Robotics and Communication Laboratories, 2-2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto 619-0288, Japan
We propose a gaze estimation method that substantially relaxes the practical constraints imposed by most conventional methods. Gaze estimation research has a long history, and many systems, including commercial ones, have been proposed. However, the application domain of gaze estimation is still limited (e.g., measurement devices for HCI studies, input devices for VDT work) because of the limitations of such systems. First, users must be close to the system (or must wear it), since most systems employ IR illumination and/or stereo cameras. Second, users are required to perform manual calibration to obtain geometrically meaningful data. These limitations have prevented applications that capture and utilize useful human gaze information in daily situations. In our method, inspired by the bundle adjustment framework, the parameters of a 3D head-eye model are robustly estimated by minimizing pixel-wise re-projection errors between single-camera input images and eye-model projections over multiple frames with simultaneously estimated head poses. Since this process runs automatically, users do not need to be aware of it. Using the estimated parameters, 3D head poses and gaze directions for newly observed images can be determined directly in the same error-minimization manner. This mechanism enables robust gaze estimation from low-resolution single-camera images without user-aware preparation tasks (i.e., calibration). Experimental results show that the proposed method achieves 6° accuracy with QVGA (320 × 240) images. The proposed algorithm is independent of observation distance; we confirmed that our system works at long observation distances (10 meters).
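The calibration the abstract describes, a bundle-adjustment-style joint fit of a shared 3D head-eye model and per-frame unknowns against pixel re-projection errors, can be illustrated as a small nonlinear least-squares problem. The sketch below is ours, not the authors' implementation: the pinhole intrinsics, the eyeball-plus-iris parameterization, the `eye_points` helper, and the synthetic data are all assumptions, and it fits point correspondences (iris centre and limbus points) rather than the pixel-wise image errors described in the paper.

```python
# Hypothetical sketch of bundle-adjustment-style head-eye model calibration.
# Shared model parameters: eyeball-centre offset in head coordinates and
# eyeball radius. Per-frame unknowns: gaze (yaw, pitch). Head poses are
# assumed to come from a separate head tracker.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

FX = FY = 300.0          # assumed focal lengths (pixels) -- not from the paper
CX, CY = 160.0, 120.0    # principal point of a QVGA (320 x 240) image
IRIS_R = 0.006           # assumed iris radius (m), held fixed for identifiability

def project(p):
    """Pinhole projection of camera-frame 3-D points to pixel coordinates."""
    return np.stack([FX * p[..., 0] / p[..., 2] + CX,
                     FY * p[..., 1] / p[..., 2] + CY], axis=-1)

def gaze_dir(yaw, pitch):
    """Unit gaze ray in head coordinates (head faces roughly along -z)."""
    return np.array([np.sin(yaw) * np.cos(pitch),
                     np.sin(pitch),
                     -np.cos(yaw) * np.cos(pitch)])

def eye_points(offset, radius, yaw, pitch, k=8):
    """Iris centre plus k limbus points, in head coordinates."""
    g = gaze_dir(yaw, pitch)
    c = offset + radius * g                     # iris centre on the eyeball
    u = np.cross(g, [0.0, 1.0, 0.0]); u /= np.linalg.norm(u)
    v = np.cross(g, u)                          # u, v span the iris plane
    th = np.linspace(0.0, 2.0 * np.pi, k, endpoint=False)
    limbus = c + IRIS_R * (np.outer(np.cos(th), u) + np.outer(np.sin(th), v))
    return np.vstack([c, limbus])               # shape (k + 1, 3)

def residuals(params, head_Rs, head_ts, observed):
    """Pixel re-projection errors stacked over all calibration frames."""
    n = len(head_Rs)
    offset, radius = params[:3], params[3]
    gaze = params[4:].reshape(n, 2)
    res = []
    for i in range(n):
        pts = eye_points(offset, radius, *gaze[i])
        cam = pts @ head_Rs[i].T + head_ts[i]   # head -> camera coordinates
        res.append(project(cam) - observed[i])
    return np.concatenate(res).ravel()

# --- toy usage with synthetic observations --------------------------------
rng = np.random.default_rng(0)
n_frames = 20
true_offset = np.array([0.03, -0.04, 0.06])     # metres, head frame
true_radius = 0.012
head_Rs = [Rotation.from_euler("xyz", rng.uniform(-0.2, 0.2, 3)).as_matrix()
           for _ in range(n_frames)]
head_ts = [np.array([0.0, 0.0, 1.0]) + 0.05 * rng.standard_normal(3)
           for _ in range(n_frames)]
true_gaze = rng.uniform(-0.3, 0.3, (n_frames, 2))
observed = np.array([
    project(eye_points(true_offset, true_radius, *true_gaze[i])
            @ head_Rs[i].T + head_ts[i])
    for i in range(n_frames)
])
observed += 0.3 * rng.standard_normal(observed.shape)   # pixel noise

x0 = np.concatenate([np.zeros(3), [0.010], np.zeros(2 * n_frames)])
fit = least_squares(residuals, x0, args=(head_Rs, head_ts, observed))
print("true offset:", true_offset, " recovered:", fit.x[:3])
print("true radius:", true_radius, " recovered:", fit.x[3])
```

At run time the same idea would apply in reverse: with the fitted model parameters frozen, each newly observed image is explained by solving the analogous least-squares problem over head pose and gaze alone, matching the abstract's "same error-minimization manner".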
Citation:
Hirotake Yamazoe, Akira Utsumi, Tomoko Yonezawa, and Shinji Abe, "Remote and head-motion-free gaze tracking for real environments with automated head-eye model calibrations," in 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1–6, 2008.