Appearance-Based Gaze Estimation Using Visual Saliency
Feb. 2013 (vol. 35 no. 2)
pp. 329-341
Y. Sugano, Sato Lab., Univ. of Tokyo, Tokyo, Japan
Y. Matsushita, Microsoft Res. Asia, Beijing, China
Y. Sato, Sato Lab., Univ. of Tokyo, Tokyo, Japan
We propose a gaze sensing method using visual saliency maps that does not require explicit personal calibration. Our goal is to create a gaze estimator using only the eye images captured from a person watching a video clip. Our method treats the saliency maps of the video frames as probability distributions of the gaze points. We aggregate the saliency maps based on the similarity of the eye images to efficiently identify the gaze points from the saliency maps. We establish a mapping from the eye images to the gaze points by using Gaussian process regression. In addition, we use a feedback loop from the gaze estimator to refine the gaze probability maps and improve the accuracy of the gaze estimation. The experimental results show that the proposed method works well with different people and video clips and achieves a 3.5-degree accuracy, which is sufficient for estimating a user's attention on a display.
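As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below treats each frame's saliency map as a gaze-probability distribution, aggregates maps whose eye images are similar, and fits a Gaussian process regressor from eye-image features to 2D gaze points. The feature extraction, the k-means clustering stand-in for the paper's aggregation step, and all array shapes are assumptions made for the example.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def expected_gaze_point(saliency_map):
    """Expected gaze position under a saliency map treated as a
    probability distribution over pixel locations."""
    p = saliency_map / saliency_map.sum()
    ys, xs = np.mgrid[0:saliency_map.shape[0], 0:saliency_map.shape[1]]
    return np.array([(p * xs).sum(), (p * ys).sum()])

def aggregate_by_eye_similarity(eye_features, saliency_maps, n_clusters=50):
    """Average the saliency maps of frames with similar eye images, so the
    aggregated map concentrates around the shared gaze point (a simple
    clustering stand-in for the aggregation described in the paper)."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(eye_features)
    X, y = [], []
    for c in range(n_clusters):
        idx = labels == c
        mean_map = saliency_maps[idx].mean(axis=0)
        X.append(eye_features[idx].mean(axis=0))
        y.append(expected_gaze_point(mean_map))
    return np.array(X), np.array(y)

def train_gaze_estimator(eye_features, saliency_maps):
    """eye_features: (N, D) vectors from cropped eye images (assumed given).
    saliency_maps: (N, H, W) per-frame saliency maps (assumed given)."""
    X, y = aggregate_by_eye_similarity(eye_features, saliency_maps)
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gpr.fit(X, y)  # maps eye features to (x, y) gaze points on the display
    return gpr     # gpr.predict(new_eye_features) yields gaze estimates

The paper's feedback loop would then re-weight the gaze probability maps using the trained estimator's predictions and retrain; that refinement step is omitted here for brevity.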
Index Terms:
statistical distributions, computer vision, eye, face recognition, feedback, Gaussian processes, gesture recognition, object recognition, regression analysis, user attention estimation, appearance-based gaze estimation, gaze sensing method, visual saliency map, eye image capture, video frames, probability distribution, eye image similarity, gaze point identification, Gaussian process regression, feedback loop, gaze probability map, visualization, estimation, calibration, feature extraction, accuracy, face, humans, face and gesture recognition, gaze estimation, visual attention
Citation:
Y. Sugano, Y. Matsushita, Y. Sato, "Appearance-Based Gaze Estimation Using Visual Saliency," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 2, pp. 329-341, Feb. 2013, doi:10.1109/TPAMI.2012.101