A Wide-View Parallax-Free Eye-Mark Recorder with a Hyperboloidal Half-Silvered Mirror and Appearance-Based Gaze Estimation
July 2011 (vol. 17 no. 7)
pp. 900-912
Hiroki Mori, Osaka University, Toyonaka
Erika Sumiya, Osaka University, Toyonaka
Tomohiro Mashita, Osaka University, Toyonaka
Kiyoshi Kiyokawa, Osaka University, Toyonaka
Haruo Takemura, Osaka University, Toyonaka
In this paper, we propose a wide-view parallax-free eye-mark recorder with a hyperboloidal half-silvered mirror and a gaze estimation method suited to the device. Our eye-mark recorder provides a wide field-of-view video recording of the user's exact view by positioning the focal point of the mirror at the user's viewpoint. The prototype's vertical angle of view is 122 degrees (elevation and depression angles of 38 and 84 degrees, respectively), and its horizontal angle of view is 116 degrees (nasal and temporal view angles of 38 and 78 degrees, respectively). We implemented and evaluated a gaze estimation method for our eye-mark recorder, adopting an appearance-based approach to support its wide field of view. We apply principal component analysis (PCA) and multiple regression analysis (MRA) to learn the relationship between captured eye images and their corresponding gaze points. Experimental results verify that our eye-mark recorder successfully captures a wide field of view and estimates gaze direction with an angular accuracy of around 2 to 4 degrees.
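The PCA-plus-MRA pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the synthetic "eye images" and the helper `estimate_gaze` are stand-ins, and in the actual system the feature vectors would come from the recorder's captured eye-region images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: each row is a flattened eye-region image,
# each target is a 2D gaze point. Real data would come from a calibration
# session with the eye-mark recorder.
n_samples, n_pixels, n_components = 60, 100, 8
W = rng.normal(size=(2, n_pixels))            # latent gaze-to-appearance map
gaze_train = rng.uniform(-1, 1, size=(n_samples, 2))
images = gaze_train @ W + 0.01 * rng.normal(size=(n_samples, n_pixels))

# --- PCA: project mean-centered images onto the top principal components ---
mean_image = images.mean(axis=0)
centered = images - mean_image
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
components = Vt[:n_components]                # (n_components, n_pixels)
scores = centered @ components.T              # low-dimensional appearance features

# --- MRA: multiple linear regression from PCA scores to gaze points ---
X = np.hstack([scores, np.ones((n_samples, 1))])   # append intercept column
coef, *_ = np.linalg.lstsq(X, gaze_train, rcond=None)

def estimate_gaze(image):
    """Map one flattened eye image to an estimated 2D gaze point."""
    s = (image - mean_image) @ components.T
    return np.append(s, 1.0) @ coef

err = np.linalg.norm(estimate_gaze(images[0]) - gaze_train[0])
```

Because the appearance features are regressed directly to gaze coordinates, no explicit 3D eye model is needed, which is what makes the appearance-based approach compatible with the recorder's wide field of view.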


Index Terms:
Eye-mark recorder, parallax free, wide field-of-view, gaze estimation, hyperboloidal half-silvered mirror, head-mounted camera.
Hiroki Mori, Erika Sumiya, Tomohiro Mashita, Kiyoshi Kiyokawa, Haruo Takemura, "A Wide-View Parallax-Free Eye-Mark Recorder with a Hyperboloidal Half-Silvered Mirror and Appearance-Based Gaze Estimation," IEEE Transactions on Visualization and Computer Graphics, vol. 17, no. 7, pp. 900-912, July 2011, doi:10.1109/TVCG.2010.113