Issue No. 3 - March (2010, vol. 32)
pp. 478-500
Dan Witzner Hansen, IT University of Copenhagen, Copenhagen
Qiang Ji, Rensselaer Polytechnic Institute, Troy
ABSTRACT
Despite active research and significant progress over the last 30 years, eye detection and tracking remains challenging due to the individuality of eyes, occlusion, and variability in scale, location, and lighting conditions. Data on eye location and details of eye movements have numerous applications and are essential in face detection, biometric identification, and particular human-computer interaction tasks. This paper reviews current progress and the state of the art in video-based eye detection and tracking in order to identify promising techniques as well as issues to be further addressed. We present a detailed review of recent eye models and techniques for eye detection and tracking. We also survey methods for gaze estimation and compare them based on their geometric properties and reported accuracies. This review shows that, despite its apparent simplicity, developing a general eye detection technique involves addressing many challenges, requires further theoretical development, and is consequently of interest to many other problem domains in computer vision and beyond.
INDEX TERMS
Eye, eye detection, eye tracking, gaze estimation, review paper, gaze tracking, object detection and tracking, human-computer interaction.
CITATION
Dan Witzner Hansen, Qiang Ji, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.32, no. 3, pp. 478-500, March 2010, doi:10.1109/TPAMI.2009.30
REFERENCES
[1] J.S. Agustin, A. Villanueva, and R. Cabeza, “Pupil Brightness Variation as a Function of Gaze Direction,” Proc. 2006 Symp. Eye Tracking Research and Applications, pp. 49-49, 2006.
[2] A. Amir, L. Zimet, A. Sangiovanni-Vincentelli, and S. Kao, “An Embedded System for an Eye-Detection Sensor,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 104-123, 2005.
[3] G. Anders, “Pilot's Attention Allocation during Approach and Landing—Eye- and Head-Tracking Research,” Proc. 11th Int'l Symp. Aviation Psychology, 2001.
[4] J. Bala, K. De Jong, J. Huang, H. Vafaie, and H. Wechsler, “Visual Routine for Eye Detection Using Hybrid Genetic Architectures,” Proc. Int'l Conf. Pattern Recognition, 1996.
[5] L.-P. Bala, K. Talmi, and J. Liu, “Automatic Detection and Tracking of Faces and Facial Features in Video Sequences,” Proc. Picture Coding Symp., Sept. 1997.
[6] S. Baluja and D. Pomerleau, “Non-Intrusive Gaze Tracking Using Artificial Neural Networks,” Advances in Neural Information Processing Systems, J.D. Cowan, G. Tesauro, and J. Alspector, eds., vol. 6, pp. 753-760, Morgan Kaufmann Publishers, 1994.
[7] P. Baudisch, D.D. Andrew, T. Duchowski, and W.S. Geisler, “Focusing on the Essential: Considering Attention in Display Design,” Comm. ACM, vol. 46, no. 3, pp. 60-66, 2003.
[8] D. Beymer and M. Flickner, “Eye Gaze Tracking Using an Active Stereo Head,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, vol. II, pp. 451-458, 2003.
[9] G. Boening, K. Bartl, T. Dera, S. Bardins, E. Schneider, and T. Brandt, “Mobile Eye Tracking as a Basis for Real-Time Control of a Gaze Driven Head-Mounted Video Camera,” Proc. 2006 Symp. Eye Tracking Research and Applications, p. 56, 2006.
[10] R. Bolt, “Gaze-Orchestrated Dynamic Windows,” Proc. ACM SIGGRAPH, pp. 109-119, 1981.
[11] X.L.C. Brolly and J.B. Mulligan, “Implicit Calibration of a Remote Gaze Tracker,” Proc. 2004 Conf. Computer Vision and Pattern Recognition Workshop, vol. 8, p. 134, 2004.
[12] R.H.S. Carpenter, Movements of the Eyes. Pion Limited, 1988.
[13] G. Chow and X. Li, “Towards a System for Automatic Facial Feature Detection,” Pattern Recognition, vol. 26, pp. 1739-1755, 1993.
[14] COGAIN, http://www.cogain.org/, 2007.
[15] C. Colombo and A.D. Bimbo, “Real-Time Head Tracking from the Deformation of Eye Contours Using a Piecewise Affine Camera,” Pattern Recognition Letters, vol. 20, no. 7, pp. 721-730, July 1999.
[16] D. Comaniciu, V. Ramesh, and P. Meer, “Kernel-Based Object Tracking,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 25, no. 5, pp. 564-577, May 2003.
[17] T.F. Cootes, G.J. Edwards, and C.J. Taylor, “Active Appearance Models,” Proc. European Conf. Computer Vision, vol. 2, pp. 484-498, 1998.
[18] T.F. Cootes and C.J. Taylor, “Active Shape Models—‘Smart Snakes,’” Proc. British Machine Vision Conf., pp. 266-275, 1992.
[19] J. Coughlan, A. Yuille, C. English, and D. Snow, “Efficient Deformable Template Detection and Localization without User Initialization,” Computer Vision and Image Understanding, vol. 78, no. 3, pp. 303-319, 2000.
[20] F.L. Coutinho and C.H. Morimoto, “Free Head Motion Eye Gaze Tracking Using a Single Camera and Multiple Light Sources,” Proc. ACM SIGGRAPH, M.M. de Oliveira Neto and R. Lima Carceroni, eds., Oct. 2006.
[21] H. Crane and C. Steele, “Accurate Three-Dimensional Eye Tracker,” J. Optical Soc. Am., vol. 17, no. 5, pp. 691-705, 1978.
[22] D. Cristinacce and T. Cootes, “Feature Detection and Tracking with Constrained Local Models,” Proc. 17th British Machine Vision Conf., pp. 929-938, 2006.
[23] J.L. Crowley and F. Berard, “Multi-Modal Tracking of Faces for Video Communications,” Proc. 1997 IEEE CS Conf. Computer Vision and Pattern Recognition, pp. 640-645, 1997.
[24] J. Daugman, “The Importance of Being Random: Statistical Principles of Iris Recognition,” Pattern Recognition, vol. 36, no. 2, pp. 279-291, 2003.
[25] J. Deng and F. Lai, “Region-Based Template Deformation and Masking for Eye-Feature Extraction and Description,” Pattern Recognition, vol. 30, pp. 403-419, 1997.
[26] T. D'Orazio, M. Leo, G. Cicirelli, and A. Distante, “An Algorithm for Real Time Eye Detection in Face Images,” Proc. 17th Int'l Conf. Pattern Recognition, vol. 3, no. 0, pp. 278-281, 2004.
[27] D. Droege, C. Schmidt, and D. Paulus, “A Comparison of Pupil Centre Estimation Algorithms,” Proc. Conf. COGAIN 2008—Comm., Environment, and Mobility Control by Gaze, H. Istance, O. Stepankova, and R. Bates, eds., pp. 23-26, 2008.
[28] A. Duchowski, Eye Tracking Methodology: Theory and Practice. Springer-Verlag, 2003.
[29] A.T. Duchowski, E. Medlin, N. Cournia, A. Gramopadhye, B. Melloy, and S. Nair, “3D Eye Movement Analysis for VR Visual Inspection Training,” Proc. 2002 Symp. Eye Tracking Research and Applications, pp. 103-110, 2002.
[30] Y. Ebisawa, “Improved Video-Based Eye-Gaze Detection Method,” IEEE Trans. Instrumentation and Measurement, vol. 47, no. 2, pp. 948-955, Aug. 1998.
[31] Y. Ebisawa and S. Satoh, “Effectiveness of Pupil Area Detection Technique Using Two Light Sources and Image Difference Method,” Proc. 15th Ann. Int'l Conf. IEEE Eng. in Medicine and Biology Soc., pp. 1268-1269, 1993.
[32] Y. Ebisawa, “Realtime 3D Position Detection of Human Pupil,” Proc. 2004 IEEE Symp. Virtual Environments, Human-Computer Interfaces and Measurement Systems, pp. 8-12, 2004.
[33] G.J. Edwards, T.F. Cootes, and C.J. Taylor, “Face Recognition Using Active Appearance Models,” Proc. Fifth European Conf. Computer Vision, vol. 2, pp. 581-95, 1998.
[34] A. Tomono et al., “Pupil Extraction Processing and Gaze Point Detection System Allowing Head Movement,” Trans. Inst. of Electronics, Information, and Comm. Eng. of Japan, vol. J76-D II, no. 3, 1993.
[35] I.R. Fasel, B. Fortenberry, and J.R. Movellan, “A Generative Framework for Real Time Object Detection and Classification,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 182-210, Apr. 2005.
[36] G.C. Feng and P.C. Yuen, “Variance Projection Function and Its Application to Eye Detection for Human Face Recognition,” Int'l J. Computer Vision, vol. 19, pp. 899-906, 1998.
[37] G.C. Feng and P.C. Yuen, “Multi-Cues Eye Detection on Gray Intensity Image,” Pattern Recognition, vol. 34, pp. 1033-1046, 2001.
[38] R.S. Feris, T.E. de Campos, and R.M. Cesar Jr., “Detection and Tracking of Facial Features in Video Sequences,” Proc. Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 127-135, 2000.
[39] V. Di Gesu and C. Valenti, “Symmetry Operators in Computer Vision,” Vistas in Astronomy, vol. 40, no. 4, pp. 461-468, 1996.
[40] Y. Gofman and N. Kiryati, “Detecting Symmetry in Gray Level Images: The Global Optimization Approach,” Proc. Int'l Conf. Pattern Recognition, 1996.
[41] J.H. Goldberg and A.M. Wichansky, Eye Tracking in Usability Evaluation: A Practitioner's Guide, pp. 493-516. Elsevier Science, 2003.
[42] K. Grauman, M. Betke, J. Gips, and G.R. Bradski, “Communication via Eye Blinks: Detection and Duration Analysis in Real Time,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, vol. I, pp. 1010-1017, 2001.
[43] E.D. Guestrin and M. Eizenman, “General Theory of Remote Gaze Estimation Using the Pupil Center and Corneal Reflections,” IEEE Trans. Biomedical Eng., vol. 53, no. 6, pp. 1124-1133, June 2006.
[44] P.W. Hallinan, “Recognizing Human Eyes,” Geometric Methods in Computer Vision, pp. 212-226, SPIE, 1991.
[45] D.W. Hansen, “Committing Eye Tracking,” PhD thesis, IT Univ. of Copenhagen, 2003.
[46] D. Witzner Hansen, J.P. Hansen, M. Nielsen, A.S. Johansen, and M.B. Stegmann, “Eye Typing Using Markov and Active Appearance Models,” Proc. IEEE Workshop Applications on Computer Vision, pp. 132-136, 2003.
[47] D.W. Hansen and A.E.C. Pece, “Eye Tracking in the Wild,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 182-210, Apr. 2005.
[48] D.W. Hansen, “Using Colors for Eye Tracking,” Color Image Processing: Methods and Applications, pp. 309-327, CRC Press, 2006.
[49] D.W. Hansen and R. Hammoud, “An Improved Likelihood Model for Eye Tracking,” Computer Vision and Image Understanding, 2007.
[50] D.W. Hansen and J.P. Hansen, “Robustifying Eye Interaction,” Proc. Conf. Vision for Human Computer Interaction, pp. 152-158, 2006.
[51] D.W. Hansen, H.H.T. Skovsgaard, J.P. Hansen, and E. Møllenbach, “Noise Tolerant Selection by Gaze-Controlled Pan and Zoom in 3D,” Proc. 2008 Symp. Eye Tracking Research and Applications, pp. 205-212, 2008.
[52] J.P. Hansen, K. Itoh, A.S. Johansen, K. Tørning, and A. Hirotaka, “Gaze Typing Compared with Input by Head and Hand,” Proc. Eye Tracking Research and Applications Symp. 2004, pp. 131-138, 2004.
[53] A. Haro, M. Flickner, and I. Essa, “Detecting and Tracking Eyes by Using Their Physiological Properties, Dynamics, and Appearance,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2000.
[54] J. Heinzmann and A. Zelinsky, “3D Facial Pose and Gaze Point Estimation Using a Robust Real-Timetracking Paradigm,” Proc. IEEE Int'l Conf. Automatic Face and Gesture Recognition, 1998.
[55] C. Hennessey, B. Noureddin, and P. Lawrence, “A Single Camera Eye-Gaze Tracking System with Free Head Motion,” Proc. Symp. Eye Tracking Research and Applications, pp. 87-94, 2006.
[56] C. Hennessey and P. Lawrence, “3D Point-of-Gaze Estimation on a Volumetric Display,” Proc. 2008 Symp. Eye Tracking Research and Applications, 2008.
[57] R. Herpers, M. Michaelis, K. Lichtenauer, and G. Sommer, “Edge and Keypoint Detection in Facial Regions,” Proc. Int'l Conf. Automatic Face and Gesture-Recognition, pp. 212-217, 1996.
[58] P.M. Hillman, J.M. Hannah, and P.M. Grant, “Global Fitting of a Facial Model to Facial Features for Model-Based Video Coding,” Proc. Third Int'l Symp. Image and Signal Processing and Analysis, vol. 1, pp. 359-364, 2003.
[59] J. Huang, D. Ii, X. Shao, and H. Wechsler, “Pose Discrimination and Eye Detection Using Support Vector Machines (SVMs),” Proc. Conf. NATO-ASI on Face Recognition: From Theory to Applications, pp. 528-536, 1998.
[60] J. Huang and H. Wechsler, “Eye Location Using Genetic Algorithms,” Proc. Second Int'l Conf. Audio and Video-Based Biometric Person Authentication, 1999.
[61] J. Huang and H. Wechsler, “Eye Detection Using Optimal Wavelet Packets and Radial Basis Functions (RBFs),” Int'l J. Pattern Recognition and Artificial Intelligence, vol. 13, no. 7, 1999.
[62] T.E. Hutchinson, “Human-Computer Interaction Using Eye-Gaze Input,” IEEE Trans. Systems, Man, and Cybernetics, vol. 19, no. 6, pp. 1527-1534, Nov./Dec. 1989.
[63] K. Hyoki, M. Shigeta, N. Tsuno, Y. Kawamuro, and T. Kinoshita, “Quantitative Electro-Oculography and Electroencephalography as Indices of Alertness,” Electroencephalography and Clinical Neurophysiology, vol. 106, pp. 213-219, 1998.
[64] A. Hyrskykari, P. Majaranta, A. Aaltonen, and K.-J. Räihä, “Design Issues of iDict: A Gaze-Assisted Translation Aid,” Proc. Symp. Eye Tracking Research and Applications 2000, pp. 9-14, 2000.
[65] A. Hyrskykari, P. Majaranta, and K.-J. Räihä, “Proactive Response to Eye Movements,” Proc. IFIP TC13 Int'l Conf. Human-Computer Interaction, M. Rauterberg, M. Menozzi, and J. Wesson, eds., pp.129-136, 2003.
[66] A. Hyrskykari, P. Majaranta, and K.-J. Räihä, “From Gaze Control to Attentive Interfaces,” Proc. 11th Int'l Conf. Human-Computer Interaction, 2005.
[67] T. Ishikawa, S. Baker, I. Matthews, and T. Kanade, “Passive Driver Gaze Tracking with Active Appearance Models,” Proc. 11th World Congress Intelligent Transportation Systems, Oct. 2004.
[68] J.P. Ivins and J. Porrill, “A Deformable Model of the Human Iris for Measuring Small 3-Dimensional Eye Movements,” Machine Vision and Applications, vol. 11, no. 1, pp. 42-51, 1998.
[69] Q. Ji and X. Yang, “Real-Time Eye, Gaze, and Face Pose Tracking for Monitoring Driver Vigilance,” Real-Time Imaging, vol. 8, no. 5, pp. 357-377, 2002.
[70] Q. Ji and Z. Zhu, “Eye and Gaze Tracking for Interactive Graphic Display,” Proc. Second Int'l Symp. Smart Graphics, pp. 79-85, 2002.
[71] M. Kampmann and L. Zhang, “Estimation of Eye, Eyebrow and Nose Features in Videophone Sequences,” Proc. Int'l Workshop Very Low Bitrate Video Coding, 1998.
[72] J.J. Kang, E.D. Guestrin, and E. Eizenman, “Investigation of the Cross-Ratio Method for Point-of-Gaze Estimation,” IEEE Trans. Biomedical Eng., 2008.
[73] F. Karmali and M. Shelhamer, “Compensating for Camera Translation in Video Eye Movement Recordings by Tracking a Landmark Selected Automatically by a Genetic Algorithm,” Proc. Ann. Int'l Conf. IEEE Eng. in Medicine and Biology, pp. 5298-5301, 2006.
[74] S. Kawato and N. Tetsutani, “Detection and Tracking of Eyes for Gaze-Camera Control,” Proc. Int'l Conf. Vision Interface, p. 348, 2002.
[75] S. Kawato and J. Ohya, “Real-Time Detection of Nodding and Head-Shaking by Directly Detecting and Tracking the Between-Eyes,” Proc. IEEE Fourth Int'l Conf. Automatic Face and Gesture Recognition, pp. 40-45, 2000.
[76] S. Kawato and J. Ohya, “Two-Step Approach for Real-Time Eye Tracking with a New Filtering Technique,” Proc. Int'l Conf. System, Man and Cybernetics, pp. 1366-1371, 2000.
[77] S. Kawato and N. Tetsutani, “Detection and Tracking of Eyes for Gaze-Camera Control,” Proc. 15th Int'l Conf. Vision Interface, 2002.
[78] S. Kawato and N. Tetsutani, “Real-Time Detection of Between-the-Eyes with a Circle Frequency Filter,” Proc. Asian Conf. Computer Vision '02, vol. II, pp. 442-447, 2002.
[79] K.-N. Kim and R.S. Ramakrishna, “Vision-Based Eye-Gaze Tracking for Human Computer Interface,” Proc. IEEE Int'l Conf. Systems, Man, and Cybernetics, 1999.
[80] S. Kim and Q. Ji, “Non-Intrusive Eye Gaze Tracking under Natural Head Movements,” Proc. 26th Ann. Int'l Conf. IEEE Eng. in Medicine and Biology, Sept. 2004.
[81] C. Kimme, D. Ballard, and J. Sklansky, “Finding Circles by an Array of Accumulators,” Comm. ACM, vol. 18, no. 2, pp. 120-122, Feb. 1975.
[82] I. King and L. Xu, “Localized Principal Component Analysis Learning for Face Feature Extraction and Recognition,” Proc. Workshop 3D Computer Vision, pp. 124-128, 1997.
[83] S.M. Kolakowski and J.B. Pelz, “Compensating for Eye Tracker Camera Movement,” Proc. 2006 Symp. Eye Tracking Research and Applications, pp. 79-85, 2006.
[84] R. Kothari and J.L. Mitchell, “Detection of Eye Locations in Unconstrained Visual Images,” Proc. Int'l Conf. Image Processing, vol. 3, pp. 519-522, 1996.
[85] P. Kovesi, “Symmetry and Asymmetry from Local Phase,” Proc. 10th Australian Joint Conf. Artificial Intelligence, pp. 185-190, 1997.
[86] K. Lam and H. Yan, “Locating and Extracting the Eye in Human Face Images,” Pattern Recognition, vol. 29, pp. 771-779, 1996.
[87] C. Lankford, “Effective Eye-Gaze Input into Windows,” Proc. Eye Tracking Research and Applications Symp. '00, pp. 23-27, 2000.
[88] J.L. Levine, “An Eye-Controlled Computer,” Technical Report RC-8857, IBM T.J. Watson Research Center, 1982.
[89] D. Li, D. Winfield, and D.J. Parkhurst, “Starburst: A Hybrid Algorithm for Video-Based Eye Tracking Combining Feature-Based and Model-Based Approaches,” Proc. Vision for Human-Computer Interaction Workshop, IEEE Computer Vision and Pattern Recognition Conf., 2005.
[90] C.-C. Lin and W.-C. Lin, “Extracting Facial Features by an Inhibitory Mechanism Based on Gradient Distribution,” Pattern Recognition, vol. 29, no. 12, pp. 2079-2101, 1996.
[91] P.J. Locher and C.F. Nodine, “Symmetry Catches the Eye,” Eye Movements—from Physiology to Cognition, pp. 353-361, 1987.
[92] P.J. Locher and C.F. Nodine, “The Perceptual Value of Symmetry,” Computers and Math. Applications, vol. 17, pp. 475-484, 1989.
[93] G. Loy and A. Zelinsky, “Fast Radial Symmetry for Detecting Points of Interest,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 25, no. 8, pp. 959-973, Aug. 2003.
[94] B.D. Lucas and T. Kanade, “An Iterative Image Registration Technique with an Application to Stereo Vision,” Proc. Int'l Joint Conf. Artificial Intelligence, 1981.
[95] P. Majaranta and K.-J. Räihä, “Twenty Years of Eye Typing: Systems and Design Issues,” Proc. Symp. Eye Tracking Research and Applications, pp. 15-22, 2002.
[96] T. Marui and Y. Ebisawa, “Eye Searching Technique for Video-Based Eye-Gaze Detection,” Proc. 20th Ann. Int'l Conf. IEEE Eng. in Medicine and Biology Soc. '98, vol. 2, pp. 744-747, 1998.
[97] Y. Matsumoto, T. Ogasawara, and A. Zelinsky, “Behaviour Recognition Based on Head Pose and Gaze Direction Measurement,” Proc. IEEE/RSJ Int'l Conf. Intelligent Robots and Systems, pp.2127-2132, 2000.
[98] Y. Matsumoto and A. Zelinsky, “An Algorithm for Real-Time Stereo Vision Implementation of Head Pose and Gaze Direction Measurement,” Proc. Int'l Conf. Automatic Face and Gesture Recognition, pp. 499-504, 2000.
[99] J. Merchant, R. Morrissette, and J. Porterfield, “Remote Measurements of Eye Direction Allowing Subject Motion over One Cubic Foot of Space,” IEEE Trans. Biomedical Eng., vol. 21, no. 4, pp. 309-317, July 1974.
[100] A. Meyer, M. Böhme, T. Martinetz, and E. Barth, “A Single-Camera Remote Eye Tracker,” Perception and Interactive Technologies, pp. 208-211, Springer, 2006.
[101] J.M. Miller, H.L. Hall, J.E. Greivenkamp, and D.L. Guyton, “Quantification of the Brückner Test for Strabismus,” Investigative Ophthalmology & Visual Science, vol. 36, no. 4, pp. 897-905, 1995.
[102] W.M. Huang and R. Mariani, “Face Detection and Precise Eyes Location,” Proc. Int'l Conf. Pattern Recognition, 2000.
[103] C.H. Morimoto, D. Koons, A. Amir, and M. Flickner, “Pupil Detection and Tracking Using Multiple Light Sources,” Image and Vision Computing, vol. 18, no. 4, pp. 331-335, 2000.
[104] C.H. Morimoto and M.R.M. Mimica, “Eye Gaze Tracking Techniques for Interactive Applications,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 4-24, Apr. 2005.
[105] C.H. Morimoto, A. Amir, and M. Flickner, “Detecting Eye Position and Gaze from a Single Camera and 2 Light Sources,” Proc. Int'l Conf. Pattern Recognition, 2002.
[106] C.H. Morimoto and M. Flickner, “Real-Time Multiple Face Detection Using Active Illumination,” Proc. Fourth IEEE Int'l Conf. Automatic Face and Gesture Recognition '00, 2000.
[107] C.H. Morimoto, D. Koons, A. Amir, and M. Flickner, “Pupil Detection and Tracking Using Multiple Light Sources,” Technical Report RJ-10117, IBM Almaden Research Center, 1998.
[108] P. Müller, D. Cavegn, G. d'Ydewalle, and R. Groner, “A Comparison of a New Limbus Tracker, Corneal Reflection Technique, Purkinje Eye Tracking and Electro-Oculography,” Perception and Cognition, G. d'Ydewalle and J.V. Rensbergen, eds., pp. 393-401, Elsevier Science Publishers, 1993.
[109] R. Newman, Y. Matsumoto, S. Rougeaux, and A. Zelinsky, “Real-Time Stereo Tracking for Head Pose and Gaze Estimation,” Proc. Int'l Conf. Automatic Face and Gesture Recognition, pp. 122-128, 2000.
[110] K. Nguyen, C. Wagner, D. Koons, and M. Flickner, “Differences in the Infrared Bright Pupil Response of Human Eyes,” Proc. Symp. Eye Tracking Research and Applications, pp. 133-138, 2002.
[111] M. Nixon, “Eye Spacing Measurement for Facial Recognition,” Proc. Conf. Soc. Photo-Optical Instrument Eng., 1985.
[112] B. Noureddin, P.D. Lawrence, and C.F. Man, “A Non-Contact Device for Tracking Gaze in a Human Computer Interface,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 52-82, 2005.
[113] T. Ohno and N. Mukawa, “A Free-Head, Simple Calibration, Gaze Tracking System That Enables Gaze-Based Interaction,” Proc. Eye Tracking Research and Applications Symp. 2004, pp. 115-122, 2004.
[114] T. Ohno, “One-Point Calibration Gaze Tracking Method,” Proc. 2006 Symp. Eye Tracking Research and Applications, pp. 34-34, 2006.
[115] T. Ohno, N. Mukawa, and A. Yoshikawa, “Freegaze: A Gaze Tracking System for Everyday Gaze Interaction,” Proc. Eye Tracking Research Applications Symp., pp. 125-132, 2002.
[116] K.R. Park, J.J. Lee, and J. Kim, “Facial and Eye Gaze Detection,” Biologically Motivated Computer Vision, pp. 368-376, Springer, 2002.
[117] A. Pentland, B. Moghaddam, and T. Starner, “View-Based and Modular Eigenspaces for Face Recognition,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, June 1994.
[118] A. Perez, M.L. Cordoba, A. Garcia, R. Mendez, M.L. Munoz, J.L. Pedraza, and F. Sanchez, “A Precise Eye-Gaze Detection and Tracking System,” J. WSCG, pp. 105-108, 2003.
[119] K. Rayner, C. Rotello, C. Steward, A. Keir, and A. Duffy, “When Looking at Print Advertisements,” J. Experimental Psychology: Applied, vol. 7, no. 3, pp. 219-226, 2001.
[120] M. Reinders, R. Koch, and J. Gerbrands, “Locating Facial Features in Image Sequences Using Neural Networks,” Proc. Second Int'l Conf. Automatic Face and Gesture Recognition, 1997.
[121] D. Reisfeld and Y. Yeshurun, “Robust Detection of Facial Features by Generalized Symmetry,” Proc. Int'l Conf. Pattern Recognition, vol. I, pp. 117-120, 1992.
[122] Arrington Research, http://www.arringtonresearch.com, 2007.
[123] RPI, http://www.ecse.rpi.edu/homepages/cvrl/database/database.html, 2008.
[124] F. Samaria and S. Young, “HMM-Based Architecture for Face Identification,” Image and Vision Computing, vol. 12, no. 8, pp. 537-543, 1994.
[125] D. Scott and J. Findlay, “Visual Search, Eye Movements and Display Units,” human factors report, 1993.
[126] G. Sela and M.D. Levine, “Real-Time Attention for Robotic Vision,” Real-Time Imaging, vol. 3, pp. 173-194, 1997.
[127] S.-W. Shih and J. Liu, “A Novel Approach to 3D Gaze Tracking Using Stereo Cameras,” IEEE Trans. Systems, Man, and Cybernetics, vol. 34, no. 1, pp. 234-245, Feb. 2004.
[128] S.-W. Shih, Y.-T. Wu, and J. Liu, “A Calibration-Free Gaze Tracking Technique,” Proc. 15th Int'l Conf. Pattern Recognition, pp. 201-204, 2000.
[129] S. Sirohey, A. Rosenfeld, and Z. Duric, “A Method of Detecting and Tracking Irises and Eyelids in Video,” Pattern Recognition, vol. 35, no. 6, pp. 1389-1401, June 2002.
[130] S.A. Sirohey and A. Rosenfeld, “Eye Detection in a Face Image Using Linear and Nonlinear Filters,” Pattern Recognition, vol. 34, pp. 1367-1391, 2001.
[131] D.M. Stampe, “Heuristic Filtering and Reliable Calibration Methods for Video-Based Pupil-Tracking Systems,” Behaviour Research Methods, Instruments & Computers, vol. 25, no. 2, pp. 137-142, 1993.
[132] R. Stiefelhagen, J. Yang, and A. Waibel, “A Model-Based Gaze Tracking System,” Proc. IEEE Int'l Joint Symp. Intelligence and Systems, pp. 304-310, 1996.
[133] R. Stiefelhagen, J. Yang, and A. Waibel, “Tracking Eyes and Monitoring Eye Gaze,” Proc. Workshop Perceptual User Interfaces, pp. 98-100, 1997.
[134] A. Sugioka, Y. Ebisawa, and M. Ohtani, “Noncontact Video-Based Eye-Gaze Detection Method Allowing Large Head Displacements,” Proc. Ann. Int'l Conf. IEEE Eng. in Medicine and Biology, vol. 2, pp. 526-528, 1996.
[135] K. Talmi and J. Liu, “Eye and Gaze Tracking for Visually Controlled Interactive Stereoscopic Displays,” Signal Processing: Image Comm., vol. 14, no. 10, pp. 799-810, 1999.
[136] K.-H. Tan, D.J. Kriegman, and N. Ahuja, “Appearance-Based Eye Gaze Estimation,” Proc. Sixth IEEE Workshop Applications of Computer Vision '02, pp. 191-195, 2002.
[137] J.H. Ten Kate, E.E.E. Frietman, W. Willems, B.M. Ter Haar Romeny, and E. Tenkink, “Eye-Switch Controlled Communication Aids,” Proc. 12th Int'l Conf. Medical and Biological Eng., 1979.
[138] Y. Tian, T. Kanade, and J.F. Cohn, “Dual-State Parametric Eye Tracking,” Proc. Fourth IEEE Int'l Conf. Automatic Face and Gesture Recognition, 2000.
[139] A. Tomono, M. Iida, and Y. Kobayashi, “A TB Camera System which Extracts Feature Points for Non-Contact Eye Movement Detection,” Proc. SPIE Optics, Illumination, and Image Sensing for Machine Vision, pp. 2-12, 1989.
[140] D. Tweed and T. Vilis, “Geometric Relations of Eye Position and Velocity Vectors during Saccades,” Vision Research, vol. 30, no. 1, pp. 111-127, 1990.
[141] G. Underwood, Cognitive Processes in Eye Guidance. Oxford Univ. Press, 2005.
[142] R. Valenti and T. Gevers, “Accurate Eye Center Location and Tracking Using Isophote Curvature,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, 2008.
[143] R. Vertegaal, I. Weevers, and C. Sohn, “GAZE-2: An Attentive Video Conferencing System,” Proc. CHI '02: Extended Abstracts on Human Factors in Computing Systems, pp. 736-737, 2002.
[144] A. Villanueva, R. Cabeza, and S. Porta, “Eye Tracking: Pupil Orientation Geometrical Modeling,” Image and Vision Computing, vol. 24, no. 7, pp. 663-679, July 2006.
[145] A. Villanueva, R. Cabeza, and S. Porta, “Gaze Tracking System Model Based on Physical Parameter,” Int'l J. Pattern Recognition and Artificial Intelligence, 2007.
[146] A. Villanueva and R. Cabeza, “Models for Gaze Tracking Systems,” J. Image and Video Processing, vol. 2007, no. 3, 2007.
[147] P. Viola and M. Jones, “Robust Real-Time Face Detection,” Proc. Int'l Conf. Computer Vision, vol. II, p. 747, 2001.
[148] R. Wagner and H.L. Galiana, “Evaluation of Three Template Matching Algorithms for Registering Images of the Eye,” IEEE Trans. Biomedical Eng., vol. 12, pp. 1313-1319, Dec. 1992.
[149] J. Waite and J.M. Vincent, “A Probabilistic Framework for Neural Network Facial Feature Location,” British Telecom Technology J., vol. 10, no. 3, pp. 20-29, 1992.
[150] J.G. Wang and E. Sung, “Gaze Determination via Images of Irises,” Image and Vision Computing, vol. 19, no. 12, pp. 891-911, 2001.
[151] J.G. Wang, E. Sung, and R. Venkateswarlu, “Estimating the Eye Gaze from One Eye,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 83-103, Apr. 2005.
[152] P. Wang, M.B. Green, Q. Ji, and J. Wayman, “Automatic Eye Detection and Its Validation,” Proc. 2005 IEEE CS Conf. Computer Vision and Pattern Recognition, vol. 3, pp. 164-164, 2005.
[153] P. Wang and Q. Ji, “Learning Discriminant Features for Multi-View Face and Eye Detection,” Proc. IEEE CS Conf. Computer Vision and Pattern Recognition, vol. 1, pp. 373-379, 2005.
[154] D.J. Ward and D.J.C. MacKay, “Fast Hands-Free Writing by Gaze Direction,” Nature, vol. 418, no. 6900, p. 838, 2002.
[155] M. Wedel and R. Pieters, “Eye Fixations on Advertisements and Memory for Brands: A Model and Findings,” Marketing Science, vol. 19, no. 4, pp. 297-312, 2000.
[156] K.P. White Jr., T.E. Hutchinson, and J.M. Carley, “Spatially Dynamic Calibration of an Eye-Tracking System,” IEEE Trans. Systems, Man, and Cybernetics, vol. 23, no. 4, pp. 1162-1168, July/Aug. 1993.
[157] O.M.C. Williams, A. Blake, and R. Cipolla, “Sparse and Semi-Supervised Visual Mapping with the S$^{3}$GP,” Proc. IEEE CS Conf. Computer Vision and Pattern Recognition, pp. 230-237, 2006.
[158] X. Xie, R. Sudhakar, and H. Zhuang, “On Improving Eye Feature-Extraction Using Deformable Templates,” Pattern Recognition, vol. 27, no. 6, pp. 791-799, June 1994.
[159] X. Xie, R. Sudhakar, and H. Zhuang, “A Cascaded Scheme for Eye Tracking and Head Movement Compensation,” IEEE Trans. Systems, Man, and Cybernetics, vol. 28, no. 4, pp. 487-490, July 1998.
[160] L.Q. Xu, D. Machin, and P. Sheppard, “A Novel Approach to Real-Time Non-Intrusive Gaze Finding,” Proc. British Machine Vision Conf., 1998.
[161] H. Yamazoe, A. Utsumi, T. Yonezawa, and S. Abe, “Remote Gaze Estimation with a Single Camera Based on Facial-Feature Tracking without Special Calibration Actions,” Proc. 2008 Symp. Eye Tracking Research and Applications, pp. 140-145, 2008.
[162] J. Yang, R. Stiefelhagen, U. Meier, and A. Waibel, “Real-Time Face and Facial Feature Tracking and Applications,” Proc. Conf. Auditory-Visual Speech Processing, pp. 79-84, 1998.
[163] A. Yarbus, Eye Movements and Vision. Plenum Press, 1967.
[164] D.H. Yoo and M.J. Chung, “A Novel Non-Intrusive Eye Gaze Estimation Using Cross-Ratio under Large Head Motion,” Computer Vision and Image Understanding, vol. 98, no. 1, pp. 25-51, Apr. 2005.
[165] D. Young, H. Tunley, and R. Samuels, “Specialised Hough Transform and Active Contour Methods for Real-Time Eye Tracking,” Technical Report 386, School of Cognitive and Computing Sciences, Univ. of Sussex, 1995.
[166] A. Yuille, P. Hallinan, and D. Cohen, “Feature Extraction from Faces Using Deformable Templates,” Int'l J. Computer Vision, vol. 8, no. 2, pp. 99-111, 1992.
[167] L. Zhang, “Estimation of Eye and Mouth Corner Point Positions in a Knowledge-Based Coding System,” Proc. SPIE, pp. 21-18, 1996.
[168] Z. Zhu, K. Fujimura, and Q. Ji, “Real-Time Eye Detection and Tracking under Various Light Conditions,” Proc. Eye Tracking Research and Applications Symp., 2002.
[169] Z. Zhu, Q. Ji, and K. Fujimura, “Combining Kalman Filtering and Mean Shift for Real Time Eye Tracking,” Proc. Int'l Conf. Pattern Recognition, vol. IV, pp. 318-321, 2002.
[170] Z. Zhu and Q. Ji, “Novel Eye Gaze Tracking Techniques under Natural Head Movement,” IEEE Trans. Biomedical Eng., vol. 54, no. 12, pp. 2246-2260, Dec. 2007.
[171] Z. Zhu, Q. Ji, and K.P. Bennett, “Nonlinear Eye Gaze Mapping Function Estimation via Support Vector Regression,” Proc. 18th Int'l Conf. Pattern Recognition, vol. 1, pp. 1132-1135, 2006.