Issue No. 4, April 2011 (vol. 33)
pp. 741-753
Andreas Bulling , University of Cambridge, Cambridge and Lancaster University, Lancaster
Jamie A. Ward , Lancaster University, Lancaster
Hans Gellersen , Lancaster University, Lancaster
Gerhard Tröster , Swiss Federal Institute of Technology (ETH), Zurich
ABSTRACT
In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals—saccades, fixations, and blinks—and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method in an eight-participant study in an office environment, using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
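The evaluation setup described in the abstract (per-window eye movement features, an SVM classifier, leave-one-person-out training, and precision/recall averaged over classes and participants) can be illustrated with a minimal sketch. The following Python/scikit-learn code is not the authors' implementation: the feature matrix is random placeholder data, and the toy saccade-count feature and its threshold are hypothetical; it only shows the structure of a person-independent evaluation.

# Minimal sketch of leave-one-person-out evaluation of an SVM activity
# classifier, loosely following the setup described in the abstract.
# NOT the authors' implementation: the feature matrix is random placeholder
# data, and the saccade-count feature and its threshold are hypothetical.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import precision_score, recall_score


def saccade_count(eog_window, threshold=0.5):
    """Toy eye movement feature: number of samples whose first derivative
    exceeds a fixed (hypothetical) threshold in one EOG signal window."""
    return int(np.sum(np.abs(np.diff(eog_window)) > threshold))


# X: one row of eye movement features per time window (90 features in the paper),
# y: activity label per window (5 activities + NULL),
# groups: participant id per window (8 participants).
rng = np.random.default_rng(0)
X = rng.normal(size=(800, 90))
y = rng.integers(0, 6, size=800)
groups = rng.integers(0, 8, size=800)

clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=5000))

precisions, recalls = [], []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    # Train on seven participants, test on the held-out one.
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    precisions.append(precision_score(y[test_idx], pred,
                                      average="macro", zero_division=0))
    recalls.append(recall_score(y[test_idx], pred,
                                average="macro", zero_division=0))

print(f"mean precision: {np.mean(precisions):.3f}, "
      f"mean recall: {np.mean(recalls):.3f}")

Macro-averaging precision and recall over the six classes within each fold, then averaging over the held-out participants, mirrors the "average over all classes and participants" figure reported in the abstract; the mRMR feature selection step mentioned there is omitted here for brevity.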
INDEX TERMS
Ubiquitous computing, feature evaluation and selection, pattern analysis, signal processing.
CITATION
Andreas Bulling, Jamie A. Ward, Hans Gellersen, Gerhard Tröster, "Eye Movement Analysis for Activity Recognition Using Electrooculography," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 4, pp. 741-753, April 2011, doi:10.1109/TPAMI.2010.86
REFERENCES
[1] S. Mitra and T. Acharya, "Gesture Recognition: A Survey," IEEE Trans. Systems, Man, and Cybernetics, Part C: Applications and Rev., vol. 37, no. 3, pp. 311-324, May 2007.
[2] P. Turaga, R. Chellappa, V.S. Subrahmanian, and O. Udrea, "Machine Recognition of Human Activities: A Survey," IEEE Trans. Circuits and Systems for Video Technology, vol. 18, no. 11, pp. 1473-1488, Nov. 2008.
[3] B. Najafi, K. Aminian, A. Paraschiv-Ionescu, F. Loew, C.J. Bula, and P. Robert, "Ambulatory System for Human Motion Analysis Using a Kinematic Sensor: Monitoring of Daily Physical Activity in the Elderly," IEEE Trans. Biomedical Eng., vol. 50, no. 6, pp. 711-723, June 2003.
[4] J.A. Ward, P. Lukowicz, G. Tröster, and T.E. Starner, "Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp. 1553-1567, Oct. 2006.
[5] N. Kern, B. Schiele, and A. Schmidt, "Recognizing Context for Annotating a Live Life Recording," Personal and Ubiquitous Computing, vol. 11, no. 4, pp. 251-263, 2007.
[6] A. Bulling, J.A. Ward, H. Gellersen, and G. Tröster, "Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography," Proc. Sixth Int'l Conf. Pervasive Computing, pp. 19-37, 2008.
[7] S.P. Liversedge and J.M. Findlay, "Saccadic Eye Movements and Cognition," Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6-14, 2000.
[8] J.M. Henderson, "Human Gaze Control during Real-World Scene Perception," Trends in Cognitive Sciences, vol. 7, no. 11, pp. 498-504, 2003.
[9] A. Bulling, D. Roggen, and G. Tröster, "Wearable EOG Goggles: Seamless Sensing and Context-Awareness in Everyday Environments," J. Ambient Intelligence and Smart Environments, vol. 1, no. 2, pp. 157-171, 2009.
[10] A. Bulling, J.A. Ward, H. Gellersen, and G. Tröster, "Eye Movement Analysis for Activity Recognition," Proc. 11th Int'l Conf. Ubiquitous Computing, pp. 41-50, 2009.
[11] Q. Ding, K. Tong, and G. Li, "Development of an EOG (Electro-Oculography) Based Human-Computer Interface," Proc. 27th Int'l Conf. Eng. in Medicine and Biology Soc., pp. 6829-6831, 2005.
[12] Y. Chen and W.S. Newman, "A Human-Robot Interface Based on Electrooculography," Proc. IEEE Int'l Conf. Robotics and Automation, vol. 1, pp. 243-248, 2004.
[13] W.S. Wijesoma, K.S. Wee, O.C. Wee, A.P. Balasuriya, K.T. San, and K.K. Soon, "EOG Based Control of Mobile Assistive Platforms for the Severely Disabled," Proc. IEEE Int'l Conf. Robotics and Biomimetics, pp. 490-494, 2005.
[14] R. Barea, L. Boquete, M. Mazo, and E. Lopez, "System for Assisted Mobility Using Eye Movements Based on Electrooculography," IEEE Trans. Neural Systems and Rehabilitation Eng., vol. 10, no. 4, pp. 209-218, Dec. 2002.
[15] M.M. Hayhoe and D.H. Ballard, "Eye Movements in Natural Behavior," Trends in Cognitive Sciences, vol. 9, pp. 188-194, 2005.
[16] S.S. Hacisalihzade, L.W. Stark, and J.S. Allen, "Visual Perception and Sequences of Eye Movement Fixations: A Stochastic Modeling Approach," IEEE Trans. Systems, Man, and Cybernetics, vol. 22, no. 3, pp. 474-481, May/June 1992.
[17] M. Elhelw, M. Nicolaou, A. Chung, G.-Z. Yang, and M.S. Atkins, "A Gaze-Based Study for Investigating the Perception of Visual Realism in Simulated Scenes," ACM Trans. Applied Perception, vol. 5, no. 1, pp. 1-20, 2008.
[18] L. Dempere-Marco, X. Hu, S.L.S. MacDonald, S.M. Ellis, D.M. Hansell, and G.-Z. Yang, "The Use of Visual Search for Knowledge Gathering in Image Decision Support," IEEE Trans. Medical Imaging, vol. 21, no. 7, pp. 741-754, July 2002.
[19] D.D. Salvucci and J.R. Anderson, "Automated Eye-Movement Protocol Analysis," Human-Computer Interaction, vol. 16, no. 1, pp. 39-86, 2001.
[20] D. Abowd, A. Dey, R. Orr, and J. Brotherton, "Context-Awareness in Wearable and Ubiquitous Computing," Virtual Reality, vol. 3, no. 3, pp. 200-211, 1998.
[21] L. Bao and S.S. Intille, "Activity Recognition from User-Annotated Acceleration Data," Proc. Second Int'l Conf. Pervasive Computing, pp. 1-17, 2004.
[22] B. Logan, J. Healey, M. Philipose, E. Tapia, and S.S. Intille, "A Long-Term Evaluation of Sensing Modalities for Activity Recognition," Proc. Ninth Int'l Conf. Ubiquitous Computing, pp. 483-500, 2007.
[23] F.T. Keat, S. Ranganath, and Y.V. Venkatesh, "Eye Gaze Based Reading Detection," Proc. IEEE Conf. Convergent Technologies for the Asia-Pacific Region, vol. 2, pp. 825-828, 2003.
[24] M. Brown, M. Marmor, and Vaegan, "ISCEV Standard for Clinical Electro-Oculography (EOG)," Documenta Ophthalmologica, vol. 113, no. 3, pp. 205-212, 2006.
[25] A.T. Duchowski, Eye Tracking Methodology: Theory and Practice. Springer-Verlag New York, Inc., 2007.
[26] B.R. Manor and E. Gordon, "Defining the Temporal Threshold for Ocular Fixation in Free-Viewing Visuocognitive Tasks," J. Neuroscience Methods, vol. 128, nos. 1/2, pp. 85-93, 2003.
[27] C.N. Karson, K.F. Berman, E.F. Donnelly, W.B. Mendelson, J.E. Kleinman, and R.J. Wyatt, "Speaking, Thinking, and Blinking," Psychiatry Research, vol. 5, no. 3, pp. 243-246, 1981.
[28] R. Schleicher, N. Galley, S. Briest, and L. Galley, "Blinks and Saccades as Indicators of Fatigue in Sleepiness Warnings: Looking Tired?" Ergonomics, vol. 51, no. 7, pp. 982-1010, 2008.
[29] H.R. Schiffman, Sensation and Perception: An Integrated Approach, fifth ed. John Wiley & Sons, 2001.
[30] J.J. Gu, M. Meng, A. Cook, and G. Faulkner, "A Study of Natural Eye Movement Detection and Ocular Implant Movement Control Using Processed EOG Signals," Proc. IEEE Int'l Conf. Robotics and Automation, vol. 2, pp. 1555-1560, 2001.
[31] N. Pan, V.M. I, M.P. Un, and P.S. Hang, "Accurate Removal of Baseline Wander in ECG Using Empirical Mode Decomposition," Proc. Joint Meeting Sixth Int'l Symp. Noninvasive Functional Source Imaging of the Brain and Heart and the Int'l Conf. Functional Biomedical Imaging, pp. 177-180, 2007.
[32] V.S. Chouhan and S.S. Mehta, "Total Removal of Baseline Drift from ECG Signal," Proc. 17th Int'l Conf. Computer Theory and Applications, pp. 512-515, 2007.
[33] L. Xu, D. Zhang, and K. Wang, "Wavelet-Based Cascaded Adaptive Filter for Removing Baseline Drift in Pulse Waveforms," IEEE Trans. Biomedical Eng., vol. 52, no. 11, pp. 1973-1975, Nov. 2005.
[34] M.A. Tinati and B. Mozaffary, "A Wavelet Packets Approach to Electrocardiograph Baseline Drift Cancellation," Int'l J. Biomedical Imaging, vol. 2006, pp. 1-9, 2006.
[35] D.L. Donoho, "De-Noising by Soft-Thresholding," IEEE Trans. Information Theory, vol. 41, no. 3, pp. 613-627, May 1995.
[36] D.D. Salvucci and J.H. Goldberg, "Identifying Fixations and Saccades in Eye-Tracking Protocols," Proc. Symp. Eye Tracking Research & Applications, pp. 71-78, 2000.
[37] H. Peng, F. Long, and C. Ding, "Feature Selection Based on Mutual Information Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, Aug. 2005.
[38] H. Peng, "mRMR Feature Selection Toolbox for MATLAB," http://research.janelia.org/peng/proj/mRMR/, Feb. 2008.
[39] K. Crammer and Y. Singer, "Ultraconservative Online Algorithms for Multiclass Problems," J. Machine Learning Research, vol. 3, pp. 951-991, 2003.
[40] C.-J. Lin, "LIBLINEAR—A Library for Large Linear Classification," http://www.csie.ntu.edu.tw/~cjlin/liblinear/, Feb. 2008.
[41] D. Bannach, P. Lukowicz, and O. Amft, "Rapid Prototyping of Activity Recognition Applications," IEEE Pervasive Computing, vol. 7, no. 2, pp. 22-31, Apr.-June 2008.
[42] R.L. Canosa, "Real-World Vision: Selective Perception and Task," ACM Trans. Applied Perception, vol. 6, no. 2, pp. 1-34, 2009.
[43] D. Palomba, M. Sarlo, A. Angrilli, A. Mini, and L. Stegagno, "Cardiac Responses Associated with Affective Processing of Unpleasant Film Stimuli," Int'l J. Psychophysiology, vol. 36, no. 1, pp. 45-57, 2000.
[44] A. Bulling, D. Roggen, and G. Tröster, "It's in Your Eyes—Towards Context-Awareness and Mobile HCI Using Wearable EOG Goggles," Proc. 10th Int'l Conf. Ubiquitous Computing, pp. 84-93, 2008.
[45] T. Huynh, M. Fritz, and B. Schiele, "Discovery of Activity Patterns Using Topic Models," Proc. 10th Int'l Conf. Ubiquitous Computing, pp. 10-19, 2008.
[46] A. Bulling, D. Roggen, and G. Tröster, "What's in the Eyes for Context-Awareness?" IEEE Pervasive Computing, 2010, doi:10.1109/MPRV.2010.49.