Automatically Analyzing Facial-Feature Movements to Identify Human Errors
March/April 2011 (vol. 26 no. 2)
pp. 54-63
Maria E. Jabon, Stanford University
Sun Joo Ahn, Stanford University
Jeremy N. Bailenson, Stanford University

Using facial feature points automatically extracted from short video segments, researchers couple computer vision with machine learning to predict performance over an entire task and at any given instant within the task.
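The abstract gives only a high-level description of the approach. As a rough, hypothetical illustration of that kind of pipeline, and not the authors' actual code, feature layout, or data, the Python sketch below summarizes tracked facial-feature-point coordinates over each short video segment and feeds the summaries to a boosted classifier that predicts whether an error occurred in that segment. The 22-point tracker, the mean/standard-deviation segment statistics, the synthetic data, and the choice of scikit-learn's GradientBoostingClassifier are all assumptions made for illustration.

# Minimal sketch: per-segment facial-feature statistics -> error/no-error classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Assume a tracker yields (x, y) positions for 22 facial feature points per frame;
# each short video segment is summarized by the mean and standard deviation of
# every coordinate (22 points * 2 coords * 2 stats = 88 features). Synthetic here.
n_segments, n_points = 500, 22
segment_features = rng.normal(size=(n_segments, n_points * 2 * 2))

# Binary label: did the participant commit an error during this segment? (Synthetic.)
error_labels = rng.integers(0, 2, size=n_segments)

# Boosted decision trees stand in for the boosting-style learners cited in the references.
clf = GradientBoostingClassifier(n_estimators=100, max_depth=3, random_state=0)
scores = cross_val_score(clf, segment_features, error_labels, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.2f}")

With real labeled segments in place of this synthetic data, the cross-validated score would indicate how well facial-feature movements alone predict errors; here it simply hovers at chance.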

1. P. Salmon et al., "Predicting Design Induced Pilot Error: A Comparison of SHERPA, Human Error HAZOP, HEIST and HET, a Newly Developed Aviation Specific HEI Method," Human-Centered Computing: Cognitive, Social, and Ergonomic Aspects, Lawrence Erlbaum Associates, vol. 3, 2003, pp. 567–571.
2. P. Ekman and E.L. Rosenberg, What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS), Oxford Univ. Press, 1997.
3. I.H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2nd ed., Morgan Kaufmann, 2005.
4. P. Kosla, A Feature Selection Approach in Problems with a Great Number of Features, Springer, 2008, pp. 394–401.
5. X. Jin et al., "Machine Learning Techniques and Chi-Square Feature Selection for Cancer Classification Using SAGE Gene Expression Profile," Proc. Data Mining for Biomedical Applications, LNBI 3916, Springer, 2006, pp. 106–115.
6. R. Kohavi, "The Power of Decision Tables," Proc. 8th European Conf. Machine Learning, Springer-Verlag, 1995, pp. 174–189.
7. J.H. Friedman, T. Hastie, and R. Tibshirani, "Additive Logistic Regression: A Statistical View of Boosting," Annals of Statistics, vol. 28, no. 2, 2000, pp. 337–407.
8. J.W. Senders and N. Moray, Human Error: Cause, Prediction, and Reduction, Lawrence Erlbaum Associates, 1991.
9. A. Isaac, "Human Error in European Air Traffic Management: The HERA Project," Reliability Eng. and System Safety, vol. 75, no. 2, 2002, pp. 257–272.
1. Z. Zeng et al., "A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 31, no. 1, 2009, pp. 39–58.
2. J.N. Bailenson et al., "Real-Time Classification of Evoked Emotions Using Facial Feature Tracking and Physiological Responses," Int'l J. Human-Computer Studies, vol. 66, no. 5, 2008, pp. 303–317.
3. R. Picard and J. Klein, "Computers that Recognize and Respond to User Emotion: Theoretical and Practical Implications," Interacting with Computers, vol. 14, no. 2, 2002, pp. 144–169.
4. Y.S. Shin, "Recognizing Facial Expressions with PCA and ICA onto Dimension of the Emotion," Structural, Syntactic, and Statistical Pattern Recognition, Springer, 2006, pp. 916–922.
5. R. El Kaliouby and P. Robinson, "Mind Reading Machines: Automated Inference of Cognitive Mental States from Video," Proc. IEEE Int'l Conf. Systems, Man and Cybernetics, vol. 1, IEEE Press, 2004, pp. 682–688.
6. R. Picard, Affective Computing, MIT Press, 1997.
7. R.W. Picard and K.K. Liu, "Relative Subjective Count and Assessment of Interruptive Technologies Applied to Mobile Monitoring of Stress," Int'l J. Human-Computer Studies, vol. 65, no. 4, 2007, pp. 361–375.
8. A. Kapoor, W. Burleson, and R. Picard, "Automatic Prediction of Frustration," Int'l J. Human-Computer Studies, vol. 65, no. 8, 2007, pp. 724–736.
9. S. D'Mello et al., "AutoTutor Detects and Responds to Learners' Affective and Cognitive States," Proc. Workshop Emotional and Cognitive Issues at Int'l Conf. Intelligent Tutoring Systems, 2008, pp. 31–43; http://affect.media.mit.edu/pdfs08.dmello-etal-autotutor.pdf .
10. M. Madsen et al., "Technology for Just-In-Time In-Situ Learning of Facial Affect for Persons Diagnosed with an Autism Spectrum Disorder," Proc. 10th ACM Conf. Computers and Accessibility (ASSETS), ACM Press, 2008, pp. 19–26.
11. T.O. Meservy et al., "Deception Detection through Automatic, Unobtrusive Analysis of Nonverbal Behavior," IEEE Intelligent Systems, vol. 20, no. 5, 2005, pp. 36–43.

Index Terms:
Intelligent systems, face and gesture recognition, video analysis, feature representation, decision support
Citation:
Maria E. Jabon, Sun Joo Ahn, Jeremy N. Bailenson, "Automatically Analyzing Facial-Feature Movements to Identify Human Errors," IEEE Intelligent Systems, vol. 26, no. 2, pp. 54-63, March-April 2011, doi:10.1109/MIS.2009.106