Issue No. 01 - January-March (2014 vol. 5)
Jacob Whitehill , Machine Perception Lab. (MPLab), Univ. of California, San Diego, La Jolla, CA, USA
Zewelanji Serpell , Dept. of Psychol., Virginia Commonwealth Univ., Richmond, VA, USA
Yi-Ching Lin , Dept. of Psychol., Virginia State Univ., Petersburg, VA, USA
Aysha Foster , Dept. of Psychol., Virginia State Univ., Petersburg, VA, USA
Javier R. Movellan , MPLab & Emotient, Inc., La Jolla, CA, USA
Student engagement is a key concept in contemporary education, where it is valued as a goal in its own right. In this paper we explore approaches for automatic recognition of engagement from students' facial expressions. We studied whether human observers can reliably judge engagement from the face; analyzed the signals observers use to make these judgments; and automated the process using machine learning. We found that human observers reliably agree when discriminating low versus high degrees of engagement (Cohen's κ = 0.96). When fine discrimination is required (four distinct levels) the reliability decreases, but is still quite high (κ = 0.56). Furthermore, we found that engagement labels of 10-second video clips can be reliably predicted from the average labels of their constituent frames (Pearson r = 0.85), suggesting that static expressions contain the bulk of the information used by observers. We used machine learning to develop automatic engagement detectors and found that for binary classification (e.g., high engagement versus low engagement), automated engagement detectors perform with comparable accuracy to humans. Finally, we show that both human and automatic engagement judgments correlate with task performance. In our experiment, student post-test performance was predicted with comparable accuracy from engagement labels (r = 0.47) as from pre-test scores (r = 0.44).
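The two reliability statistics quoted above are standard measures: Cohen's κ corrects inter-rater agreement for chance, and Pearson r measures linear correlation. As a minimal illustration of how such figures are computed (the label data below is hypothetical, not from the paper), both can be implemented directly:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance from each rater's label marginals."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

def pearson_r(x, y):
    """Pearson correlation coefficient between two numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical example: two raters' binary engagement labels for six clips.
rater1 = [1, 0, 1, 1, 0, 0]
rater2 = [1, 0, 1, 0, 0, 1]
print(cohens_kappa(rater1, rater2))

# Hypothetical example: clip-level labels vs. mean per-frame labels.
print(pearson_r([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```

In practice one would use library routines (e.g., scikit-learn's `cohen_kappa_score` and SciPy's `pearsonr`) rather than hand-rolled versions; the explicit formulas are shown only to make the quoted statistics concrete.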
Training, Labeling, Games, Software, Tablet computers, Observers, Reliability
J. Whitehill, Z. Serpell, Yi-Ching Lin, A. Foster and J. R. Movellan, "The Faces of Engagement: Automatic Recognition of Student Engagement from Facial Expressions," in IEEE Transactions on Affective Computing, vol. 5, no. 1, pp. 86-98, 2014.