Toward an Affect-Sensitive AutoTutor
IEEE Intelligent Systems, July/August 2007 (vol. 22, no. 4), pp. 53-61
Sidney D'Mello, University of Memphis
Rosalind W. Picard, MIT Media Laboratory
Arthur Graesser, University of Memphis
Abstract:
Emotions (affective states) are inextricably bound to the learning process, along with cognition, motivation, discourse, action, and the environment. Augmenting an intelligent tutoring system with the ability to incorporate such states into its pedagogical strategies can improve learning. Two studies used observational and "emote aloud" protocols to identify learners' affective states as they interacted with AutoTutor. A third study used sensors to collect training and validation data during AutoTutor sessions, drawing on learners' conversational cues, gross body language, and facial expressions. By adapting its instructional strategies to these detected states, an affect-sensitive AutoTutor could promote learning. This article is part of a special issue on intelligent educational systems.
Index Terms:
affect detection, emotion detection, affect-sensitive ITS, facial features, body posture, dialogue features, AutoTutor
Citation:
Sidney D'Mello, Rosalind W. Picard, Arthur Graesser, "Toward an Affect-Sensitive AutoTutor," IEEE Intelligent Systems, vol. 22, no. 4, pp. 53-61, July-Aug. 2007, doi:10.1109/MIS.2007.79