Issue No. 3 - July-September 2012 (vol. 3)
pp. 298-310
Omar AlZoubi, The University of Sydney, Sydney
Sidney K. D'Mello, The University of Notre Dame, Notre Dame
Rafael A. Calvo, The University of Sydney, Sydney
Signals from peripheral physiology (e.g., ECG, EMG, and GSR), in conjunction with machine learning techniques, can be used for the automatic detection of affective states. An affect detector can be user-independent, where it is expected to generalize to novel users, or user-dependent, where it is tailored to a specific user. Previous studies have reported some success in detecting affect from physiological signals, but much of this work has focused on induced affect or acted expressions rather than contextually constrained, spontaneous expressions of affect. This study addresses these issues by developing and evaluating user-independent and user-dependent physiology-based detectors of nonbasic affective states (e.g., boredom, confusion, curiosity) that were trained and validated on naturalistic data collected during interactions between 27 students and AutoTutor, an intelligent tutoring system with conversational dialogues. There is also no consensus on which techniques (i.e., feature selection or classification methods) work best for this type of data, so this study further evaluates the efficacy of affect detection using a host of feature selection and classification techniques on three physiological signals (ECG, EMG, and GSR) and their combinations. Two feature selection methods and nine classifiers were applied to the problem of recognizing eight affective states (boredom, confusion, curiosity, delight, flow/engagement, surprise, and neutral). The results indicated that the user-independent modeling approach was not feasible; however, a mean kappa score of 0.25 was obtained for user-dependent models that discriminated among the most frequent emotions. The results also indicated that k-nearest neighbor and Linear Bayes Normal Classifier (LBNC) classifiers yielded the best affect detection rates. Single-channel ECG, EMG, and GSR models and three-channel multimodal models were generally more diagnostic than two-channel models.
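The pipeline the abstract describes (windowed statistical features from physiological channels, a classifier such as k-nearest neighbor, and evaluation with Cohen's kappa to correct for chance agreement) can be sketched as follows. This is a minimal illustration with synthetic signals and hypothetical feature choices, not the authors' actual implementation or data:

```python
# Minimal sketch of a user-dependent affect-detection loop in the spirit of
# the paper's pipeline: windowed features -> k-NN classifier -> Cohen's kappa.
# The features (mean, std, min, max per window) are a common but assumed
# choice for ECG/EMG/GSR; the study's exact feature set differs.
import numpy as np

def window_features(signal, win):
    """Split a 1-D signal into fixed-length windows and compute simple
    statistical features (mean, std, min, max) per window."""
    n = len(signal) // win
    wins = signal[:n * win].reshape(n, win)
    return np.column_stack([wins.mean(1), wins.std(1), wins.min(1), wins.max(1)])

def knn_predict(train_X, train_y, test_X, k=3):
    """Plain k-nearest-neighbor prediction with Euclidean distance and
    majority voting over the k closest training instances."""
    preds = []
    for x in test_X:
        d = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(d)[:k]]
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

def cohens_kappa(y_true, y_pred):
    """Cohen's kappa: observed agreement corrected for chance agreement
    expected from the two label distributions."""
    labels = np.unique(np.concatenate([y_true, y_pred]))
    po = np.mean(y_true == y_pred)
    pe = sum(np.mean(y_true == c) * np.mean(y_pred == c) for c in labels)
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

# Toy usage: two synthetic "affective states" with different signal levels.
rng = np.random.default_rng(0)
sig_a = rng.normal(0.0, 1.0, 500)   # placeholder for, e.g., GSR in state A
sig_b = rng.normal(3.0, 1.0, 500)   # placeholder for state B
X = np.vstack([window_features(sig_a, 50), window_features(sig_b, 50)])
y = np.array([0] * 10 + [1] * 10)
kappa = cohens_kappa(y, knn_predict(X, y, X, k=3))
```

A kappa of 0 means chance-level agreement and 1 means perfect agreement, which is why the paper's mean user-dependent kappa of 0.25 indicates detection modestly but reliably above chance.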
Physiology, Electromyography, Electrocardiography, Biomedical monitoring, Detectors, Heart rate variability, naturalistic emotions, Emotion, AutoTutor, physiological signals, user-independent, user-dependent
Omar AlZoubi, Sidney K. D'Mello, Rafael A. Calvo, "Detecting Naturalistic Expressions of Nonbasic Affect Using Physiological Signals", IEEE Transactions on Affective Computing, vol.3, no. 3, pp. 298-310, July-September 2012, doi:10.1109/T-AFFC.2012.4