Issue No. 3, July-September 2012 (vol. 3)
pp. 260-272
Christos N. Moridis , University of Macedonia, Thessaloniki
Anastasios A. Economides , University of Macedonia, Thessaloniki
Empathetic behavior has been suggested as one effective way for Embodied Conversational Agents (ECAs) to provide feedback to learners' emotions. An open issue is how to effectively integrate parallel and reactive empathy. This study examines the impact of ECAs' emotional facial and tone of voice expressions, combined with empathetic verbal behavior, when displayed as feedback to students' fear, sadness, and happiness in the context of a self-assessment test. Three identical female agents were used for the experiment: 1) an ECA performing parallel empathy combined with neutral emotional expressions, 2) an ECA performing parallel empathy with emotional expressions relevant to the emotional state of the student, and 3) an ECA performing parallel empathy through relevant emotional expressions followed by emotional expressions of reactive empathy intended to alter the student's emotional state. Results indicate that an agent performing parallel empathy with emotional expressions relevant to the student's emotional state may cause that emotion to persist. Moreover, the agent performing parallel and then reactive empathy appeared to be effective in shifting an emotional state of fear to a neutral one.
Context, Humans, Synchronization, Speech, Computers, Emotion recognition, Avatars, user interfaces, Computers and education, intelligent agents, empathy
Christos N. Moridis, Anastasios A. Economides, "Affective Learning: Empathetic Agents with Emotional Facial and Tone of Voice Expressions", IEEE Transactions on Affective Computing, vol.3, no. 3, pp. 260-272, July-September 2012, doi:10.1109/T-AFFC.2012.6