IEEE Transactions on Affective Computing, vol. 2, no. 3, pp. 134-146, July-September 2011
R. Niewiadomski , Telecom ParisTech, Paris, France
S. J. Hyniewska , Telecom ParisTech, Paris, France
C. Pelachaud , Telecom ParisTech, Paris, France
ABSTRACT
Emotional expressions play an important role in interactions between virtual agents and human users. In this paper, we present a new constraint-based approach to the generation of multimodal emotional displays. The displays generated with our method are not limited to the face; they are composed of signals from different modalities that are partially ordered in time. We also describe an evaluation of the main features of our approach, examining the role of multimodality, sequentiality, and constraints in the perception of synthesized emotional states. The results show that our algorithm improves the communication of a large spectrum of emotional states, and that the believability of the agent's animations increases when constraints are applied over the multimodal signals.
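The abstract describes emotional displays built from signals that are partially ordered in time under constraints across modalities. As a rough illustration of what such a pairwise constraint check might look like (this is a minimal sketch, not the authors' implementation; the signal names, relations, and timings below are hypothetical), consider the following Python fragment:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    name: str        # e.g. "smile", "gaze_down", "head_tilt" (illustrative)
    modality: str    # e.g. "face", "gaze", "head"
    start: float     # onset time in seconds
    end: float       # offset time in seconds

def before(a: Signal, b: Signal) -> bool:
    """a ends before b starts."""
    return a.end <= b.start

def overlaps(a: Signal, b: Signal) -> bool:
    """a and b share some span of time."""
    return a.start < b.end and b.start < a.end

def satisfies(display: list[Signal], constraints) -> bool:
    """Check every pairwise temporal constraint against a candidate display."""
    by_name = {s.name: s for s in display}
    return all(rel(by_name[x], by_name[y]) for x, rel, y in constraints)

# Hypothetical display: the gaze aversion must precede the smile,
# and the head movement must overlap the smile.
candidate = [
    Signal("gaze_down", "gaze", 0.0, 0.8),
    Signal("smile", "face", 1.0, 2.5),
    Signal("head_tilt", "head", 1.2, 2.0),
]
constraints = [
    ("gaze_down", before, "smile"),
    ("head_tilt", overlaps, "smile"),
]
print(satisfies(candidate, constraints))  # True: all constraints hold
```

A generator in this style would enumerate or sample candidate timings and keep only displays that satisfy all constraints; the relations shown are a small subset of the interval relations commonly used for such partial temporal orders.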
INDEX TERMS
virtual reality, computer animation, emotion recognition, graphical user interfaces, human-computer interaction, multimodal sequential expression, virtual agents, human users, constraint-based model, multimodal emotional displays, emotional expression, synthesized emotional states, agent animation, multimodal signals, animation, face recognition, hidden Markov models, videos, games, heuristic algorithms, artificial, augmented, and virtual realities
CITATION
R. Niewiadomski, S. J. Hyniewska, C. Pelachaud, "Constraint-Based Model for Synthesis of Multimodal Sequential Expressions of Emotions", IEEE Transactions on Affective Computing, vol. 2, no. 3, pp. 134-146, July-September 2011, doi:10.1109/T-AFFC.2011.5