Issue No. 2, April-June 2013 (vol. 4)
pp. 151-160
S. Mohammad Mavadati , University of Denver, Denver
Mohammad H. Mahoor , University of Denver, Denver
Kevin Bartlett , University of Denver, Denver
Philip Trinh , University of Denver, Denver
Jeffrey F. Cohn , University of Pittsburgh, Pittsburgh and Carnegie Mellon University, Pittsburgh
ABSTRACT
Access to well-labeled recordings of facial expression is critical to progress in automated facial expression recognition. With few exceptions, publicly available databases are limited to posed facial behavior that can differ markedly in conformation, intensity, and timing from what occurs spontaneously. To meet the need for publicly available corpora of well-labeled video, we collected, ground-truthed, and prepared for distribution the Denver Intensity of Spontaneous Facial Action (DISFA) database. Twenty-seven young adults were video recorded by a stereo camera while they viewed video clips intended to elicit spontaneous emotion expression. Each video frame was manually coded for presence, absence, and intensity of facial action units according to the Facial Action Coding System (FACS). Action units are the smallest visibly discriminable changes in facial action; they may occur individually and in combinations to comprise more molar facial expressions. To provide a baseline for use in future research, protocols and benchmarks for automated action unit intensity measurement are reported. Details are given for accessing the database for research in computer vision, machine learning, and affective and behavioral science.
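As an illustration of how benchmarks of the kind described above are often scored, the sketch below computes the intraclass correlation ICC(3,1), a standard measure of agreement between automated estimates and manual frame-level action unit intensity codes. It is a minimal, self-contained example rather than the paper's evaluation code: the 0-5 intensity scale, the icc_3_1 helper name, and the toy arrays are illustrative assumptions.

    import numpy as np

    def icc_3_1(manual, predicted):
        # ICC(3,1): two-way mixed model, single measures, consistency (Shrout & Fleiss).
        # `manual` and `predicted` are 1-D arrays of per-frame AU intensity values
        # (illustrative 0-5 scale); this is a generic sketch, not DISFA-specific code.
        ratings = np.column_stack([manual, predicted]).astype(float)
        n, k = ratings.shape                      # n frames, k raters (here k = 2)
        grand_mean = ratings.mean()
        mean_per_frame = ratings.mean(axis=1)
        mean_per_rater = ratings.mean(axis=0)

        # Two-way ANOVA sum-of-squares decomposition.
        ss_total = ((ratings - grand_mean) ** 2).sum()
        ss_frames = k * ((mean_per_frame - grand_mean) ** 2).sum()
        ss_raters = n * ((mean_per_rater - grand_mean) ** 2).sum()
        ss_error = ss_total - ss_frames - ss_raters

        ms_frames = ss_frames / (n - 1)
        ms_error = ss_error / ((n - 1) * (k - 1))
        return (ms_frames - ms_error) / (ms_frames + (k - 1) * ms_error)

    # Hypothetical usage: manual codes vs. automated estimates for one action unit.
    manual = np.array([0, 0, 1, 2, 3, 3, 2, 1, 0, 0])
    predicted = np.array([0, 1, 1, 2, 2, 3, 2, 1, 1, 0])
    print(f"ICC(3,1) = {icc_3_1(manual, predicted):.3f}")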
INDEX TERMS
Databases, Gold, Encoding, Feature extraction, Face, Face recognition, Pain, video corpus, FACS, action units, intensity, spontaneous facial behavior, facial expression
CITATION
S. Mohammad Mavadati, Mohammad H. Mahoor, Kevin Bartlett, Philip Trinh, and Jeffrey F. Cohn, "DISFA: A Spontaneous Facial Action Intensity Database," IEEE Transactions on Affective Computing, vol. 4, no. 2, pp. 151-160, April-June 2013, doi: 10.1109/T-AFFC.2013.4.