Issue No. 1 - Jan.-March 2012 (vol. 3)
ISSN: 1949-3045
pp: 18-31
A. Yazdani , Multimedia Signal Process. Group, Ecole Polytechnique Fed. de Lausanne (EPFL), Lausanne, Switzerland
Jong-Seok Lee , Sch. of Integrated Technol., Yonsei Univ., Incheon, South Korea
M. Soleymani , Comput. Sci. Dept., Univ. of Geneva, Carouge, Switzerland
C. Muhl , Human Media Interaction Group, Univ. of Twente, Enschede, Netherlands
S. Koelstra , Sch. of Electron. Eng. & Comput. Sci., Queen Mary Univ. of London, London, UK
T. Ebrahimi , Multimedia Signal Process. Group, Ecole Polytechnique Fed. de Lausanne (EPFL), Lausanne, Switzerland
T. Pun , Comput. Sci. Dept., Univ. of Geneva, Carouge, Switzerland
A. Nijholt , Human Media Interaction Group, Univ. of Twente, Enschede, Netherlands
I. Patras , Sch. of Electron. Eng. & Comput. Sci., Queen Mary Univ. of London, London, UK
ABSTRACT
We present a multimodal data set for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute excerpts of music videos. Participants rated each video in terms of arousal, valence, like/dislike, dominance, and familiarity. For 22 of the 32 participants, frontal face video was also recorded. A novel method for stimulus selection is proposed, using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool. An extensive analysis of the participants' ratings during the experiment is presented, and correlates between EEG signal frequencies and the ratings are investigated. Methods and results are presented for single-trial classification of arousal, valence, and like/dislike ratings using the EEG, peripheral physiological signal, and multimedia content analysis modalities. Finally, decision fusion of the classification results from the different modalities is performed. The data set is made publicly available, and we encourage other researchers to use it for testing their own affective state estimation methods.
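The sketch below illustrates, in rough outline, the kind of single-trial pipeline the abstract describes: spectral band-power features extracted from EEG, a per-modality classifier, and decision-level fusion of the predicted class probabilities. It is a minimal sketch, not the authors' code; the sampling rate, band definitions, array shapes, and the names band_power_features and fuse_decisions are all illustrative assumptions, and a Gaussian naive Bayes classifier stands in for whatever classifier one prefers.

```python
# Minimal sketch (not the DEAP authors' implementation) of single-trial
# affect classification with decision-level fusion across modalities.
import numpy as np
from scipy.signal import welch
from sklearn.naive_bayes import GaussianNB

FS = 128  # assumed sampling rate in Hz (DEAP's preprocessed release uses 128 Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30), "gamma": (30, 45)}

def band_power_features(eeg_trial):
    """EEG trial of shape (channels, samples) -> log band-power feature vector."""
    freqs, psd = welch(eeg_trial, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))  # mean power per channel/band
    return np.log(np.concatenate(feats))  # log power is a common normalization

def fuse_decisions(prob_eeg, prob_periph, w=0.5):
    """Late (decision-level) fusion: weighted average of class probabilities."""
    return w * prob_eeg + (1 - w) * prob_periph

# Hypothetical usage: X_eeg has shape (trials, channels, samples), X_periph
# holds peripheral-signal features, y holds binary labels (e.g., high/low arousal).
# feats = np.stack([band_power_features(t) for t in X_eeg])
# clf_eeg = GaussianNB().fit(feats[train], y[train])
# clf_per = GaussianNB().fit(X_periph[train], y[train])
# fused = fuse_decisions(clf_eeg.predict_proba(feats[test]),
#                        clf_per.predict_proba(X_periph[test]))
# y_pred = fused.argmax(axis=1)
```

Late fusion of this kind keeps each modality's classifier independent, so a modality can be added or dropped without retraining the others; the fusion weight w would typically be tuned on held-out data.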
INDEX TERMS
affective computing, emotion recognition, emotion classification, EEG, electroencephalography, physiological signals, peripheral physiological signals, multimodal data set, DEAP, human affective states, arousal, dominance, familiarity, music videos, frontal face video, stimulus selection, video highlight detection, online assessment tool, single-trial classification, multimedia content analysis, multimedia computing, multimedia communication, decision fusion, pattern classification, signal processing, video signal processing, information retrieval, neurophysiology, state estimation, Web sites, databases
CITATION
A. Yazdani, Jong-Seok Lee, M. Soleymani, C. Muhl, S. Koelstra, T. Ebrahimi, T. Pun, A. Nijholt, I. Patras, "DEAP: A Database for Emotion Analysis Using Physiological Signals", IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 18-31, Jan.-March 2012, doi:10.1109/T-AFFC.2011.15