Vol. 3, No. 1, Jan.-March 2012
ISSN: 1949-3045
pp. 42-55
M. Soleymani , Comput. Sci. Dept., Univ. of Geneva, Carouge, Switzerland
J. Lichtenauer , Dept. of Comput., Imperial Coll. London, London, UK
T. Pun , Comput. Sci. Dept., Univ. of Geneva, Carouge, Switzerland
M. Pantic , Dept. of Comput., Imperial Coll. London, London, UK
MAHNOB-HCI is a multimodal database recorded in response to affective stimuli with the goal of supporting emotion recognition and implicit tagging research. A multimodal setup was arranged for the synchronized recording of face videos, audio signals, eye gaze data, and peripheral/central nervous system physiological signals. Twenty-seven participants of both genders and from different cultural backgrounds took part in two experiments. In the first experiment, they watched 20 emotional videos and self-reported their felt emotions on arousal, valence, dominance, and predictability scales, as well as with emotional keywords. In the second experiment, short videos and images were shown first without any tag and then with a correct or incorrect tag, and participants reported their agreement or disagreement with the displayed tags. The recorded videos and bodily responses were segmented and stored in a database, which is made available to the academic community via a web-based system. The collected data were analyzed, and single-modality and modality-fusion results are reported for both the emotion recognition and implicit tagging experiments. These results show the potential uses of the recorded modalities and the significance of the emotion elicitation protocol.
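The abstract mentions modality-fusion results without specifying the fusion method. As an illustrative sketch only (not the paper's implementation), the following shows one common approach, decision-level fusion, in which each modality produces a label with a confidence and the labels are combined by a confidence-weighted vote. The feature names and thresholds are hypothetical placeholders.

```python
# Illustrative sketch of decision-level modality fusion for a binary
# affect label (e.g. low/high valence). This is NOT the method used in
# the MAHNOB-HCI paper; it only demonstrates the general idea.

def classify(feature, threshold):
    """Trivial per-modality classifier: threshold a 1-D feature.

    Returns (label, confidence), where confidence is the distance
    from the decision boundary.
    """
    score = feature - threshold
    label = 1 if score >= 0 else 0
    return label, abs(score)

def fuse(decisions):
    """Confidence-weighted vote over (label, confidence) pairs."""
    weight = {0: 0.0, 1: 0.0}
    for label, conf in decisions:
        weight[label] += conf
    return 1 if weight[1] >= weight[0] else 0

# Hypothetical normalized per-trial features for two modalities
# (names are assumptions for illustration only).
eeg_feature, gaze_feature = 0.8, 0.3
decision = fuse([classify(eeg_feature, 0.5),
                 classify(gaze_feature, 0.5)])
print(decision)  # the EEG vote (label 1, conf 0.3) outweighs gaze (label 0, conf 0.2)
```

Feature-level fusion (concatenating features from all modalities before classification) is the usual alternative; decision-level fusion has the practical advantage of tolerating a missing modality by simply dropping its vote.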
visual databases, emotion recognition, affect recognition, affective computing, implicit tagging, MAHNOB-HCI, multimodal database, affective stimuli, emotion elicitation protocol, face video recording, audio signal recording, eye gaze, EEG, facial expressions, peripheral/central nervous system physiological signals, arousal, valence, dominance, predictability, emotional keywords, pattern classification, web-based system

J. Lichtenauer, M. Soleymani, T. Pun and M. Pantic, "A Multimodal Database for Affect Recognition and Implicit Tagging," in IEEE Transactions on Affective Computing, vol. 3, no. 1, pp. 42-55, 2012.