2006 22nd International Conference on Data Engineering Workshops (ICDEW'06)
Apr. 3, 2006 to Apr. 7, 2006
I. Pitas , Aristotle University of Thessaloniki, Greece
I. Kotsia , Aristotle University of Thessaloniki, Greece
O. Martin , Université catholique de Louvain, Belgium
B. Macq , Université catholique de Louvain, Belgium
This paper presents an audio-visual emotion database that can serve as a reference for testing and evaluating video, audio, or joint audio-visual emotion recognition algorithms. It may also be used to evaluate algorithms for other multimodal signal processing tasks, such as multimodal person identification or audio-visual speech recognition. The paper discusses the difficulties involved in constructing such a multimodal emotion database and the protocols adopted to address them. It describes the experimental setup and the segmentation and selection of the video samples, ensuring that the database contains only video sequences carrying the desired affective information. The database is made publicly available for scientific research purposes.
I. Pitas, I. Kotsia, O. Martin, B. Macq, "The eNTERFACE'05 Audio-Visual Emotion Database," 2006 22nd International Conference on Data Engineering Workshops (ICDEW'06), p. 8, 2006, doi:10.1109/ICDEW.2006.145