Issue No. 3, July-September 2012 (vol. 3)
ISSN: 1949-3045
pp: 379-385
Philippe Guillotel, Technicolor R&D France, Cesson-Sevigne
Julien Fleureau, Technicolor R&D France, Cesson-Sevigne
Quan Huynh-Thu, Technicolor R&D France, Cesson-Sevigne
ABSTRACT
In this paper, we propose a methodology to build a real-time affect detector dedicated to video viewing and entertainment applications. This detector combines the acquisition of traditional physiological signals, namely galvanic skin response, heart rate, and electromyogram, with supervised classification techniques based on Gaussian processes. It aims to detect the emotional impact of a video clip in a new way: it first identifies emotional events in the affective stream (a fast increase of the subject's excitation) and then assigns the associated binary valence (positive or negative) to each detected event. The study was conducted to be as close as possible to realistic conditions, in particular by minimizing the use of active calibrations and considering on-the-fly detection. Furthermore, the influence of each physiological modality is evaluated through three key scenarios (single-user, multi-user, and extended multi-user) that may be relevant for consumer applications. A complete description of the experimental protocol and processing steps is given. The performance of the detector is evaluated on manually labeled sequences, and its robustness is discussed for the different single-user and multi-user contexts.
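To illustrate the two-stage approach summarized above, the following is a minimal sketch (not the authors' implementation): stage one flags emotional events from a fast rise in an arousal-related signal, and stage two classifies each event's binary valence with a Gaussian-process classifier. The feature choices, window size, threshold, and the scikit-learn GaussianProcessClassifier are assumptions introduced here for illustration.

    # Hedged sketch of an event-then-valence affect detector (assumptions noted above).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    def detect_events(gsr, window=64, rise_threshold=0.5):
        """Flag sample indices where galvanic skin response rises sharply
        (a stand-in for the 'fast increase of the subject's excitation')."""
        slope = np.convolve(np.diff(gsr, prepend=gsr[0]),
                            np.ones(window) / window, mode="same")
        return np.flatnonzero(slope > rise_threshold)

    # Hypothetical training data: one feature vector per labeled event
    # (e.g., GSR amplitude, mean heart rate, EMG energy) and a binary valence label.
    X_train = np.random.rand(40, 3)          # placeholder event features
    y_train = np.random.randint(0, 2, 40)    # 0 = negative, 1 = positive valence

    # Gaussian-process classifier with an RBF kernel gives the binary valence
    # of each detected event, along with a class probability.
    clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
    clf.fit(X_train, y_train)

    X_new = np.random.rand(5, 3)             # features of newly detected events
    print(clf.predict(X_new))
    print(clf.predict_proba(X_new))

The split into an event detector followed by a valence classifier mirrors the pipeline described in the abstract; the actual features and models used in the paper may differ.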
INDEX TERMS
Feature extraction, Detectors, Physiology, Kernel, Electromyography, Context, Databases, Gaussian processes, Affective computing, emotion detection, physiological signals, machine learning
CITATION
Philippe Guillotel, Julien Fleureau, Quan Huynh-Thu, "Physiological-Based Affect Event Detector for Entertainment Video Applications", IEEE Transactions on Affective Computing, vol. 3, no. 3, pp. 379-385, July-September 2012, doi:10.1109/T-AFFC.2012.2