Physiological-Based Affect Event Detector for Entertainment Video Applications
July-September 2012 (vol. 3 no. 3)
pp. 379-385
Julien Fleureau, Technicolor R&D France, Cesson-Sevigne
Philippe Guillotel, Technicolor R&D France, Cesson-Sevigne
Quan Huynh-Thu, Technicolor R&D France, Cesson-Sevigne
In this paper, we propose a methodology to build a real-time affect detector dedicated to video viewing and entertainment applications. This detector combines the acquisition of traditional physiological signals, namely, galvanic skin response, heart rate, and electromyogram, with supervised classification by means of Gaussian processes. It aims at detecting the emotional impact of a video clip in a new way, first by identifying emotional events in the affective stream (a fast increase in the subject's excitation) and then by assigning the associated binary valence (positive or negative) to each detected event. The study was conducted to be as close as possible to realistic conditions, in particular by minimizing the use of active calibration and by considering on-the-fly detection. Furthermore, the influence of each physiological modality is evaluated through three different key scenarios (mono-user, multi-user, and extended multi-user) that may be relevant for consumer applications. A complete description of the experimental protocol and processing steps is given. The performance of the detector is evaluated on manually labeled sequences, and its robustness is discussed considering the different single-user and multi-user contexts.
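For illustration, the sketch below shows how such a binary valence classifier might be assembled with a Gaussian process, here using scikit-learn's GaussianProcessClassifier with an RBF kernel on per-event physiological features. The feature names, kernel choice, and synthetic data are assumptions for the example, not the authors' implementation.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    # Hypothetical per-event features derived from the three modalities:
    # [GSR slope, mean heart rate, EMG energy] over the event window.
    # These names and the synthetic data are assumptions for illustration.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 3))        # one row per detected affective event
    y = rng.integers(0, 2, size=40)     # binary valence: 0 = negative, 1 = positive

    # Gaussian process classifier; the RBF kernel is an assumed choice.
    clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
    clf.fit(X, y)

    # Probability of each valence class for a new event's features.
    new_event = np.array([[0.8, -0.2, 1.1]])
    print(clf.predict_proba(new_event))  # [[P(negative), P(positive)]]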
Index Terms:
Feature extraction, Detectors, Physiology, Kernel, Electromyography, Context, Databases, Gaussian processes, Affective computing, emotion detection, physiological signals, machine learning
Citation:
Julien Fleureau, Philippe Guillotel, Quan Huynh-Thu, "Physiological-Based Affect Event Detector for Entertainment Video Applications," IEEE Transactions on Affective Computing, vol. 3, no. 3, pp. 379-385, July-Sept. 2012, doi:10.1109/T-AFFC.2012.2