Issue No. 3, July-September 2012 (vol. 3), pp. 379-385
Julien Fleureau, Technicolor R&D France, Cesson-Sévigné
Philippe Guillotel, Technicolor R&D France, Cesson-Sévigné
Quan Huynh-Thu, Technicolor R&D France, Cesson-Sévigné
ABSTRACT
In this paper, we propose a methodology to build a real-time affect detector dedicated to video viewing and entertainment applications. This detector combines the acquisition of traditional physiological signals, namely galvanic skin response, heart rate, and electromyogram, with supervised classification based on Gaussian processes. It detects the emotional impact of a video clip in a new way, first by identifying emotional events in the affective stream (a fast increase of the subject's excitation) and then by assigning each detected event a binary valence (positive or negative). The study was conducted to be as close as possible to realistic conditions, in particular by minimizing the use of active calibrations and by considering on-the-fly detection. Furthermore, the influence of each physiological modality is evaluated through three key scenarios (mono-user, multi-user, and extended multi-user) that may be relevant for consumer applications. A complete description of the experimental protocol and processing steps is given. The performance of the detector is evaluated on manually labeled sequences, and its robustness is discussed for the different single- and multi-user contexts.
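The two-stage pipeline summarized above (event detection on the arousal-related signals, then binary valence classification of each event with a Gaussian process) can be illustrated with a minimal sketch. This is not the authors' implementation: the helper names (detect_events, extract_features), the slope threshold, the window length, the sampling rate, and the placeholder signals and labels are all assumptions made for illustration, with scikit-learn's Gaussian process classifier standing in for the classification step.

```python
# Minimal sketch (not the authors' code): flag fast rises in the GSR trace as
# candidate emotional events, then classify each event's valence with a
# Gaussian process trained on manually labeled examples.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF


def detect_events(gsr, fs, slope_thresh=0.05, min_gap_s=5.0):
    """Return indices where the smoothed GSR rises faster than a threshold,
    a crude proxy for a 'fast increase of the subject's excitation'."""
    smoothed = np.convolve(gsr, np.ones(int(fs)) / fs, mode="same")
    slope = np.gradient(smoothed) * fs            # derivative in units per second
    candidates = np.where(slope > slope_thresh)[0]
    events, last = [], -np.inf
    for idx in candidates:                        # enforce a refractory gap
        if (idx - last) / fs >= min_gap_s:
            events.append(idx)
            last = idx
    return events


def extract_features(gsr, hr, emg, idx, fs, win_s=4.0):
    """Toy feature vector around an event: mean and std of each modality."""
    a, b = idx, idx + int(win_s * fs)
    return np.hstack([[x[a:b].mean(), x[a:b].std()] for x in (gsr, hr, emg)])


# Training: features of manually labeled events -> binary valence (0/1).
# In practice X_train, y_train would come from the labeled sequences.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 6))                # placeholder features
y_train = rng.integers(0, 2, size=40)             # placeholder valence labels

clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
clf.fit(X_train, y_train)

# On-the-fly use: for each detected event, predict positive/negative valence.
fs = 32.0                                          # assumed sampling rate (Hz)
gsr = rng.normal(size=int(60 * fs)).cumsum() / fs  # placeholder signals
hr = 70 + rng.normal(size=gsr.size)
emg = rng.normal(size=gsr.size)
for idx in detect_events(gsr, fs):
    feats = extract_features(gsr, hr, emg, idx, fs).reshape(1, -1)
    label = "positive" if clf.predict(feats)[0] else "negative"
    print(f"event at {idx / fs:.1f} s -> {label}")
```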
INDEX TERMS
Feature extraction, Detectors, Physiology, Kernel, Electromyography, Context, Databases, Gaussian processes, Affective computing, emotion detection, physiological signals, machine learning
CITATION
Julien Fleureau, Philippe Guillotel, Quan Huynh-Thu, "Physiological-Based Affect Event Detector for Entertainment Video Applications," IEEE Transactions on Affective Computing, vol. 3, no. 3, pp. 379-385, July-September 2012, doi:10.1109/T-AFFC.2012.2