Bibliographic References
Emotion Recognition from Brain Signals Using Hybrid Adaptive Filtering and Higher Order Crossings Analysis
July-December 2010 (vol. 1 no. 2)
pp. 81-97
Panagiotis C. Petrantonakis, Aristotle University of Thessaloniki, Thessaloniki
Leontios J. Hadjileontiadis, Aristotle University of Thessaloniki, Thessaloniki
This paper presents a new feature extraction method, namely HAF-HOC, for a user-independent emotion recognition system based on electroencephalograms (EEGs). A novel filtering procedure, Hybrid Adaptive Filtering (HAF), was developed for efficient extraction of emotion-related EEG characteristics by applying Genetic Algorithms to the Empirical Mode Decomposition-based representation of the EEG signals. In addition, Higher Order Crossings (HOC) analysis was employed to extract features from the HAF-filtered signals. The introduced HAF-HOC scheme incorporates four different classification methods to accomplish robust emotion recognition performance. Through a series of facial-expression image projections, serving as a Mirror Neuron System-based emotion elicitation process, EEG data related to six basic emotions (happiness, surprise, anger, fear, disgust, and sadness) were acquired from 16 healthy subjects using three EEG channels. Experimental results from the application of HAF-HOC to the collected EEG data, and comparison with previous approaches, show that the HAF-HOC scheme clearly surpasses earlier methods for the discrimination of up to six distinct emotions from brain signals, yielding classification rates of up to 85.17 percent. The promising performance of HAF-HOC underscores the value of EEG signals in the endeavor of realizing more pragmatic, affective human-machine interfaces.
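The HOC feature extraction the abstract describes rests on a simple idea from Kedem [48]: repeatedly apply a high-pass difference filter to a zero-mean time series and count the zero-crossings of each filtered version; the resulting sequence of counts characterizes the signal's oscillatory structure. The sketch below is an illustrative simplification of that idea (the function name `hoc_features` and its parameters are this example's own, not from the paper), using the backward difference operator as the filter:

```python
import numpy as np

def hoc_features(signal, order=10):
    """Zero-crossing counts of a signal under repeated differencing
    (the 'simple' HOC sequence in the sense of Kedem); sketch only."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # center so sign changes are well defined
    feats = []
    for _ in range(order):
        signs = np.sign(x)
        signs[signs == 0] = 1             # treat exact zeros as positive
        # a zero-crossing is a sign change between consecutive samples
        feats.append(int(np.count_nonzero(np.diff(signs))))
        x = np.diff(x)                    # backward difference acts as a high-pass filter
    return feats
```

In the paper's scheme such counts would be computed on the HAF-filtered EEG and fed to the classifiers; here they simply illustrate how a crossing-count feature vector of length `order` is obtained from a single channel.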

[1] B. Reeves and C. Nass, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places. CSLI Publications/Cambridge Univ. Press, 1996.
[2] T. Bradberry and J. Greaves, Emotional Intelligence 2.0. Publishers Group West, 2009.
[3] R.W. Picard, Affective Computing. MIT Press, 1997.
[4] I. Cohen, A. Garg, and T.S. Huang, “Emotion Recognition from Facial Expressions Using Multilevel HMM,” Proc. Neural Information Processing Systems Workshop Affective Computing, 2000.
[5] F. Bourel, C.C. Chibelushi, and A.A. Low, “Robust Facial Expression Recognition Using a State-Based Model of Spatially-Localized Facial Dynamics,” Proc. Fifth IEEE Int'l Conf. Automatic Face and Gesture Recognition, pp. 106-111, 2002.
[6] J.J. Lien, T. Kanade, J.F. Cohn, and C. Li, “Automated Facial Expression Recognition Based on FACS Action Units,” Proc. Third IEEE Conf. Automatic Face and Gesture Recognition, pp. 390-395, 1998.
[7] B. Schuller, S. Reiter, R. Mueller, M. Al-Hames, and G. Rigoll, “Speaker Independent Speech Emotion Recognition by Ensemble Classification,” Proc. Sixth Int'l Conf. Multimedia and Expo, pp. 864-867, 2005.
[8] C. Busso, Z. Deng, S. Yildirim, M. Bulut, C.M. Lee, A. Kazemzadeh, S. Lee, U. Neumann, and S. Narayanan, “Analysis of Emotion Recognition Using Facial Expressions, Speech and Multimodal Information,” Proc. Sixth Int'l Conf. Multimodal Interfaces, pp. 205-211, 2004.
[9] R.W. Picard, E. Vyzas, and J. Healey, “Toward Machine Emotional Intelligence: Analysis of Affective Physiological State,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 23, no. 10, pp. 1175-1191, Oct. 2001.
[10] F. Nasoz, C.L. Lisetti, K. Alvarez, and N. Finkelstein, “Emotion Recognition from Physiological Signals for User Modeling of Affect,” Proc. Ninth Int'l Conf. User Model, http://www.eurecom.fr/utilpublidownload.en.htm?id=1806, June 2003.
[11] C.L. Lisetti and F. Nasoz, “Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals,” EURASIP J. Applied Signal Processing, vol. 11, pp. 1672-1687, 2004.
[12] D.N. McIntosh, A. Reichmann-Decker, P. Winkielman, and J.L. Wilbarger, “When the Social Mirror Breaks: Deficits in Automatic, But Not Voluntary, Mimicry of Emotional Facial Expressions in Autism,” Developmental Science, vol. 9, pp. 295-302, 2006.
[13] D.E. Goldberg, Genetic Algorithms in Search, Optimization & Machine Learning. Addison-Wesley, 1989.
[14] N. Huang, Z. Shen, S. Long, M. Wu, H.H. Shih, Q. Zheng, N.C. Yen, C. Tung, and H. Liu, “The Empirical Mode Decomposition and Hilbert Spectrum for Nonlinear and Nonstationary Time Series Analysis,” Proc. Royal Soc. London A, vol. 454, pp. 903-995, 1998.
[15] P.C. Petrantonakis and L.J. Hadjileontiadis, “Emotion Recognition from EEG Using Higher Order Crossings,” IEEE Trans. Information Technology in Biomedicine, vol. 14, no. 2, pp. 186-197, Mar. 2010.
[16] G. Rizzolatti and L. Craighero, “The Mirror-Neuron System,” Ann. Rev. Neuroscience, vol. 27, pp. 169-192, 2004.
[17] W.B. Cannon, “The James-Lange Theory of Emotions: A Critical Examination and an Alternative Theory,” Am. J. Psychology, vol. 39, pp. 106-124, 1927.
[18] C. Darwin, The Expression of the Emotions in Man and Animals. Philosophical Library (Original work published in 1872), 1955.
[19] W. James, The Principles of Psychology. Holt, Rinehart and Winston, 1890.
[20] P. Ekman, “Expression and the Nature of Emotion,” Approaches to Emotion, K. Scherer and P. Ekman, eds., Erlbaum, 1984.
[21] P. Ekman, R.W. Levenson, and W.V. Friesen, “Emotions Differ in Autonomic Nervous System Activity,” Science, vol. 221, pp. 1208-1210, 1983.
[22] R.J. Davidson, G.E. Schwartz, C. Saron, J. Bennett, and D.J. Goleman, “Frontal versus Parietal EEG Asymmetry During Positive and Negative Affect,” Psychophysiology, vol. 16, pp. 202-203, 1979.
[23] R.J. Davidson, P. Ekman, C.D. Saron, J.A. Senulis, and W.V. Friesen, “Approach-Withdrawal and Cerebral Asymmetry: Emotional Expression and Brain Physiology,” J. Personality and Social Psychology, vol. 58, pp. 330-341, 1990.
[24] H. Jasper, “The Ten-Twenty Electrode System of the International Federation,” Electroencephalography and Clinical Neurophysiology, vol. 10, pp. 371-375, 1958.
[25] W.J.H. Nauta, “The Problem of the Frontal Lobe: A Reinterpretation,” J. Psychiatric Research, vol. 8, pp. 167-187, 1971.
[26] R.J. Davidson, “What Does the Prefrontal Cortex ‘Do’ in Affect: Perspectives on Frontal EEG Asymmetry Research,” Biological Psychology, vol. 67, pp. 219-233, 2004.
[27] D. Hagemann, E. Naumann, A. Lurken, G. Becker, S. Maier, and D. Bartussek, “EEG Asymmetry, Dispositional Mood and Personality,” Personality and Individual Differences, vol. 27, pp. 541-568, 1999.
[28] J.A. Coan and J.J.B. Allen, “Frontal EEG Asymmetry as a Moderator and Mediator of Emotion,” Biological Psychology, vol. 67, pp. 7-49, 2004.
[29] L.I. Aftanas, A.A. Varlamov, S.V. Pavlov, V.P. Makhnev, and N.V. Reva, “Affective Picture Processing: Event-Related Synchronization within Individually Defined Human Theta Band Is Modulated by Valence Dimension,” Neuroscience Letters, vol. 303, pp. 115-118, 2001.
[30] G. Pfurtscheller and F.H.L. da Silva, “Event-Related EEG/MEG Synchronization and Desynchronization: Basic Principles,” Clinical Neurophysiology, vol. 110, pp. 1842-1857, 1999.
[31] E. Oztop, M. Kawato, and M. Arbib, “Mirror Neurons and Imitation: A Computationally Guided Review,” Neural Networks, vol. 19, pp. 254-271, 2006.
[32] T.W. Lee, R.J. Dolan, and H.D. Critchley, “Controlling Emotional Expression: Behavioral and Neural Correlates of Nonimitative Emotional Responses,” Cerebral Cortex, vol. 18, pp. 104-113, 2008.
[33] T.W. Lee, O. Josephs, R.J. Dolan, and H.D. Critchley, “Imitating Expressions: Emotion-Specific Neural Substrates in Facial Mimicry,” Social Cognitive and Affective Neuroscience, vol. 1, pp. 122-135, 2006.
[34] L. Carr, M. Iacoboni, M.C. Dubeau, J.C. Mazziotta, and G.L. Lenzi, “Neural Mechanisms of Empathy in Humans: A Relay from Neural Systems for Imitation to Limbic Areas,” Proc. Nat'l Academy of Sciences, vol. 100, pp. 5497-5502, 2003.
[35] B. Wicker, C. Keysers, J. Plailly, J.P. Royet, V. Gallese, and G. Rizzolatti, “Both of Us Disgusted in My Insula: The Common Neural Basis of Seeing and Feeling Disgust,” Neuron, vol. 40, pp. 655-664, 2003.
[36] A. Choppin, “EEG-Based Human Interface for Disabled Individuals: Emotion Expression with Neural Networks,” master's thesis, Tokyo Inst. of Technology, 2000.
[37] K. Takahashi, “Remarks on Emotion Recognition from Bio-Potential Signals,” Proc. Second Int'l Conf. Autonomous Robots and Agents, pp. 186-191, 2004.
[38] G. Chanel, J. Kronegg, D. Grandjean, and T. Pun, “Emotion Assessment: Arousal Evaluation Using EEG and Peripheral Physiological Signals,” technical report, Univ. of Geneva, 2005.
[39] D.O. Bos, “EEG-Based Emotion Recognition: The Influence of Visual and Auditory Stimuli,” verslagen/capita-selectaCS-Oude_Bos-Danny.pdf, 2006.
[40] Z. Khalili and M. Moradi, “Emotion Detection Using Brain and Peripheral Signals,” Proc. Biomedical Eng. Conf., pp. 1-4, 2008.
[41] R. Horlings, D. Datcu, and L.J.M. Rothkrantz, “Emotion Recognition Using Brain Activity,” Proc. Int'l Conf. Computer Systems and Technologies, pp. 1-6, 2008.
[42] M. Murugappan, M. Rizon, R. Nagarajan, S. Yaacob, I. Zunaidi, and D. Hazry, “Lifting Scheme for Human Emotion Recognition Using EEG,” Proc. Int'l Symp. Information Technology, pp. 1-7, 2008.
[43] K.G. Srinivasa, K.R. Venugopal, and L.M. Patnaik, “Feature Extraction Using Fuzzy C-Means Clustering for Data Mining Systems,” Int'l J. Computer Science and Network Security, vol. 6, pp. 230-236, 2006.
[44] J.J. De Gruijter and A.B. McBratney, “A Modified Fuzzy K Means for Predictive Classification,” Classification and Related Methods of Data Analysis, H.H. Bock, ed., pp. 97-104, Elsevier Science, 1988.
[45] I. Daubechies, “Orthonormal Bases of Compactly Supported Wavelets,” Comm. Pure and Applied Math., vol. 41, pp. 909-996, 1988.
[46] A. Heraz and C. Frasson, “Predicting the Three Major Dimensions of the Learner's Emotions from Brainwaves,” Int'l J. Computer Science, vol. 2, no. 3, pp. 187-193, 2008.
[47] K. Schaaff and T. Schultz, “Towards an EEG-Based Emotion Recognizer for Humanoid Robots,” Proc. 18th IEEE Int'l Symp. Robot and Human Interactive Comm., pp. 792-796, 2009.
[48] B. Kedem, Time Series Analysis by Higher Order Crossings. IEEE Press, 1994.
[49] P.J. Lang, M.M. Bradley, and B.N. Cuthbert, “International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual,” Technical Report A-8, Univ. of Florida, 2008.
[50] M.M. Bradley and P.J. Lang, “The International Affective Digitized Sounds (2nd Edition; IADS-2): Affective Ratings of Sounds and Instruction Manual,” Technical Report B-3, Univ. of Florida, 2007.
[51] P. Ekman et al., “Universals and Cultural Differences in the Judgments of Facial Expressions of Emotion,” J. Personality and Social Psychology, vol. 53, no. 4, pp. 712-717, 1987.
[52] S. D'Mello, T. Jackson, S. Craig, B. Morgan, P. Chipman, H. White, N. Person, B. Kort, R. el Kaliouby, R.W. Picard, and A. Graesser, “AutoTutor Detects and Responds to Learners' Affective and Cognitive States,” Proc. Workshop Emotional and Cognitive Issues at the Int'l Conf. Intelligent Tutoring Systems, pp. 31-43, 2008.
[53] P. Ekman and W.V. Friesen, “Pictures of Facial Affect,” Human Interaction Laboratory, Univ. of California Medical Center, 1976.
[54] B. Graimann, J.E. Huggins, S.P. Levine, and G. Pfurtscheller, “Visualization of Significant ERD/ERS Patterns in Multichannel EEG and ECoG Data,” Clinical Neurophysiology, vol. 113, pp. 43-47, 2002.
[55] K. Coburn and M. Moreno, “Facts and Artifacts in Brain Electrical Activity Mapping,” Brain Topography, vol. 1, pp. 37-45, 1988.
[56] D.O. Olguin, “Adaptive Digital Filtering Algorithms for the Elimination of Power Line Interference in Electroencephalographic Signals,” master's thesis, Inst. Tecnologico y de Estudios Superiores de Monterrey, 2005.
[57] M. Fatourechi, A. Bashashati, R.K. Ward, and G.E. Birch, “EMG and EOG Artifacts in Brain Computer Interface Systems: A Survey,” Clinical Neurophysiology, vol. 118, pp. 480-494, 2007.
[58] P. Flandrin, G. Rilling, and P. Goncalves, “Empirical Mode Decomposition as a Filter Bank,” IEEE Signal Processing Letters, vol. 11, no. 2, pp. 112-114, Feb. 2004.
[59] P.C. Petrantonakis and L.J. Hadjileontiadis, “EEG-Based Emotion Recognition Using Hybrid Filtering and Higher Order Crossings,” Proc. Int'l Conf. Affective Computing and Intelligent Interaction, pp. 147-152, 2009.
[60] M. Katz, “Fractals and the Analysis of Waveforms,” Computers in Biology and Medicine, vol. 18, pp. 145-156, 1988.
[61] A. Accardo, M. Affinito, M. Carrozzi, and F. Bouquet, “Use of Fractal Dimension for the Analysis of Electroencephalographic Time Series,” Biological Cybernetics, vol. 77, pp. 339-350, 1997.
[62] A. Petrosian, “Kolmogorov Complexity of Finite Sequences and Recognition of Different Preictal EEG Patterns,” Proc. IEEE Symp. Computer-Based Medical Systems, pp. 212-217, 1995.
[63] T. Higuchi, “Approach to an Irregular Time Series on the Basis of the Fractal Theory,” Physica D, vol. 31, pp. 277-283, 1988.
[64] A. Papoulis, Probability, Random Variables, and Stochastic Processes, third ed. McGraw-Hill, 1991.
[65] W.J. Krzanowski, Principles of Multivariate Analysis. Oxford Univ. Press, 1988.
[66] T. Mitchell, Machine Learning. McGraw-Hill, 1997.
[67] P.C. Mahalanobis, “On the Generalized Distance in Statistics,” Proc. Nat'l Inst. of Science of India, vol. 2, pp. 49-55, 1936.
[68] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods. Cambridge Univ. Press, 2000.
[69] R. Duda, P. Hart, and D. Stork, Pattern Classification. John Wiley & Sons, 2001.

Index Terms:
EEG, emotion recognition, EMD, genetic algorithms, higher order crossings analysis, hybrid adaptive filtering, mirror neuron system.
Panagiotis C. Petrantonakis, Leontios J. Hadjileontiadis, "Emotion Recognition from Brain Signals Using Hybrid Adaptive Filtering and Higher Order Crossings Analysis," IEEE Transactions on Affective Computing, vol. 1, no. 2, pp. 81-97, July-Dec. 2010, doi:10.1109/T-AFFC.2010.7