Issue No. 2 - April-June 2013 (vol. 4)
pp. 161-172
Stelios K. Hadjidimitriou , Aristotle University of Thessaloniki, Thessaloniki
Leontios J. Hadjileontiadis , Aristotle University of Thessaloniki, Thessaloniki
ABSTRACT
A time-windowing feature extraction approach based on time-frequency (TF) analysis is adopted here to investigate the time course of the discrimination between musical appraisal electroencephalogram (EEG) responses, under the parameter of familiarity. An EEG data set, formed by the responses of nine subjects during music listening, along with self-reported ratings of liking and familiarity, is used. Features are extracted from the beta (13-30 Hz) and gamma (30-49 Hz) EEG bands in time windows of various lengths, by employing three TF distributions (spectrogram, Hilbert-Huang spectrum, and Zhao-Atlas-Marks transform). Subsequently, two classifiers ($k$-NN and SVM) are used to classify feature vectors into two categories, i.e., "like" and "dislike," under three cases of familiarity, i.e., regardless of familiarity (LD), familiar music (LDF), and unfamiliar music (LDUF). Key findings show that the best classification accuracy (CA) is higher and is achieved earlier in the LDF case ($91.02 \pm 1.45\%$; 7.5-10.5 s) than in the LDUF case ($87.10 \pm 1.84\%$; 10-15 s). Additionally, the best CAs in the LDF and LDUF cases are higher than in the general LD case ($85.28 \pm 0.77\%$). These results, along with their neurophysiological correlates, are further discussed in the context of the existing literature on the time course of music-induced affective responses and the role of familiarity.
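The pipeline the abstract describes — band-limited spectral power extracted over short time windows, then fed to a nearest-neighbor classifier — can be illustrated with a minimal sketch. This is not the authors' implementation: the 128 Hz sampling rate, the non-overlapping 1 s windows, the log-power feature, and the synthetic "like"/"dislike" signals below are all illustrative assumptions; only the beta (13-30 Hz) and gamma (30-49 Hz) bands and the $k$-NN step come from the paper's description.

```python
import numpy as np

FS = 128  # assumed EEG sampling rate in Hz (illustrative, not from the paper)


def band_power_features(signal, fs=FS, win_sec=1.0, bands=((13, 30), (30, 49))):
    """Mean log-power in the beta and gamma bands per non-overlapping
    window -- a crude short-time (spectrogram-like) TF feature vector."""
    n = int(win_sec * fs)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        seg = signal[start:start + n]
        # Hann-windowed power spectrum of this time window
        spec = np.abs(np.fft.rfft(seg * np.hanning(n))) ** 2
        for lo, hi in bands:
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(np.log(spec[mask].mean() + 1e-12))
    return np.array(feats)


def knn_predict(train_X, train_y, x, k=3):
    """Plain k-NN: Euclidean distance, majority vote over the k nearest."""
    dists = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return int(round(votes.mean()))
```

With labeled feature vectors from liked and disliked excerpts, `knn_predict` assigns a new window's feature vector to whichever class dominates its neighborhood; the paper's SVM alternative would slot into the same feature pipeline.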
INDEX TERMS
signal processing, Appraisal classification, EEG, familiarity, music, pattern recognition
CITATION
Stelios K. Hadjidimitriou, Leontios J. Hadjileontiadis, "EEG-Based Classification of Music Appraisal Responses Using Time-Frequency Analysis and Familiarity Ratings", IEEE Transactions on Affective Computing, vol.4, no. 2, pp. 161-172, April-June 2013, doi:10.1109/T-AFFC.2013.6