Issue No. 2, April-June 2012 (vol. 3)
pp. 250-259
M. Grachten , Dept. of Comput. Perception, Johannes Kepler Univ. Linz, Linz, Austria
D. Amelynck , Dept. of Arts & Philos., Ghent Univ., Ghent, Belgium
L. van Noorden , Dept. of Arts & Philos., Ghent Univ., Ghent, Belgium
M. Leman , Dept. of Arts & Philos., Ghent Univ., Ghent, Belgium
ABSTRACT
The widespread availability of digitized music collections and mobile music players has enabled us to listen to music during many of our daily activities, such as physical exercise, commuting, and relaxation, and many people enjoy doing so. A practical problem that accompanies the wish to listen to music is that of music retrieval: the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smart phones are commonly used as music players and are already equipped with inertial sensors that are suitable for obtaining motion information. In the proposed approach, emotion is derived automatically from arm gestures and is used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup where inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.
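To make the described pipeline concrete, the sketch below shows one plausible reading of the approach: summary features computed from arm-gesture accelerometer data are regressed onto valence and arousal, and the predicted point queries a collection annotated in the same emotion plane by nearest-neighbor lookup. Everything in the sketch is an illustrative assumption: the features, the placeholder data, the collection annotations, and the choice of a plain linear regression all stand in for whatever the authors actually used; the abstract specifies only that predictive models for valence and arousal are learned from inertial data.

# Minimal, illustrative sketch only; not the authors' implementation.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Placeholder training data standing in for the paper's empirical data:
# each row summarizes one recorded arm gesture (e.g., mean, std, and peak
# of acceleration magnitude), paired with valence and arousal ratings.
X_train = rng.normal(size=(40, 3))           # gesture features (assumed)
y_train = rng.uniform(-1, 1, size=(40, 2))   # [valence, arousal] ratings

# A multi-output linear regression stands in for the paper's models.
model = LinearRegression().fit(X_train, y_train)

# A toy collection annotated with (valence, arousal) coordinates.
collection = {
    "song_a": np.array([0.8, 0.7]),    # happy, energetic
    "song_b": np.array([-0.6, 0.4]),   # tense
    "song_c": np.array([-0.4, -0.7]),  # sad, calm
}

def retrieve(gesture_features):
    # Predict an emotion point from the gesture, then return the song
    # whose annotation lies nearest in the valence-arousal plane.
    va = model.predict(gesture_features.reshape(1, -1))[0]
    return min(collection, key=lambda name: np.linalg.norm(collection[name] - va))

print(retrieve(rng.normal(size=3)))  # prints the song nearest the predicted emotion

Querying by distance in the valence-arousal plane follows naturally from the circumplex model of affect, in which emotions are points in a two-dimensional space; any model that outputs such a point can drive the same lookup.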
INDEX TERMS
smart phones, gesture recognition, mobile computing, music, query processing, sensors, musical emotion, e-motion-based music retrieval, affective gesture recognition, digitized music collections, mobile music players, physical exercise, commuting, relaxation, inertial sensors, motion information, arm gestures, predictive models, music collection query, acceleration, observers, humans, feature extraction, visualization, affect detection, expressive gestures, music retrieval, human computer interfaces
CITATION
M. Grachten, D. Amelynck, L. van Noorden, and M. Leman, "Toward E-Motion-Based Music Retrieval: A Study of Affective Gesture Recognition," IEEE Transactions on Affective Computing, vol. 3, no. 2, pp. 250-259, April-June 2012, doi:10.1109/T-AFFC.2011.39