Issue No.02 - April-June (2012 vol.3)
M. Leman , Dept. of Arts & Philos., Ghent Univ., Ghent, Belgium
The widespread availability of digitized music collections and mobile music players has enabled us to listen to music during many of our daily activities, such as physical exercise, commuting, and relaxation, and many people enjoy doing so. A practical problem that accompanies the wish to listen to music is that of music retrieval: the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smartphones are commonly used as music players and are already equipped with inertial sensors suitable for obtaining motion information. In the proposed approach, emotion is derived automatically from arm gestures and is used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup in which inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.
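The pipeline summarized in the abstract can be sketched in code. The sketch below is purely illustrative and is not the paper's actual method: the feature set (mean and variance of acceleration magnitude), the linear valence/arousal mappings and their coefficients, and the nearest-neighbor query over an emotion-annotated collection are all assumptions made for this example.

```python
import math

def gesture_features(samples):
    """Reduce accelerometer samples [(ax, ay, az), ...] to two simple features:
    mean and variance of the acceleration magnitude."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

def predict_emotion(samples):
    """Map features to a (valence, arousal) point in [-1, 1] x [-1, 1].
    The weights below are made up for illustration, not fitted coefficients."""
    mean, var = gesture_features(samples)
    arousal = max(-1.0, min(1.0, 0.1 * var - 0.5))      # vigorous motion -> higher arousal
    valence = max(-1.0, min(1.0, 0.2 * (mean - 9.81)))  # deviation from rest as a crude proxy
    return valence, arousal

def retrieve(collection, samples):
    """Query a collection annotated with (valence, arousal) coordinates:
    return the track nearest the emotion predicted from the gesture."""
    v, a = predict_emotion(samples)
    return min(collection,
               key=lambda t: (t["valence"] - v) ** 2 + (t["arousal"] - a) ** 2)
```

For example, a nearly still arm (readings close to gravity, about 9.81 m/s²) yields low arousal and neutral valence, so a track annotated as calm would be retrieved ahead of an energetic one.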
smartphones, gesture recognition, mobile computing, music, query processing, sensors, musical emotion, e-motion-based music retrieval, affective gesture recognition, digitized music collections, mobile music players, physical exercise, commuting, relaxation, inertial sensors, motion information, arm gestures, predictive models, music collection query, acceleration, observers, humans, feature extraction, visualization, human-computer interfaces, affect detection, expressive gestures, music retrieval
M. Leman, "Toward E-Motion-Based Music Retrieval: A Study of Affective Gesture Recognition", IEEE Transactions on Affective Computing, vol. 3, no. 2, pp. 250-259, April-June 2012, doi:10.1109/T-AFFC.2011.39