Issue No.03 - July-September (2012 vol.3)
pp: 335-348
G. Lemaitre, Fac. di design e arti, Univ. Iuav di Venezia, Venezia, Italy
O. Houix, Equipe Perception et Design Sonores, STMS-IRCAM, Paris, France
P. Susini, Equipe Perception et Design Sonores, STMS-IRCAM, Paris, France
Y. Visell, Inst. des Systèmes Intelligents et de Robot. (ISIR), Univ. Pierre et Marie Curie, Paris, France
K. Franinovic, Zurich Univ. of the Arts, Zurich, Switzerland
ABSTRACT
This paper reports on emotions felt by users manipulating a computationally and acoustically augmented artifact. Prior studies have highlighted systematic relationships between acoustic features and emotions felt when individuals are passively listening to sounds. However, during interaction with real or computationally augmented artifacts, acoustic feedback results from users' active manipulation of the artifact. In such a setting, both sound and manipulation can contribute to the emotions that are elicited. We report on a set of experimental studies that examined the respective roles of sound and manipulation in eliciting emotions from users. The results show that, while the difficulty of the manipulation task predominated, the acoustical qualities of the sounds also influenced the feelings reported by participants. When the sounds were embedded in an interface, their pleasantness primarily influenced the valence of the users' feelings. However, the results also suggested that pleasant sounds made the task slightly easier, and left the users feeling more in control. The results of these studies provide guidelines for the measurement and design of affective aspects of sound in computationally augmented artifacts and interfaces.
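The abstract refers to systematic relationships between acoustic features and felt emotions. As a minimal sketch of what "acoustic features" means in this context, the following computes two simple descriptors often associated with perceived pleasantness: RMS level (an energy measure) and spectral centroid (a brightness measure). This is illustrative only; the study itself relied on richer psychoacoustic models, and the function and signal names here are invented for the example.

```python
import numpy as np

def acoustic_features(signal, sample_rate):
    """Return (rms, spectral_centroid) for a mono signal.

    RMS level is a rough proxy for loudness; the spectral
    centroid (in Hz) is a rough proxy for brightness/sharpness.
    """
    rms = np.sqrt(np.mean(signal ** 2))
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return rms, centroid

# Compare a pure 440 Hz tone with white noise at the same amplitude:
# the noise has a much higher spectral centroid (it is "brighter").
sr = 16000
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)
noise = 0.5 * np.random.default_rng(0).standard_normal(sr)

rms_tone, centroid_tone = acoustic_features(tone, sr)
rms_noise, centroid_noise = acoustic_features(noise, sr)
```

Descriptors like these (alongside roughness, sharpness, and loudness models) are the kind of acoustic predictors that studies of this type correlate with valence and arousal ratings.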
INDEX TERMS
human-computer interaction, auditory displays, augmented reality, behavioural sciences computing, active artifact manipulation, elicited feelings, auditory feedback, computationally augmented artifact, emotions, acoustically augmented artifact, acoustic features, appraisal, glass, noise, computational modeling, psychoacoustic models, context, acoustics, manipulation, computationally augmented interface
CITATION
G. Lemaitre, O. Houix, P. Susini, Y. Visell, K. Franinovic, "Feelings Elicited by Auditory Feedback from a Computationally Augmented Artifact: The Flops", IEEE Transactions on Affective Computing, vol.3, no. 3, pp. 335-348, July-September 2012, doi:10.1109/T-AFFC.2012.1