Issue No. 9 - September 2011 (vol. 17)
pp. 1234-1244
Rolf Nordahl , Aalborg University Copenhagen, Ballerup
Luca Turchet , Aalborg University Copenhagen, Ballerup
Stefania Serafin , Aalborg University Copenhagen, Ballerup
ABSTRACT
We propose a system that affords real-time sound synthesis of footsteps on different materials. The system is based on microphones that detect real footstep sounds from subjects, from which the ground reaction force (GRF) is estimated. This GRF is used to control a sound synthesis engine based on physical models. Two experiments were conducted. In the first experiment, we assessed the ability of subjects to recognize the surface they were exposed to. In the second experiment, the sound synthesis engine was enhanced with environmental sounds. Results show that, in some conditions, adding a soundscape significantly improves recognition of the simulated environment.
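To make the pipeline in the abstract concrete (microphone signal, estimated GRF, physical-model synthesis), the following Python sketch shows one plausible realization. It is not the authors' engine: it assumes the GRF can be approximated by the amplitude envelope of the microphone signal and uses that envelope to drive a simple physically informed noise-grain model of a granular surface. All function names, parameter values, and the grain model itself are illustrative assumptions.

# Minimal sketch (not the authors' implementation): approximate the ground
# reaction force (GRF) by the amplitude envelope of a microphone signal and
# use it to drive a simple physically informed noise-grain model of a
# footstep on a granular surface (e.g., gravel). All values are illustrative.
import numpy as np

FS = 44100  # sample rate in Hz (assumed)

def grf_envelope(mic_signal, attack=0.001, release=0.050):
    """Crude GRF proxy: rectify the microphone signal and smooth it with an
    asymmetric one-pole envelope follower (fast attack, slow release)."""
    a_att = np.exp(-1.0 / (attack * FS))
    a_rel = np.exp(-1.0 / (release * FS))
    env = np.zeros_like(mic_signal)
    level = 0.0
    for i, x in enumerate(np.abs(mic_signal)):
        coeff = a_att if x > level else a_rel
        level = coeff * level + (1.0 - coeff) * x
        env[i] = level
    return env

def granular_surface(env, grain_rate=800.0, decay=0.003, seed=0):
    """Physically informed stochastic model: the envelope (GRF proxy) scales
    the probability and energy of short exponentially decaying noise grains,
    loosely in the spirit of PhISM-style crumpling/footstep models."""
    rng = np.random.default_rng(seed)
    out = np.zeros_like(env)
    grain_len = int(decay * 5 * FS)
    t = np.arange(grain_len) / FS
    grain_shape = np.exp(-t / decay)
    p = grain_rate / FS  # per-sample grain probability at unit envelope
    for i in range(len(env) - grain_len):
        if rng.random() < p * env[i]:
            out[i:i + grain_len] += grain_shape * rng.uniform(-1, 1) * env[i]
    return out

# Usage: synthesize a footstep from a recorded (here: synthetic) mic signal.
mic = np.random.randn(FS) * np.hanning(FS)  # stand-in for a real recording
footstep = granular_surface(grf_envelope(mic))

In a real-time setting the envelope follower and grain scheduler would run block by block on the live microphone input rather than over a prerecorded buffer, but the control flow (envelope in, grains out) stays the same.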
INDEX TERMS
Sound and music computing, walking, surface simulation, soundscape rendering.
CITATION
Rolf Nordahl, Luca Turchet, Stefania Serafin, "Sound Synthesis and Evaluation of Interactive Footsteps and Environmental Sounds Rendering for Virtual Reality Applications", IEEE Transactions on Visualization & Computer Graphics, vol.17, no. 9, pp. 1234-1244, September 2011, doi:10.1109/TVCG.2011.30