Issue No. 3, July-Sept. 2013 (vol. 6), pp. 340-351
Inwook Hwang, Dept. of Comput. Sci. & Eng. (Haptics & Virtual Reality Lab.), POSTECH, Pohang, South Korea
Hyeseon Lee, Dept. of Ind. & Manage. Eng., POSTECH, Pohang, South Korea
Seungmoon Choi, Dept. of Comput. Sci. & Eng. (Haptics & Virtual Reality Lab.), POSTECH, Pohang, South Korea
We introduce a novel dual-band haptic music player for real-time vibrotactile playback simultaneous with music on mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies, and a real-time vibration generation algorithm that extracts vibration commands from a music file for dual-band (bass and treble) playback. The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms bass-only rendering, while also providing several insights for further improvement. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
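As a rough illustration of the dual-band idea only (the paper's actual algorithm, actuator model, and perceptual mapping are not reproduced here), the sketch below splits each short audio frame into bass and treble spectral energy and converts the result into a pair of normalized drive levels, one per vibration band. The function name, the 30 ms frame length, and the 500 Hz split frequency are all illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def dual_band_commands(samples, sr, frame_ms=30, split_hz=500):
    """Hypothetical sketch: map per-frame bass/treble energy of an audio
    signal to two normalized vibration drive levels (bass, treble).

    samples  : 1-D float array of audio samples
    sr       : sample rate in Hz
    frame_ms : frame length in milliseconds (assumed value)
    split_hz : bass/treble split frequency in Hz (assumed value)
    """
    frame = int(sr * frame_ms / 1000)
    commands = []
    for start in range(0, len(samples) - frame + 1, frame):
        # Windowed magnitude spectrum of the current frame.
        window = samples[start:start + frame] * np.hanning(frame)
        spectrum = np.abs(np.fft.rfft(window))
        freqs = np.fft.rfftfreq(frame, 1.0 / sr)
        # Energy below/above the split frequency drives each band.
        bass = spectrum[freqs < split_hz].sum()
        treble = spectrum[freqs >= split_hz].sum()
        total = bass + treble + 1e-12  # avoid division by zero on silence
        commands.append((bass / total, treble / total))
    return commands
```

A real implementation would additionally apply a perceptual intensity mapping (such as the psychophysical magnitude functions the paper draws on) before driving the actuator, rather than using raw spectral energy ratios.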
Haptic interfaces, vibrations, actuators, mobile handsets, equalizers, music, dual-mode actuator, haptic I/O, vibration, real time, dual band
Inwook Hwang, Hyeseon Lee, Seungmoon Choi, "Real-Time Dual-Band Haptic Music Player for Mobile Devices", IEEE Transactions on Haptics, vol.6, no. 3, pp. 340-351, July-Sept. 2013, doi:10.1109/TOH.2013.7