Issue No. 2, April-June 2011 (vol. 18)
pp. 26-37
Dimitrios Tzovaras, Informatics and Telematics Institute, Centre for Research and Technology Hellas
Oya Aran, Idiap Research Institute
<p>Using sign language, speech, and haptics as communication modalities, a virtual treasure-hunting game serves as an entertainment and educational tool for visually- and hearing-impaired users.</p>
multimodal interfaces, virtual reality, haptics, sign language, usability evaluation, accessible games, cross-modal transformation, multimedia
Dimitrios Tzovaras, Laila Dybkjaer, Niels Ole Bernsen, Oya Aran, "Using Modality Replacement to Facilitate Communication between Visually and Hearing-Impaired People", IEEE MultiMedia, vol.18, no. 2, pp. 26-37, April-June 2011, doi:10.1109/MMUL.2010.22
1. J. Lumsden and S.A. Brewster, "A Paradigm Shift: Alternative Interaction Techniques for Use with Mobile & Wearable Devices," Proc. 13th Ann. IBM Centers for Advanced Studies Conf., IBM Press, 2003, pp. 97-100.
2. O. Lahav and D. Mioduser, "Exploration of Unknown Spaces by People Who Are Blind, Using a Multisensory Virtual Environment (MVE)," J. Special Education Technology, vol. 19, no. 3, 2004, pp. 15-24.
3. T. Pun et al., "Image and Video Processing for Visually Handicapped People," EURASIP J. Image and Video Processing, vol. 2007, article ID 25214, 2007.
4. A. Caplier et al., "Image and Video for Hearing Impaired People," EURASIP J. Image and Video Processing, vol. 2007, article ID 45641, 2007.
5. S.C.W. Ong and S. Ranganath, "Automatic Sign Language Analysis: A Survey and the Future beyond Lexical Meaning," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 27, no. 6, 2005, pp. 873-891.
6. N. Bourbakis, A. Esposito, and D. Kavraki, "Multi-Modal Interfaces for Interaction-Communication Between Hearing and Visually Impaired Individuals: Problems and Issues," Proc. 19th IEEE Int'l Conf. Tools with Artificial Intelligence, IEEE Press, 2007, pp. 522-530.
7. D. Tzovaras et al., "Design and Implementation of Haptic Virtual Environments for the Training of Visually Impaired," IEEE Trans. Neural Systems and Rehabilitation Eng., vol. 12, no. 2, 2004, pp. 266-278.
8. H. Petrie et al., "Universal Interfaces to Multimedia Documents," Proc. 4th IEEE Int'l Conf. Multimodal Interfaces, IEEE CS Press, 2002, pp. 319-324.
9. N.O. Bernsen and L. Dybkjær, Multimodal Usability, Human–Computer Interaction Series, Springer, 2010.
10. O. Aran et al., "Signtutor: An Interactive System for Sign Language Tutoring," IEEE MultiMedia, vol. 16, no. 1, 2009, pp. 81-93.
11. G. Bologna et al., "Transforming 3D Coloured Pixels into Musical Instrument Notes for Vision Substitution Applications," EURASIP J. Image and Video Processing, special issue on image and video processing for disability, vol. 2007, article ID 76204, 2007.
12. M. Papadogiorgaki et al., "Gesture Synthesis from Sign Language Notation Using MPEG-4 Humanoid Animation Parameters and Inverse Kinematics," Proc. 2nd Int'l Conf. Intelligent Environments, IET, 2006.
13. G.C. Burdea, Force and Touch Feedback for Virtual Reality, Wiley-Interscience, 1996.
14. K. Moustakas, D. Tzovaras, and M.G. Strintzis, "SQ-Map: Efficient Layered Collision Detection and Haptic Rendering," IEEE Trans. Visualization and Computer Graphics, vol. 13, no. 1, 2007, pp. 80-93.
15. K. Moustakas et al., "Haptic Rendering of Visual Data for the Visually Impaired," IEEE MultiMedia, vol. 14, no. 1, 2007, pp. 62-72.
16. R. Ramloll et al., "Constructing Sonified Haptic Line Graphs for the Blind Student: First Steps," Proc. ACM Conf. Assistive Technologies, ACM Press, 2000.
17. S. Young, The HTK Hidden Markov Model Toolkit: Design and Philosophy, tech. report CUED/F-INFENG/TR152, Cambridge Univ. Engineering Dept., Sept. 1994.
18. B. Bozkurt et al., "Improving Quality of MBROLA Synthesis for Non-Uniform Units Synthesis," Proc. IEEE Workshop Speech Synthesis, IEEE Press, 2002, pp. 7-10.