Hand-Gesture Computing for the Hearing and Speech Impaired
April-June 2008 (vol. 15, no. 2)
pp. 20-27
Gaurav Pradhan, University of Texas at Dallas
Balakrishnan Prabhakaran, University of Texas at Dallas
Chuanjun Li, Brown University
An instrumented data glove with a wireless interface provides convenient and natural human-computer interaction for people with speech or hearing impairments.

Index Terms:
human-computer interaction, hand gesture, gesture recognition, motion segmentation, indexing, sign language recognition, similarity measure, singular value decomposition
Citation:
Gaurav Pradhan, Balakrishnan Prabhakaran, Chuanjun Li, "Hand-Gesture Computing for the Hearing and Speech Impaired," IEEE Multimedia, vol. 15, no. 2, pp. 20-27, April-June 2008, doi:10.1109/MMUL.2008.28