Special Education and Rehabilitation: Teaching and Healing with Interactive Graphics
September/October 2005 (vol. 25 no. 5)
pp. 40-48
Barnabás Takács, Digital Elite
A special education and rehabilitation system employs real-time interactive computer graphics and photorealistic virtual humans that use gaze and facial gestures to focus the learner's attention. By incorporating appropriate emotional responses into both verbal and nonverbal feedback, the system implements an emotional modulation technique that increases learning efficiency. The model is based on a closed-loop interaction paradigm: sensors continuously monitor the learner's internal state, and the animated tutoring system adjusts its responses and strategies accordingly. The sensors include a video camera for facial tracking and facial-expression analysis, an eye tracker for measuring gaze, a biofeedback device for gauging stress levels, and other input/output devices for virtual reality. The system has served as the basis for several practical applications; case studies cover autistic children, cybertherapy, and cognitive rehabilitation.
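
The closed-loop paradigm can be made concrete with a short sketch. The following Python fragment is a minimal illustration, not the article's actual implementation: every name and threshold in it (SensorStub, LearnerState, the 0.5/0.7/0.3 cutoffs) is an assumed placeholder standing in for the real face tracker, eye tracker, and biofeedback device.

# Minimal sketch of the closed-loop tutoring paradigm described above.
# The sensor stubs stand in for the real devices (camera-based face
# tracker, eye tracker, biofeedback unit); all names are illustrative.
import random
from dataclasses import dataclass

@dataclass
class LearnerState:
    attention: float  # 0..1, from gaze: is the learner looking at the tutor?
    valence: float    # -1..1, from facial-expression analysis
    stress: float     # 0..1, from biofeedback (e.g., skin conductance)

class SensorStub:
    """Stands in for a real sensor; returns a random reading in [0, 1]."""
    def read(self) -> float:
        return random.random()

def estimate_state(gaze: float, expression: float, gsr: float) -> LearnerState:
    """Fuse raw sensor readings into a coarse internal-state estimate."""
    return LearnerState(attention=gaze,
                        valence=2.0 * expression - 1.0,
                        stress=gsr)

def adjust_tutor(state: LearnerState) -> str:
    """Emotional modulation: choose the tutor's next action from the state."""
    if state.attention < 0.5:
        return "gaze cue: redirect the learner's attention"
    if state.stress > 0.7:
        return "slow down, show a calming expression"
    if state.valence > 0.3:
        return "encourage verbally and advance the lesson"
    return "continue current strategy"

# One closed-loop cycle per sensor sample: monitor, estimate, adjust.
eye_tracker, face_tracker, biofeedback = SensorStub(), SensorStub(), SensorStub()
for _ in range(5):
    state = estimate_state(eye_tracker.read(), face_tracker.read(),
                           biofeedback.read())
    print(adjust_tutor(state))

The point of the sketch is the loop structure: each cycle fuses the sensor readings into a state estimate, then selects the tutor's next verbal or nonverbal action from that estimate.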

Index Terms:
ACE, virtual-human interface, rehabilitation, human-computer interaction (HCI), real-time animation, vision and perception, behavior control
Citation:
Barnab? Tak?cs, "Special Education and Rehabilitation: Teaching and Healing with Interactive Graphics," IEEE Computer Graphics and Applications, vol. 25, no. 5, pp. 40-48, Sept.-Oct. 2005, doi:10.1109/MCG.2005.113