Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments
IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 4, April 2013
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TVCG.2013.42
E. Bekele, Dept. of Electrical Engineering & Computer Science, Vanderbilt Univ., Nashville, TN, USA
Zhi Zheng, Dept. of Electrical Engineering & Computer Science, Vanderbilt Univ., Nashville, TN, USA
A. Swanson, Treatment & Research in Autism Spectrum Disorders (TRIAD), Vanderbilt Univ., Nashville, TN, USA
J. Crittendon, Dept. of Pediatrics & Psychiatry, Vanderbilt Univ., Nashville, TN, USA
Z. Warren, Dept. of Pediatrics & Psychiatry, Vanderbilt Univ., Nashville, TN, USA
N. Sarkar, Dept. of Mechanical Engineering, Vanderbilt Univ., Nashville, TN, USA
Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behavior and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support from well-trained therapists to address these core deficits, yet many individuals with ASD have great difficulty accessing such care because of both the shortage of trained therapists and the cost of intervention. Emerging technologies such as virtual reality (VR) therefore have the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that monitors eye gaze and physiological signals during emotion identification tasks in order to explore new, efficient therapeutic paradigms. A usability study of this system was performed with ten adolescents with ASD and ten typically developing adolescents as a control group. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations in gaze and physiological patterns. Performance data, eye tracking indices, and physiological features indicated differences in how adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will inform a future online adaptive VR-based multimodal social interaction system designed to improve the emotion recognition abilities of individuals with ASD.
Biomedical monitoring, Monitoring, Animation, Emotion recognition, Physiology, Autism
E. Bekele, Zhi Zheng, A. Swanson, J. Crittendon, Z. Warren and N. Sarkar, "Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments," in IEEE Transactions on Visualization & Computer Graphics, vol. 19, no. 4, pp. 711-720, 2013.