Empirical Evaluation of Virtual Human Conversational and Affective Animations on Visual Attention in Inter-Personal Simulations
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (2018)
March 18–22, 2018
Matias Volonte, Human Centered Computing Lab, Clemson University
Andrew Robb, Human Centered Computing Lab, Clemson University
Andrew T. Duchowski, Human Centered Computing Lab, Clemson University
Sabarish V. Babu, Human Centered Computing Lab, Clemson University
Creating realistic animations of virtual humans remains comparatively complex and expensive. This research explores the degree to which animation fidelity affects users' gaze behavior when interacting in virtual reality training simulations that include virtual humans. Participants were randomly assigned to one of three conditions, in which the virtual patient either: 1) was not animated; 2) played idle animations; or 3) played idle animations, looked at the participant when speaking, and lip-synced speech and facial gestures when conversing with the participant. Each participant's gaze was recorded in an inter-personal interactive patient surveillance simulation. Results suggest that the conversational and passive (idle) animation conditions elicited visual attention in a similar manner, in contrast to the no-animation condition. Results also suggest that when participants face critical situations in inter-personal medical simulations, visual attention toward the virtual human decreases while gaze toward goal-directed activities increases.
Animation, Visualization, Solid modeling, Training, Virtual reality, Task analysis, Surveillance
M. Volonte, A. Robb, A. T. Duchowski and S. V. Babu, "Empirical Evaluation of Virtual Human Conversational and Affective Animations on Visual Attention in Inter-Personal Simulations," 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Tuebingen/Reutlingen, Germany, 2018, pp. 25-32.