Issue No. 3 - May/June 2009 (vol. 15)
pp. 369-382
Aaron Kotranza , University of Florida, Gainesville
Benjamin Lok , University of Florida, Gainesville
Adeline Deladisma , Medical College of Georgia, Augusta
Carla M. Pugh , Northwestern University, Chicago
D. Scott Lind , Medical College of Georgia, Augusta
ABSTRACT
This paper presents Mixed Reality Humans (MRHs), a new type of embodied agent enabling touch-driven communication. Affording touch between human and agent allows MRHs to simulate interpersonal scenarios in which touch is crucial. Two studies provide an initial evaluation of user behavior with an MRH patient and of the usability and acceptability of an MRH patient for practice and evaluation of medical students' clinical skills. In Study I (n=8), students treated MRHs as social actors more than students in prior interactions with virtual human patients (n=27), and used interpersonal touch to comfort and reassure the MRH patient similarly to prior interactions with human patients (n=76). In the within-subjects Study II (n=11), medical students performed a clinical breast exam on both an MRH patient and a human patient. Participants performed equivalent exams with the MRH and human patients, demonstrating the usability of MRHs for evaluating students' exam skills. The acceptability of the MRH patient for practicing exam skills was high: students rated the experience as believable and educationally beneficial. Acceptability improved from Study I to Study II due to an increase in the MRH's visual realism, demonstrating that visual realism is critical for simulation of specific interpersonal scenarios.
INDEX TERMS
Intelligent agents, Virtual reality, Life and Medical Sciences, Artificial, augmented, and virtual realities
CITATION
Aaron Kotranza, Benjamin Lok, Adeline Deladisma, Carla M. Pugh, D. Scott Lind, "Mixed Reality Humans: Evaluating Behavior, Usability, and Acceptability", IEEE Transactions on Visualization & Computer Graphics, vol.15, no. 3, pp. 369-382, May/June 2009, doi:10.1109/TVCG.2008.195