Issue No. 5, September/October 2003 (vol. 23), pp. 38-45
Barnabás Takács, Digital Elite
Bernadette Kiss, VerAnim
This article presents a multimodal human-computer interface built around photorealistic virtual humans that talk, emote, and act adaptively and intelligently in response to the actions of a user in front of a computer screen. The authors implemented this virtual human interface (VHI) system in a high-performance, real-time visual environment. The article describes novel 3D facial modeling and animation techniques used to design virtual faces capable of delivering fine details of metacommunication and supporting verbal content. It also introduces an artificial expression space representation to control the emotions and expressions of the animated digital human characters. The article describes intelligent sensory mechanisms (such as vision, hearing, and touch) and presents special-purpose sensory modules, such as face recognition and expression analysis. Experimental results from the VHI environment and practical application areas, including the first holographic virtual human, are discussed.
Keywords: virtual human, autonomous agents, human-computer interaction (HCI), real-time animation, vision and perception, behavior control, user interfaces, face tracking and recognition.
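The abstract's "expression space" can be pictured as placing basis expressions at points in a low-dimensional emotional plane and blending between them to reach any intermediate state. The sketch below is a minimal illustration of that idea, assuming a two-dimensional (valence, arousal) plane and distance-weighted blending; the basis placements, the `sharpness` parameter, and all names are illustrative assumptions, not the authors' actual parameterization.

```python
import math

# Hypothetical basis expressions placed in a 2D (valence, arousal) plane.
# These coordinates are illustrative assumptions only.
BASIS = {
    "joy":     (0.8, 0.5),
    "anger":   (-0.7, 0.7),
    "sadness": (-0.6, -0.5),
    "neutral": (0.0, 0.0),
}

def blend_weights(valence, arousal, sharpness=4.0):
    """Return normalized blend weights for a point in expression space.

    Each basis expression contributes a weight that decays exponentially
    with its distance from the requested emotional state, so nearby basis
    expressions dominate the blend.
    """
    raw = {}
    for name, (v, a) in BASIS.items():
        d = math.hypot(valence - v, arousal - a)
        raw[name] = math.exp(-sharpness * d)  # closer basis -> larger weight
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

# A mildly pleasant, calm state: the blend is dominated by "neutral",
# with "joy" as the strongest secondary contribution.
w = blend_weights(0.3, 0.1)
```

The resulting weight vector could then drive per-expression animation channels (for example, morph-target weights) on the character's face, so that moving smoothly through the plane produces smooth transitions between expressions.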
Barnabás Takács and Bernadette Kiss, "The Virtual Human Interface: A Photorealistic Digital Human," IEEE Computer Graphics and Applications, vol. 23, no. 5, pp. 38-45, September/October 2003, doi:10.1109/MCG.2003.1231176