Automated Eye Motion Using Texture Synthesis
March/April 2005 (vol. 25 no. 2)
pp. 24-30
Zhigang Deng, University of Southern California
J.P. Lewis, University of Southern California
Ulrich Neumann, University of Southern California
Modeling and animating human eyes requires special care because, as the "windows to the soul," the eyes are particularly scrutinized by human observers. Our goal in this article is to simultaneously synthesize realistic eye gaze and blink motion, accounting for any possible correlations between the two. Synthesizing signals that appear similar (but not identical) to a given sample is essentially the texture synthesis problem, applied here in a one-dimensional (vector) context. We demonstrate that texture synthesis methods can be applied to this animation problem, providing an effective means of capturing both movement and blink statistics, and any correlations between them. The resulting method is simple to implement yet produces lifelike and lively eye motion for applications where automated movement (for example, for game characters) or voiceless eye motion (such as for listening avatars) is required.
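The abstract frames eye-motion generation as one-dimensional texture synthesis over a recorded gaze-and-blink signal. The sketch below is only a rough illustration of that idea using patch-based sampling (in the spirit of references 10 and 11): it grows a new signal by copying patches from the recorded sample whose leading frames best match the tail of the output so far. The function name, the array layout (columns for gaze yaw, gaze pitch, and eyelid/blink value), and the parameter values are illustrative assumptions, not the authors' implementation.

# Minimal sketch of 1D patch-based texture synthesis over a recorded
# eye-motion signal. Column layout and parameters are assumptions.
import numpy as np

def synthesize_eye_motion(sample, out_len, patch=30, overlap=10, rng=None):
    """Grow a motion signal by copying patches from `sample` whose first
    `overlap` frames best match the tail of the output synthesized so far.

    sample  : (N, C) array of recorded frames (e.g., yaw, pitch, blink).
    out_len : desired number of synthesized frames.
    """
    rng = np.random.default_rng(rng)
    n = len(sample)
    # Seed the output with a random patch from the sample.
    start = rng.integers(0, n - patch)
    out = [sample[start:start + patch]]
    total = patch

    while total < out_len:
        tail = out[-1][-overlap:]                     # frames to match against
        # Distance from the tail to every candidate patch prefix in the sample.
        starts = np.arange(0, n - patch)
        prefixes = np.stack([sample[s:s + overlap] for s in starts])
        dists = np.sum((prefixes - tail) ** 2, axis=(1, 2))
        # Sample among the closest candidates to avoid verbatim repetition.
        k = max(1, int(0.05 * len(starts)))
        s = rng.choice(starts[np.argsort(dists)[:k]])
        # Append the non-overlapping remainder of the chosen patch.
        out.append(sample[s + overlap:s + patch])
        total += patch - overlap

    return np.concatenate(out)[:out_len]

# Example: synthesize 10 seconds of motion at 30 fps from a recorded clip.
# recorded = np.loadtxt("eye_motion.txt")   # hypothetical (N, 3) capture
# new_motion = synthesize_eye_motion(recorded, out_len=300)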

References:
1. J. Cassell et al., "Animated Conversation: Rule-Based Generation of Facial Expression, Gesture, and Spoken Intonation for Multiple Conversational Agents," Proc. Siggraph, ACM Press, 1994, pp. 413-420.
2. J. Cassell, H. Vilhjalmsson, and T. Bickmore, "BEAT: The Behavior Expression Animation Toolkit," Proc. Siggraph, ACM Press, 2001, pp. 477-486.
3. S. Chopra-Khullar and N. Badler, "Where to Look? Automating Visual Attending Behaviors of Virtual Human Characters," Proc. 3rd ACM Conf. Autonomous Agents, ACM Press, 1999, pp. 16-23.
4. R. Vertegaal, G. van der Veer, and H. Vons, "Effects of Gaze on Multiparty Mediated Communication," Proc. Graphics Interface, Morgan Kaufmann, 2000, pp. 95-102.
5. R. Vertegaal et al., "Eye Gaze Patterns in Conversations: There Is More to Conversational Agents than Meets the Eyes," Proc. ACM CHI Conf. Human Factors in Computing Systems, ACM Press, 2001, pp. 301-308.
6. S.P. Lee, J.B. Badler, and N. Badler, "Eyes Alive," ACM Trans. Graphics, ACM Press, vol. 21, no. 3, 2002, pp. 637-644.
7. O. Arikan and D. Forsyth, "Interactive Motion Generation from Examples," ACM Trans. Graphics, ACM Press, vol. 21, no. 3, 2002, pp. 483-490.
8. L. Kovar et al., "Motion Graphs," ACM Trans. Graphics, ACM Press, vol. 21, no. 3, 2002, pp. 473-482.
9. Y. Li, T. Wang, and H.-Y. Shum, "Motion Texture: A Two-Level Statistical Model for Character Motion Synthesis," ACM Trans. Graphics, ACM Press, vol. 21, no. 3, 2002, pp. 465-472.
10. A. Efros and T.K. Leung, "Texture Synthesis by Non-Parametric Sampling," Proc. Int'l Conf. Computer Vision (ICCV), IEEE CS Press, 1999, pp. 1033-1038.
11. L. Liang et al., "Real-Time Texture Synthesis by Patch-Based Sampling," ACM Trans. Graphics, ACM Press, vol. 20, no. 3, 2001, pp. 127-150.

Index Terms:
Computer graphics, facial animation, eye motion
Citation:
Zhigang Deng, J.P. Lewis, Ulrich Neumann, "Automated Eye Motion Using Texture Synthesis," IEEE Computer Graphics and Applications, vol. 25, no. 2, pp. 24-30, March-April 2005, doi:10.1109/MCG.2005.35