David A. Forsyth
2005 Technical Achievement Award
“For contributions to object recognition, tracking, and image-language analysis yielding deeper understanding of computer vision and its relationship to other disciplines”
David Forsyth holds a BSc and an MSc in Electrical Engineering from the University of the Witwatersrand, Johannesburg, and an MA and a D.Phil. from Oxford University. He was a professor at U. Iowa for three years and at U.C. Berkeley for ten years, and is now a professor at U. Illinois Urbana-Champaign. Dr. Forsyth has invented and demonstrated a series of important methods for recognizing objects and activities in images and video. His early work established a still widely used method for determining the true color of an object viewed under colored light (IJCV 90), demonstrated that shape from shading could not work (CVPR 89; PAMI 91), and developed methods to measure properties of shape that are invariant under changes of view (PAMI 91; IJCV 96). One such method received the Marr Prize for best paper at the International Conference on Computer Vision (ICCV 93). He is a major figure in the emerging field of human motion computing, which deals with both understanding and animating what people do. More recently, he and his co-authors demonstrated the first robust, accurate human tracker that can reliably report the configuration of arms and legs and does not need to be started by hand (CVPR 2003; CVPR 2005); the method applies to animals as well (ICCV 2003). They showed that motion capture data can be rearranged to produce highly realistic animations of novel human motions (SIGGRAPH 2002), and that close control of the nature of the motion is possible using annotations (SIGGRAPH 2003). They then demonstrated that this animation system can be linked to the output of the tracker to obtain annotations describing human activities automatically from video (NIPS 2003). He also originated methods that exploit the remarkable interactions between pictures and the words that lie near them in almost any dataset.