Human Motion Tracking by Registering an Articulated Surface to 3D Points and Normals
January 2009 (vol. 31 no. 1)
pp. 158-163
Radu Horaud, INRIA Grenoble-Rhone-Alpes, Montbonnot Saint-Martin
Matti Niskanen, University of Oulu, Oulu
Guillaume Dewaele, INRIA Grenoble-Rhone-Alpes, Montbonnot Saint-Martin
Edmond Boyer, INRIA Grenoble-Rhone-Alpes, Montbonnot Saint-Martin
We address the problem of human motion tracking by registering a surface to 3-D data. We propose a method that iteratively computes two things: maximum-likelihood estimates of the kinematic and free-motion parameters of an articulated object, and probabilities that the data are assigned either to an object part or to an outlier cluster. We introduce a new metric between observed points and normals, on the one hand, and a parameterized surface on the other, the latter defined as a blending over a set of ellipsoids. We claim that this metric is well suited to both visual-hull and visual-shape observations. We illustrate the method by tracking human motions using sparse visual-shape data (3-D surface points and normals) gathered from imperfect silhouettes.
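The abstract describes an EM-style alternation: a soft-assignment step that computes, for each observed point, the probability of belonging to each ellipsoidal body part or to an outlier cluster, and a maximization step that re-estimates the kinematic and free-motion parameters. As a rough illustration only, the following Python sketch shows what such a soft-assignment step could look like; it is not the authors' code, and it substitutes a simplified algebraic point-to-ellipsoid distance for the paper's metric over points and normals on a blended-ellipsoid surface. All function and variable names here are hypothetical.

    import numpy as np

    # Hypothetical sketch of the soft-assignment (E-like) step: each observed
    # 3-D point is assigned probabilistically to one of several ellipsoidal
    # parts or to a uniform outlier component. This simplifies away the
    # paper's blended-ellipsoid surface and its use of normals.

    def ellipsoid_distance(points, center, radii):
        """Algebraic distance |(x - c) / r|^2 - 1 of points to an
        axis-aligned ellipsoid; a stand-in for the paper's metric."""
        return np.sum(((points - center) / radii) ** 2, axis=1) - 1.0

    def soft_assign(points, parts, sigma2, outlier_density):
        """Return responsibilities: one column per part, last column
        is the outlier cluster."""
        lik = np.zeros((len(points), len(parts) + 1))
        for j, (center, radii) in enumerate(parts):
            d = ellipsoid_distance(points, center, radii)
            # Gaussian likelihood on the distance to part j.
            lik[:, j] = np.exp(-0.5 * d**2 / sigma2) / np.sqrt(2 * np.pi * sigma2)
        lik[:, -1] = outlier_density  # uniform outlier component
        return lik / lik.sum(axis=1, keepdims=True)

    # Toy usage: two parts and three observed points.
    parts = [(np.zeros(3), np.array([1.0, 1.0, 2.0])),
             (np.array([0.0, 0.0, 3.0]), np.array([0.5, 0.5, 1.0]))]
    pts = np.array([[0.0, 0.0, 1.9], [0.1, 0.0, 3.2], [5.0, 5.0, 5.0]])
    resp = soft_assign(pts, parts, sigma2=0.1, outlier_density=1e-3)
    print(resp.round(3))  # the far-away point gets high outlier probability

The uniform outlier component is what lets spurious points, such as those arising from the imperfect silhouettes mentioned in the abstract, receive low part responsibilities and thus little influence on the subsequent parameter updates.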


Index Terms:
Computer vision, Face and gesture recognition
Citation:
Radu Horaud, Matti Niskanen, Guillaume Dewaele, Edmond Boyer, "Human Motion Tracking by Registering an Articulated Surface to 3D Points and Normals," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 1, pp. 158-163, Jan. 2009, doi:10.1109/TPAMI.2008.108