Issue No. 8 - August 2008 (vol. 30)
pp. 1496-1502
Jacob Goldberger, Bar-Ilan University, Ramat-Gan
Hayit K. Greenspan, Tel-Aviv University, Tel-Aviv
Jeremie Dreyfuss, Tel-Aviv University, Tel-Aviv
The Mixture of Gaussians (MoG) model is a useful tool in statistical learning. In many learning processes based on mixture models, the computational requirements are demanding due to the large number of components in the model. We propose a novel algorithm for learning a simplified representation of a Gaussian mixture, based on the Unscented Transform, which was originally introduced for filtering nonlinear dynamical systems. The superiority of the proposed method is validated on both simulation experiments and the categorization of a real image database. The proposed categorization methodology models each image with a Gaussian mixture model; a category model is then obtained by learning a simplified mixture model from all the images in the category.
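The Unscented Transform referred to in the abstract represents a Gaussian by a small deterministic set of weighted sigma points whose weighted sample mean and covariance reproduce the original moments exactly. The abstract does not spell out the paper's exact construction, so the sketch below follows the standard Julier-Uhlmann formulation; the function name `sigma_points` and the `kappa` scaling parameter are this illustration's own choices, not the authors'.

```python
import numpy as np

def sigma_points(mu, cov, kappa=1.0):
    """Standard unscented-transform sigma points for N(mu, cov).

    Returns 2d+1 points and weights whose weighted mean and covariance
    match mu and cov exactly (Julier-Uhlmann construction; the scaled
    variant would add alpha/beta parameters on top of kappa).
    """
    d = len(mu)
    # Matrix square root via Cholesky: S @ S.T == (d + kappa) * cov
    S = np.linalg.cholesky((d + kappa) * cov)
    points = [mu]
    for i in range(d):
        points.append(mu + S[:, i])  # positive sigma point along column i
        points.append(mu - S[:, i])  # mirrored negative sigma point
    w0 = kappa / (d + kappa)
    wi = 1.0 / (2.0 * (d + kappa))
    weights = np.array([w0] + [wi] * (2 * d))
    return np.array(points), weights

# Sanity check: the sigma points reproduce the input moments exactly.
mu = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.3],
                [0.3, 1.0]])
pts, w = sigma_points(mu, cov)
mean_rec = w @ pts
cov_rec = (w[:, None] * (pts - mean_rec)).T @ (pts - mean_rec)
```

Because the moments are matched deterministically with only 2d+1 points, propagating the sigma points through a nonlinearity (or, as in this paper's setting, using them to summarize mixture components) is far cheaper than Monte Carlo sampling.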
Image Classification, Unscented Transform
Jacob Goldberger, Hayit K. Greenspan, Jeremie Dreyfuss, "Simplifying Mixture Models Using the Unscented Transform", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 30, no. 8, pp. 1496-1502, August 2008, doi:10.1109/TPAMI.2008.100