Bayesian Feature and Model Selection for Gaussian Mixture Models
June 2006 (vol. 28, no. 6)
pp. 1013-1018
Abstract:
We present a Bayesian method for mixture model training that jointly addresses the feature selection and model selection problems. The method integrates a mixture model formulation that accounts for the saliency of individual features with a Bayesian approach to mixture learning that can estimate the number of mixture components. The proposed learning algorithm follows the variational framework and simultaneously optimizes over the number of components, the saliency of the features, and the parameters of the mixture model. Experimental results on high-dimensional artificial and real data illustrate the effectiveness of the method.
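The authors' algorithm jointly estimates a saliency value per feature along with the mixture, which is not reproduced here. As a rough illustration of the variational model-selection half only, the sketch below uses scikit-learn's BayesianGaussianMixture (a standard variational Bayesian GMM, not the paper's method) to show how a Dirichlet-process prior over the mixing weights prunes excess components; the synthetic data layout, the 1e-2 weight threshold, and all parameter settings are illustrative assumptions, not values from the paper.

import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic data: 2 informative dimensions carrying 3 clusters,
# padded with 8 pure-noise dimensions (hypothetical setup).
rng = np.random.default_rng(0)
centers = np.array([[-4.0, 0.0], [0.0, 4.0], [4.0, 0.0]])
informative = np.vstack([rng.normal(c, 1.0, size=(200, 2)) for c in centers])
noise = rng.normal(0.0, 1.0, size=(600, 8))
X = np.hstack([informative, noise])

# Variational Bayesian GMM: deliberately over-provision the number of
# components and let the sparsity-inducing weight prior drive redundant
# ones toward zero weight, analogous in spirit to the paper's Bayesian
# treatment of model order.
vb_gmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1e-2,  # small value favors few components
    covariance_type="diag",
    max_iter=500,
    random_state=0,
)
vb_gmm.fit(X)

# Components retaining non-negligible weight indicate the effective order.
effective = int(np.sum(vb_gmm.weights_ > 1e-2))
print("mixture weights:", np.round(vb_gmm.weights_, 3))
print("effective number of components:", effective)

Diagonal covariances are chosen here because feature-saliency formulations typically model features independently given the component; the key difference is that the paper additionally learns a saliency weight per feature, which this off-the-shelf sketch does not.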

Index Terms:
Mixture models, feature selection, model selection, Bayesian approach, variational training.
Citation:
Constantinos Constantinopoulos, Michalis K. Titsias, Aristidis Likas, "Bayesian Feature and Model Selection for Gaussian Mixture Models," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 6, pp. 1013-1018, June 2006, doi:10.1109/TPAMI.2006.111