Issue No. 3, March 2002 (vol. 24)
pp. 381-396
ABSTRACT
This paper proposes an unsupervised algorithm for learning a finite mixture model from multivariate data. The adjective "unsupervised" is justified by two properties of the algorithm: 1) it is capable of selecting the number of components and 2) unlike the standard expectation-maximization (EM) algorithm, it does not require careful initialization. The proposed method also avoids another drawback of EM for mixture fitting: the possibility of convergence toward a singular estimate at the boundary of the parameter space. The novelty of our approach is that we do not use a model selection criterion to choose one among a set of preestimated candidate models; instead, we seamlessly integrate estimation and model selection in a single algorithm. Our technique can be applied to any type of parametric mixture model for which it is possible to write an EM algorithm; in this paper, we illustrate it with experiments involving Gaussian mixtures. These experiments testify to the good performance of our approach.
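To make the idea concrete, here is a minimal sketch of EM for a one-dimensional Gaussian mixture that starts with more components than needed and annihilates components whose mixing weight becomes negligible. The pruning threshold `w_min` and the simple weight-based annihilation rule are illustrative stand-ins for the paper's MML-derived component-annihilation update, not the authors' exact algorithm.

```python
import numpy as np

def em_gmm_with_pruning(x, k_init=5, w_min=0.02, n_iter=200, seed=0):
    """EM for a 1-D Gaussian mixture that discards weak components.

    Starts overparameterized (k_init components) and prunes any component
    whose mixing weight falls below w_min; this integrates a crude form of
    model selection into the estimation loop.
    """
    rng = np.random.default_rng(seed)
    n = x.size
    # Initialize means at random data points; shared broad variance.
    mu = rng.choice(x, size=k_init, replace=False)
    var = np.full(k_init, x.var())
    w = np.full(k_init, 1.0 / k_init)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to w_k * N(x_i; mu_k, var_k).
        d = (x[:, None] - mu[None, :]) ** 2
        logp = -0.5 * (np.log(2 * np.pi * var) + d / var) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reestimate weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk + 1e-6
        # Component annihilation: drop components with negligible weight.
        keep = w > w_min
        if not keep.all():
            mu, var, w = mu[keep], var[keep], w[keep]
            w /= w.sum()
    return w, mu, var

# Two well-separated clusters; the fit should need far fewer than k_init components.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3.0, 1.0, 400), rng.normal(3.0, 1.0, 400)])
w, mu, var = em_gmm_with_pruning(x)
```

Because estimation and pruning happen inside the same loop, no outer search over candidate model orders is required, which is the key practical point of the paper's approach.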
INDEX TERMS
finite mixtures, unsupervised learning, model selection, minimum message length criterion, Bayesian methods, expectation-maximization algorithm, clustering
CITATION
M. A. T. Figueiredo, A. K. Jain, "Unsupervised Learning of Finite Mixture Models", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 24, no. 3, pp. 381-396, March 2002, doi:10.1109/34.990138