We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm selects the number of mixture components using the minimum description length (MDL) criterion. Our approach combines the strengths of genetic algorithms (GAs) and the EM algorithm in a single procedure: the population-based stochastic search of the GA explores the search space more thoroughly than the EM method alone, so the algorithm is less sensitive to its initialization and can escape locally optimal solutions. GA-EM is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments on simulated and real data show that GA-EM outperforms the EM method: 1) it achieves a better MDL score under exactly the same termination condition for both algorithms, and 2) it identifies the number of components used to generate the underlying data more often than the EM algorithm.
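To make the baseline concrete, the following is a minimal sketch (not the authors' GA-EM procedure) of plain EM for a univariate Gaussian mixture combined with MDL-based selection of the number of components; all function names, the random restarts, and the numerical safeguards are our own illustrative choices. The MDL score used here is the standard form: negative log-likelihood plus half the number of free parameters times the log of the sample size.

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=200, seed=0):
    """Fit a k-component univariate GMM with plain EM; return the final log-likelihood."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, k, replace=False).astype(float)   # means initialized from data points
    var = np.full(k, np.var(x))                          # variances initialized to overall variance
    w = np.full(k, 1.0 / k)                              # uniform mixture weights
    for _ in range(n_iter):
        # E-step: component densities and responsibilities
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0) + 1e-12
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1) + 1e-300).sum()

def mdl_score(ll, k, n):
    """MDL = -log-likelihood + (P/2) log n; a 1-D k-component GMM has P = 3k - 1 free parameters."""
    p = 3 * k - 1  # (k-1) weights + k means + k variances
    return -ll + 0.5 * p * np.log(n)

def select_k_by_mdl(x, k_max=5, n_restarts=5):
    """Fit GMMs for k = 1..k_max (best of several EM restarts each) and pick the minimum-MDL model."""
    scores = {}
    for k in range(1, k_max + 1):
        ll = max(em_gmm_1d(x, k, seed=s) for s in range(n_restarts))
        scores[k] = mdl_score(ll, k, len(x))
    return min(scores, key=scores.get), scores
```

This restart loop is exactly the kind of initialization sensitivity the paper targets: a single EM run from a poor start can merge components and mislead the MDL comparison, which is why GA-EM replaces independent restarts with a population-based search.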
Index Terms - Unsupervised learning, clustering, Gaussian mixture models, EM algorithm, genetic algorithm, minimum description length.

F. Pernkopf and D. Bouchaffra, "Genetic-Based EM Algorithm for Learning Gaussian Mixture Models," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 27, no. , pp. 1344-1348, 2005.