ABSTRACT
Despite its sensitivity to initialization, the Expectation-Maximization (EM) algorithm is widely used for estimating the parameters of finite mixture models. Most popular model-based clustering techniques can yield poor clusters if the parameters are not initialized properly. To reduce this sensitivity to the choice of initial points, a novel algorithm for learning mixture models from multivariate data is introduced in this paper. The proposed algorithm takes advantage of TRUST-TECH (TRansformation Under STability-reTaining Equilibria CHaracterization) to compute neighborhood local maxima on the likelihood surface using stability regions. In essence, our method combines the advantages of the traditional EM algorithm with the dynamic and geometric characteristics of the stability regions of the nonlinear dynamical system corresponding to the log-likelihood function. Two phases, namely the EM phase and the stability-region phase, are repeated alternately in the parameter space to achieve improvements in the maximum likelihood. The EM phase obtains a local maximum of the likelihood function, and the stability-region phase helps to escape from that local maximum by moving toward neighboring stability regions. The algorithm has been tested on both synthetic and real datasets, and the improvements in performance compared to other approaches are demonstrated. The robustness with respect to initialization is also illustrated experimentally.
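The two-phase alternation described above can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): the EM phase is handled here by scikit-learn's GaussianMixture, while the stability-region phase is replaced by a simple random perturbation of the converged means that merely stands in for the TRUST-TECH exit-point search. The names em_phase, stability_region_phase, trust_tech_em_sketch and the step and n_directions parameters are illustrative assumptions, not part of the paper.

import numpy as np
from sklearn.mixture import GaussianMixture

def em_phase(X, means_init, n_components, seed=0):
    """EM phase: run EM to a local maximum of the log-likelihood from the given initial means."""
    gmm = GaussianMixture(n_components=n_components, means_init=means_init,
                          random_state=seed)
    gmm.fit(X)
    return gmm

def stability_region_phase(X, gmm, step=0.5, n_directions=8, rng=None):
    """Stand-in for the TRUST-TECH stability-region phase: probe several directions
    around the current local maximum and return candidate re-initializations.
    (The paper instead uses exit points of the stability regions of a nonlinear
    dynamical system derived from the log-likelihood.)"""
    rng = np.random.default_rng(rng)
    candidates = []
    for _ in range(n_directions):
        direction = rng.normal(size=gmm.means_.shape)
        direction /= np.linalg.norm(direction)
        candidates.append(gmm.means_ + step * direction)
    return candidates

def trust_tech_em_sketch(X, n_components, n_rounds=3, seed=0):
    """Alternate the EM phase and the (stand-in) stability-region phase,
    keeping the solution with the highest converged log-likelihood bound."""
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), n_components, replace=False)]
    best = em_phase(X, means, n_components, seed)
    for _ in range(n_rounds):
        improved = False
        for init in stability_region_phase(X, best, rng=rng):
            candidate = em_phase(X, init, n_components, seed)
            if candidate.lower_bound_ > best.lower_bound_:
                best, improved = candidate, True
        if not improved:
            break
    return best

The design point the sketch tries to convey is the alternation itself: each EM run ends at a local maximum, and the intermediate phase supplies new starting points in neighboring basins so that subsequent EM runs can reach higher-likelihood solutions.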
INDEX TERMS
expectation maximization, unsupervised learning, finite mixture models, dynamical systems, stability regions, model-based clustering.
CITATION
Bala Rajaratnam, Chandan K. Reddy, and Hsiao-Dong Chiang, "TRUST-TECH-Based Expectation Maximization for Learning Finite Mixture Models," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 30, pp. 1146-1157, July 2008, doi:10.1109/TPAMI.2007.70775