Issue No. 04 - April (2012 vol. 34)
Ryan P. Browne , University of Guelph, Guelph
Paul D. McNicholas , University of Guelph, Guelph
Matthew D. Sparling , University of Guelph, Guelph
We introduce a mixture model whereby each mixture component is itself a mixture of a multivariate Gaussian distribution and a multivariate uniform distribution. Although this model could be used for model-based clustering (model-based unsupervised learning) or model-based classification (model-based semi-supervised learning), we focus on the more general model-based classification framework. In this setting, we fit our mixture models to data where some of the observations have known group memberships and the goal is to predict the memberships of observations with unknown labels. We also present a density estimation example. A generalized expectation-maximization algorithm is used to estimate the parameters and thereby give classifications in this mixture of mixtures model. To simplify the model and the associated parameter estimation, we suggest holding some parameters fixed—this leads to the introduction of more parsimonious models. A simulation study is performed to illustrate how the model allows for bursts of probability and locally higher tails. Two further simulation studies illustrate how the model performs on data simulated from multivariate Gaussian distributions and on data from multivariate t-distributions. This novel approach is also applied to real data and the performance of our approach under the various restrictions is discussed.
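The abstract does not give the component density explicitly, but the structure it describes, each mixture component being itself a convex combination of a multivariate Gaussian and a multivariate uniform, can be sketched as follows. This is a minimal illustration, not the paper's exact parameterization: the support of the uniform part (here assumed to be a hyper-rectangle) and the mixing weight `alpha` are hypothetical choices for exposition.

```python
import numpy as np
from scipy.stats import multivariate_normal

def component_density(x, alpha, mu, Sigma, lower, upper):
    """Density of one 'mixture of mixtures' component:
    alpha * N(x; mu, Sigma) + (1 - alpha) * Uniform(x; [lower, upper]).

    The uniform part contributes a flat 'burst of probability' on its
    support, giving the component locally heavier tails than a pure
    Gaussian. The hyper-rectangular support is an illustrative
    assumption, not taken from the paper.
    """
    gauss = multivariate_normal.pdf(x, mean=mu, cov=Sigma)
    inside = np.all((x >= lower) & (x <= upper))
    volume = np.prod(upper - lower)
    unif = (1.0 / volume) if inside else 0.0
    return alpha * gauss + (1.0 - alpha) * unif

# Example: a 2D component whose uniform part covers the unit square.
x = np.array([0.5, 0.5])
density = component_density(x, alpha=0.8,
                            mu=np.zeros(2), Sigma=np.eye(2),
                            lower=np.zeros(2), upper=np.ones(2))
```

In a full model-based classification fit, densities like this would appear inside the usual finite-mixture likelihood, with a generalized EM algorithm updating the component parameters and the inner mixing weights.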
Statistical computing, multivariate statistics.
R. P. Browne, P. D. McNicholas and M. D. Sparling, "Model-Based Learning Using a Mixture of Mixtures of Gaussian and Uniform Distributions," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 34, no. 4, pp. 814-817, 2012.