ICA Mixture Models for Unsupervised Classification of Non-Gaussian Classes and Automatic Context Switching in Blind Signal Separation
October 2000 (vol. 22 no. 10)
pp. 1078-1089

Abstract—An unsupervised classification algorithm is derived by modeling observed data as a mixture of several mutually exclusive classes that are each described by linear combinations of independent, non-Gaussian densities. The algorithm estimates the density of each class and is able to model class distributions with non-Gaussian structure. The new algorithm can improve classification accuracy compared with standard Gaussian mixture models. When applied to blind source separation in nonstationary environments, the method can switch automatically between classes, which correspond to contexts with different mixing properties. The algorithm can learn efficient codes for images containing both natural scenes and text. This method shows promise for modeling non-Gaussian structure in high-dimensional data and has many potential applications.
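To make the abstract's core computation concrete: each class k is an ICA model with an unmixing matrix W_k and bias b_k, the class-conditional likelihood follows from the source prior and the change of variables, and the class posterior p(k|x) drives both classification and context switching. The following is a minimal illustrative sketch in Python/NumPy of this kind of scheme, assuming a Laplacian (super-Gaussian) source prior and a simple stochastic natural-gradient step; the variable names and toy data are hypothetical, not the authors' implementation.

```python
import numpy as np
from scipy.special import logsumexp

def class_log_likelihood(x, W, b):
    # log p(x | k) = log p(s) + log|det W_k|, with sources s = W_k (x - b_k).
    # Assumes a Laplacian source prior: log p(s) = -sum(|s|) + const.
    s = W @ (x - b)
    return -np.abs(s).sum() + np.linalg.slogdet(W)[1]

def class_posteriors(x, Ws, bs, log_priors):
    # Bayes' rule over the K class-conditional ICA models: p(k | x).
    log_joint = np.array([class_log_likelihood(x, W, b) + lp
                          for W, b, lp in zip(Ws, bs, log_priors)])
    return np.exp(log_joint - logsumexp(log_joint))

def adapt(x, Ws, bs, log_priors, lr=0.01):
    # One stochastic step: each W_k moves along the natural gradient of
    # log p(x | k), weighted by its posterior responsibility p(k | x),
    # so only the classes that explain x well adapt strongly.
    p = class_posteriors(x, Ws, bs, log_priors)
    for k in range(len(Ws)):
        s = Ws[k] @ (x - bs[k])
        phi = np.sign(s)  # score -d log p(s)/ds for the Laplacian prior
        Ws[k] += lr * p[k] * (np.eye(len(s)) - np.outer(phi, s)) @ Ws[k]
    return p

# Toy usage (hypothetical data): two 2D classes with distinct biases.
rng = np.random.default_rng(0)
Ws = [np.eye(2) + 0.1 * rng.standard_normal((2, 2)) for _ in range(2)]
bs = [np.zeros(2), 4.0 * np.ones(2)]
log_priors = np.log([0.5, 0.5])
for x in rng.laplace(size=(1000, 2)):
    adapt(x, Ws, bs, log_priors)
```

Classification then assigns x to the class maximizing p(k | x); in the blind-separation setting, the same posterior serves as the automatic switch between contexts with different mixing properties.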

Index Terms:
Unsupervised classification, Gaussian mixture model, independent component analysis, blind source separation, image coding, automatic context switching, maximum likelihood.
Citation:
Te-Won Lee, Michael S. Lewicki, Terrence J. Sejnowski, "ICA Mixture Models for Unsupervised Classification of Non-Gaussian Classes and Automatic Context Switching in Blind Signal Separation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1078-1089, Oct. 2000, doi:10.1109/34.879789