Unsupervised Selection of a Finite Dirichlet Mixture Model: An MML-Based Approach
August 2006 (vol. 18 no. 8)
pp. 993-1009
This paper proposes an unsupervised algorithm for learning a finite Dirichlet mixture model. An important part of the unsupervised learning problem is determining the number of clusters that best describes the data. We extend the minimum message length (MML) principle to determine the number of clusters in the case of Dirichlet mixtures. Parameter estimation is carried out by the expectation-maximization (EM) algorithm. The resulting method is validated on one-dimensional and multidimensional data. For one-dimensional data, the experiments concern artificial and real SAR image histograms; for multidimensional data, the validation involves synthetic data and two real applications: shadow detection in images and summarization of texture image databases for efficient retrieval. A comparison with the results obtained using other selection criteria is provided.
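The core selection idea the abstract describes — score each candidate number of clusters by the total message length (parameter cost plus data cost) and keep the shortest — can be illustrated with a minimal Python sketch. Everything here is an assumption for illustration only: the synthetic two-component data, the hand-fixed candidate parameters (the paper estimates them with EM), and the simplified `(Np/2)·log N` parameter cost, which stands in for the paper's full MML criterion with its prior and Fisher-information terms.

```python
import math
import random

def sample_dirichlet(alpha, rng):
    # Draw one point on the probability simplex from Dirichlet(alpha)
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [v / s for v in g]

def dirichlet_logpdf(x, alpha):
    # Log density of Dirichlet(alpha) at a point x on the simplex
    return (math.lgamma(sum(alpha))
            - sum(math.lgamma(a) for a in alpha)
            + sum((a - 1.0) * math.log(xi) for a, xi in zip(alpha, x)))

def mixture_loglik(data, weights, alphas):
    # Log-likelihood of the data under a finite Dirichlet mixture
    total = 0.0
    for x in data:
        total += math.log(sum(w * math.exp(dirichlet_logpdf(x, a))
                              for w, a in zip(weights, alphas)))
    return total

def message_length(data, weights, alphas):
    # Two-part code length in nats: parameter cost + data cost.
    # NOTE: the (Np / 2) * log N penalty is a simplified stand-in for the
    # paper's full MML expression (which adds prior and Fisher terms).
    n_params = (len(weights) - 1) + sum(len(a) for a in alphas)
    return (0.5 * n_params * math.log(len(data))
            - mixture_loglik(data, weights, alphas))

rng = random.Random(0)
# Synthetic data: two well-separated Dirichlet components on the 2-simplex
data = ([sample_dirichlet([10.0, 2.0, 2.0], rng) for _ in range(100)]
        + [sample_dirichlet([2.0, 2.0, 10.0], rng) for _ in range(100)])

# Candidate models, parameters fixed by hand for illustration:
# one component vs. the true two-component mixture.
ml_k1 = message_length(data, [1.0], [[3.0, 2.0, 3.0]])
ml_k2 = message_length(data, [0.5, 0.5], [[10.0, 2.0, 2.0], [2.0, 2.0, 10.0]])
print(ml_k1, ml_k2)  # the two-component model encodes the data more cheaply
```

Selecting the number of clusters then amounts to minimizing this message length over candidate values of K; the extra parameter cost of a larger K is paid only when the shorter encoding of the data justifies it.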


Index Terms:
Finite mixture models, Dirichlet mixture, EM, MML, SAR images, shadow modeling, texture summarization.
Citation:
Nizar Bouguila, Djemel Ziou, "Unsupervised Selection of a Finite Dirichlet Mixture Model: An MML-Based Approach," IEEE Transactions on Knowledge and Data Engineering, vol. 18, no. 8, pp. 993-1009, Aug. 2006, doi:10.1109/TKDE.2006.133