Issue No. 02 - February (2002 vol. 24)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.982897
<p><b>Abstract</b>—Clustering is one of the important topics in pattern recognition. Since only the structure of the data dictates the grouping (unsupervised learning), information theory is an obvious criterion to establish the clustering rule. This paper describes a novel valley-seeking clustering algorithm that uses an information theoretic measure to estimate the cost of partitioning the data set. The information theoretic criterion developed here evolved from a recently proposed Renyi's entropy estimator that has been successfully applied to other machine learning applications. An improved version of the k-change algorithm is used for optimization because of the stepwise nature of the cost function and the existence of local minima. Even when applied to nonlinearly separable data, the new algorithm performs well and is able to find nonlinear boundaries between clusters. The algorithm is also applied to the segmentation of magnetic resonance imaging (MRI) data with very promising results.</p>
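<p>The quadratic Renyi entropy estimator underlying such a cost can be sketched as follows. This is a minimal illustration assuming Gaussian Parzen windows; the function names, the bandwidth parameter <code>sigma</code>, and the specific cross-information-potential form are illustrative assumptions, not the paper's exact implementation:</p>

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy,
    H2(X) = -log( (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2) ),
    using isotropic Gaussian kernels (an assumed, common form)."""
    x = np.atleast_2d(x)                                # shape (N, d)
    d = x.shape[1]
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    var = 2.0 * sigma ** 2
    norm = (2.0 * np.pi * var) ** (-d / 2.0)
    ip = norm * np.mean(np.exp(-sq / (2.0 * var)))      # "information potential"
    return -np.log(ip)

def cross_information_potential(x, y, sigma=1.0):
    """Average Gaussian kernel evaluated between points of two clusters;
    lower values indicate better-separated clusters, so it can serve
    as a between-cluster term in a partitioning cost."""
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    d = x.shape[1]
    sq = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    var = 2.0 * sigma ** 2
    norm = (2.0 * np.pi * var) ** (-d / 2.0)
    return norm * np.mean(np.exp(-sq / (2.0 * var)))
```

<p>Because such a cost changes in discrete steps as points move between clusters, a combinatorial search such as the k-change reassignment loop mentioned in the abstract, rather than gradient descent, is the natural optimizer.</p>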
Information theory, clustering, MRI segmentation, entropy, optimization.
Jose C. Principe, Erhan Gokcay, "Information Theoretic Clustering", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 24, no. 2, pp. 158-171, February 2002, doi:10.1109/34.982897