On Rival Penalization Controlled Competitive Learning for Clustering with Automatic Cluster Number Selection
November 2005 (vol. 17 no. 11)
pp. 1583-1588
The existing Rival Penalized Competitive Learning (RPCL) algorithm and its variants provide an attractive way to perform data clustering without knowing the exact number of clusters. However, their performance is sensitive to the preselection of the rival delearning rate. In this paper, we further investigate RPCL and present a mechanism to control the strength of rival penalization dynamically. Consequently, we propose the Rival Penalization Controlled Competitive Learning (RPCCL) algorithm and its stochastic version. In each of these algorithms, the selection of the delearning rate is circumvented using a novel technique. We compare the performance of RPCCL to RPCL in Gaussian mixture clustering and color image segmentation, respectively. The experiments have produced promising results.
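As background for the abstract above, the classic RPCL update of Xu, Krzyżak, and Oja [24] can be sketched as follows. This is a minimal illustration only, not the paper's RPCCL method: it uses plain nearest-center winner selection (omitting the frequency-sensitive "conscience" weighting of the full algorithm) and a fixed delearning rate `alpha_r`, which is precisely the hand-tuned parameter that RPCCL is designed to eliminate. All names and values are illustrative assumptions.

```python
import numpy as np

def rpcl_step(x, centers, alpha_c=0.05, alpha_r=0.002):
    """One simplified Rival Penalized Competitive Learning update.

    The winner (nearest center) moves toward the input x, while the
    rival (second-nearest center) is pushed away by a much smaller
    fixed "delearning" rate alpha_r. Over many inputs, redundant
    centers are driven away from the data, so extra clusters die out.
    """
    d = np.linalg.norm(centers - x, axis=1)   # distances to all centers
    winner, rival = np.argsort(d)[:2]         # nearest and second-nearest
    centers[winner] += alpha_c * (x - centers[winner])  # attract winner
    centers[rival] -= alpha_r * (x - centers[rival])    # repel (penalize) rival
    return centers
```

The sensitivity the abstract refers to comes from `alpha_r`: too large and genuine clusters get penalized away, too small and redundant centers survive. RPCCL replaces this fixed rate with a penalization strength computed dynamically from the current winner/rival geometry.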

[1] S.C. Ahalt, A.K. Krishnamurty, P. Chen, and D.E. Melton, “Competitive Learning Algorithms for Vector Quantization,” Neural Networks, vol. 3, pp. 277-291, 1990.
[2] H. Akaike, “Information Theory and an Extension of the Maximum Likelihood Principle,” Proc. Second Int'l Symp. Information Theory, pp. 267-281, 1973.
[3] H. Akaike, “A New Look at the Statistical Model Identification,” IEEE Trans. Automatic Control, pp. 716-723, 1974.
[4] H. Bozdogan, “Model Selection and Akaike's Information Criterion: The General Theory and Its Analytical Extensions,” Psychometrika, vol. 52, no. 3, pp. 345-370, 1987.
[5] J. Breckenridge, “Replicating Clustering Analysis: Method, Consistency and Validity,” Multivariate Behavioural Research, 1989.
[6] E.W. Forgy, “Cluster Analysis of Multivariate Data: Efficiency Versus Interpretability of Classifications,” Proc. Biometric Soc. Meetings, 1965.
[7] J. Fridlyand and S. Dudoit, “Applications of Resampling Methods to Estimate the Number of Clusters and to Improve the Accuracy of a Clustering Method,” Technical Report 600, Statistics Dept., UC Berkeley, Sept. 2001.
[8] B. Fritzke, “Growing Cell Structures— A Self-Organizing Network for Unsupervised and Supervised Learning,” Neural Networks, vol. 7, no. 9, pp. 1441-1460, 1994.
[9] R.M. Gray, “Vector Quantization,” IEEE ASSP Magazine, vol. 1, pp. 4-29, 1984.
[10] P. Guo, C.L. Philip Chen, and M.R. Lye, “Cluster Number Selection for a Small Set of Samples Using the Bayesian Ying-Yang Model,” IEEE Trans. Neural Networks, vol. 13, no. 3, pp. 757-763, 2002.
[11] M. Har-even and V.L. Brailovsky, “Probabilistic Validation Approach for Clustering,” Pattern Recognition Letters, vol. 16, pp. 1189-1196, 1995.
[12] T. Lange, M.L. Braun, V. Roth, and J.M. Buhmann, “Stability-Based Model Selection,” Proc. Neural Information Processing Systems, Paper ID: AA17, 2002.
[13] T. Kohonen, “Self-Organized Formation of Topologically Correct Feature Maps,” Biological Cybernetics, vol. 43, pp. 59-69, 1982.
[14] Y. Linde, A. Buzo, and R.M. Gray, “An Algorithm for Vector Quantizer Design,” IEEE Trans. Comm., pp. 84-95, 1980.
[15] J.B. MacQueen, “Some Methods for Classification and Analysis of Multivariate Observations,” Proc. Fifth Berkeley Symp. Math. Statistics and Probability, vol. 1, Berkeley, Calif.: Univ. of California Press, pp. 281-297, 1967.
[16] T.M. Martinetz and K.J. Schulten, “A ‘Neural-Gas’ Network Learns Topologies,” Artificial Neural Networks, pp. 397-402, 1991.
[17] V. Olman, D. Xu, and Y. Xu, “Cubic: Identification of Regulatory Binding Sites Through Data Clustering,” J. Bioinformatics and Computational Biology, vol. 1, no. 1, pp. 21-40, 2003.
[18] G. Schwarz, “Estimating the Dimension of a Model,” The Annals of Statistics, vol. 6, no. 2, pp. 461-464, 1978.
[19] R. Tibshirani, G. Walther, and T. Hastie, “Estimating the Number of Clusters via the Gap Statistic,” J. Royal Statistical Soc., Series B, vol. 63, pp. 411-423, 2001.
[20] R. Tibshirani, G. Walther, D. Botstein, and P. Brown, “Cluster Validation by Prediction Strength,” technical report, Statistics Dept., Stanford Univ., Sept. 2001.
[21] H.Q. Wang and D.S. Huang, “A Novel Clustering Analysis Based on PCA and SOMs for Gene Expression Patterns,” Lecture Notes in Computer Science, vol. 3174, pp. 476-481, Springer-Verlag, 2004.
[22] L. Xu, “How Many Clusters?: A Ying-Yang Machine Based Theory for a Classical Open Problem in Pattern Recognition,” Proc. IEEE Int'l Conf. Neural Networks, vol. 3, pp. 1546-1551, 1996.
[23] L. Xu, “Bayesian Ying-Yang Machine, Clustering and Number of Clusters,” Pattern Recognition Letters, vol. 18, nos. 11-13, pp. 1167-1178, 1997.
[24] L. Xu, A. Krzyżak, and E. Oja, “Rival Penalized Competitive Learning for Clustering Analysis, RBF Net, and Curve Detection,” IEEE Trans. Neural Networks, vol. 4, pp. 636-648, 1993.

Index Terms:
Rival Penalization Controlled Competitive Learning, stochastic RPCL, clustering, cluster number.
Yiu-ming Cheung, "On Rival Penalization Controlled Competitive Learning for Clustering with Automatic Cluster Number Selection," IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 11, pp. 1583-1588, Nov. 2005, doi:10.1109/TKDE.2005.184