Hierarchical Growing Cell Structures: TreeGCS
March/April 2001 (vol. 13 no. 2)
pp. 207-218

Abstract—We propose a hierarchical clustering algorithm (TreeGCS) based upon the Growing Cell Structure (GCS) neural network of Fritzke. Our algorithm refines and builds upon the GCS base, overcoming an inconsistency in the original GCS algorithm, where the network topology is susceptible to the ordering of the input vectors. Our algorithm is unsupervised, flexible, and dynamic, and it imposes no additional parameters on the underlying GCS algorithm. Our ultimate aim is a hierarchical clustering neural network that is both consistent and stable and that identifies the innate hierarchical structure present in vector-based data. We demonstrate improved stability of the GCS foundation and evaluate our algorithm against the hierarchy generated by an ascendant hierarchical clustering dendrogram. Our approach emulates the hierarchical clustering of the dendrogram. It demonstrates the importance of the parameter settings for GCS and how they affect the stability of the clustering.
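To make the GCS foundation concrete, the following is a minimal illustrative sketch of a Growing-Cell-Structures-style training loop in the spirit of Fritzke's algorithm: adapt the best-matching cell and its graph neighbours toward each input, accumulate a local error counter, and periodically insert a new cell between the highest-error cell and its most distant neighbour. All names and parameter values (`eps_b`, `eps_n`, `lam`, `max_cells`) are illustrative assumptions, and the sketch deliberately omits parts of the full algorithm (the simplex-preserving topology maintenance, cell deletion, and the TreeGCS hierarchy itself); it is not the authors' implementation.

```python
import numpy as np

def gcs_sketch(data, max_cells=10, lam=100, eps_b=0.06, eps_n=0.002, seed=0):
    """Simplified GCS-style loop (illustrative assumption, not the paper's code).

    Simplifications vs. full GCS: new cells are linked only to the two
    cells they are inserted between, and cell deletion is omitted.
    """
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    # Start with a 2-simplex (triangle) of cells, as in GCS.
    cells = [rng.standard_normal(dim) for _ in range(3)]
    edges = {(0, 1), (1, 2), (0, 2)}          # undirected, stored (min, max)
    error = [0.0, 0.0, 0.0]                   # per-cell accumulated error

    for step in range(1, lam * (max_cells - 3) + 1):
        x = data[rng.integers(len(data))]     # random input vector
        d = [np.linalg.norm(x - c) for c in cells]
        b = int(np.argmin(d))                 # best-matching cell
        error[b] += d[b] ** 2                 # accumulate local error
        cells[b] += eps_b * (x - cells[b])    # adapt winner...
        for i, j in edges:                    # ...and its neighbours
            if b in (i, j):
                n = j if i == b else i
                cells[n] += eps_n * (x - cells[n])
        if step % lam == 0 and len(cells) < max_cells:
            q = int(np.argmax(error))         # highest-error cell
            nbrs = [j if i == q else i for i, j in edges if q in (i, j)]
            f = max(nbrs, key=lambda n: np.linalg.norm(cells[q] - cells[n]))
            cells.append(0.5 * (cells[q] + cells[f]))  # insert midway
            k = len(cells) - 1
            edges -= {(min(q, f), max(q, f))}
            edges |= {(min(q, k), max(q, k)), (min(f, k), max(f, k))}
            error.append(0.0)
            error[q] *= 0.5                   # redistribute local error
            error[f] *= 0.5
    return np.array(cells), edges
```

Because the inputs are drawn at random inside the loop, a sketch like this also makes the paper's motivating inconsistency easy to reproduce: feeding the same vectors in a different order yields a different final topology.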

[1] J. Bruske and G. Sommer, “Dynamic Cell Structure Learns Perfectly Topology Preserving Map,” Neural Computation, vol. 7, no. 4, 1995.
[2] V. Burzevski and C.K. Mohan, “Hierarchical Growing Cell Structures,” technical report, Syracuse Univ., Syracuse, N.Y., 1996, an abbreviated version appears in ICNN '96: Proc. Int'l Conf. Neural Networks, June 1996.
[3] B.S. Everitt, Cluster Analysis. Edward Arnold, 1993.
[4] B. Fritzke, “Kohonen Feature Maps and Growing Cell Structures—A Performance Comparison,” Advances in Neural Information Processing Systems 5 (NIPS '92), C.L. Giles, S.J. Hanson, and J.D. Cowan, eds., 1993.
[5] B. Fritzke, “Growing Cell Structures—A Self-Organizing Network for Unsupervised and Supervised Learning,” Technical Report, TR-93-026, Int'l Computer Science Inst., Berkeley, Calif., 1993.
[6] B. Fritzke, “A Growing Neural Gas Network Learns Topologies,” Advances in Neural Information Processing Systems 7, G. Tesauro, D.S. Touretzky, and T.K. Leen, eds., 1995.
[7] A.K. Jain and R.C. Dubes, Algorithms for Clustering Data. Englewood Cliffs, N.J.: Prentice Hall, 1988.
[8] K. Kaufman and R. Michalski, “A Method for Reasoning with Structured and Continuous Attributes in the INLEN-2 Knowledge Discovery System,” Proc. Second Int'l Conf. Knowledge Discovery and Data Mining (KDD-96), Aug. 1996.
[9] T. Kohonen, Self-Organizing Maps. Berlin: Springer-Verlag, 1995.
[10] M. Kubat, I. Bratko, and R. Michalski, “A Review of Machine Learning Methods,” Machine Learning and Data Mining: Methods and Applications, R.S. Michalski, I. Bratko, and M. Kubat, eds., pp. 3–69, 1998.
[11] P. Mangiameli, S.K. Chen, and D. West, “A Comparison of SOM Neural Network and Hierarchical Clustering Methods,” European J. Operational Research, vol. 93, pp. 402–417, 1996.
[12] M. Köhle and D. Merkl, “Visualizing Similarities in High Dimensional Input Spaces with a Growing and Splitting Neural Network,” Proc. Sixth Int'l Conf. Artificial Neural Networks (ICANN '96), 1996.
[13] R. Miikkulainen, “Script-Based Inference and Memory Retrieval in Subsymbolic Story Processing,” Applied Intelligence, vol. 5, pp. 137–163, 1995.
[14] D. Opitz and R. Maclin, “Popular Ensemble Methods: An Empirical Study,” J. Artificial Intelligence Research, vol. 11, pp. 169–198, 1999.
[15] H.-H. Song and S.-W. Lee, “A Self-Organizing Neural Tree for Large Set Pattern Classification,” IEEE Trans. Neural Networks, vol. 9, no. 3, May 1998.
[16] R. Weiss, B. Vélez, M.A. Sheldon, C. Namprempre, P. Szilagyi, A. Duda, and D.A. Gifford, “HyPursuit: A Hierarchical Network Search Engine that Exploits Content-Link Hypertext Clustering,” Proc. Seventh ACM Conf. Hypertext, Mar. 1996.
[17] World FactBook, 1997.

Index Terms:
Unsupervised, growing, neural network, hierarchical, cluster, topology.
Victoria J. Hodge, Jim Austin, "Hierarchical Growing Cell Structures: TreeGCS," IEEE Transactions on Knowledge and Data Engineering, vol. 13, no. 2, pp. 207-218, March-April 2001, doi:10.1109/69.917561