Weighted Graph Cuts without Eigenvectors: A Multilevel Approach
November 2007 (vol. 29, no. 11)
pp. 1944-1957
A variety of clustering algorithms have recently been proposed to handle data that is not linearly separable; spectral clustering and kernel k-means are two of the main methods. In this paper, we discuss an equivalence between the objective functions used in these seemingly different methods--in particular, a general weighted kernel k-means objective is mathematically equivalent to a weighted graph clustering objective. We exploit this equivalence to develop a fast, high-quality multilevel algorithm that directly optimizes various weighted graph clustering objectives, such as the popular ratio cut, normalized cut, and ratio association criteria. This eliminates the need for any eigenvector computation for graph clustering problems, which can be prohibitive for very large graphs. Previous multilevel graph partitioning methods, such as Metis, have suffered from the restriction of equal-sized clusters; our multilevel algorithm removes this restriction by using kernel k-means to optimize weighted graph cuts. Experimental results show that our multilevel algorithm outperforms a state-of-the-art spectral clustering algorithm in terms of speed, memory usage, and quality. We demonstrate that our algorithm is applicable to large-scale clustering tasks such as image segmentation, social network analysis and gene network analysis.
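The central computational idea in the abstract is that a weighted graph cut objective such as normalized cut can be minimized by running weighted kernel k-means on a suitably constructed kernel, with no eigenvector computation. The sketch below is a minimal NumPy illustration of that base-level iteration, not the authors' implementation: it assumes the normalized-cut construction (point weights equal to node degrees, kernel K = sigma*D^-1 + D^-1 A D^-1 with sigma chosen to keep K positive semidefinite) and omits the multilevel coarsening and refinement phases that the paper builds around this step. All function and variable names are illustrative.

```python
import numpy as np

def weighted_kernel_kmeans(K, w, k, n_iter=100, seed=0):
    """Weighted kernel k-means on a precomputed kernel matrix K (n x n)
    with per-point weights w (length n), producing k clusters.

    For a cluster c with member index set J and weight sum s_c = sum(w[J]),
    the squared kernel-space distance from point i to the weighted mean of c is
        K[i, i] - 2 * (K[i, J] @ w[J]) / s_c + (w[J] @ K[J, J] @ w[J]) / s_c**2,
    and each point is reassigned to its nearest mean until labels stop changing."""
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(k, size=n)              # random initial assignment
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.empty((n, k))
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size == 0:                 # re-seed an empty cluster
                members = rng.integers(n, size=1)
            wc = w[members]
            sc = wc.sum()
            cross = K[:, members] @ wc / sc                         # second term
            within = wc @ K[np.ix_(members, members)] @ wc / sc**2  # third term
            dist[:, c] = diag - 2.0 * cross + within
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

def normalized_cut_kernel(A, sigma=None):
    """Weights and kernel under which weighted kernel k-means minimizes the
    normalized cut of the graph with symmetric, nonnegative adjacency A:
    weights w_i = degree d_i and K = sigma * D^-1 + D^-1 A D^-1, with sigma
    large enough that K is positive semidefinite."""
    d = A.sum(axis=1)
    Dinv = 1.0 / d
    K = (Dinv[:, None] * A) * Dinv[None, :]       # D^-1 A D^-1
    if sigma is None:
        # simple (not the tightest) shift that guarantees positive semidefiniteness
        sigma = max(0.0, -np.linalg.eigvalsh(K).min()) * d.max()
    return K + sigma * np.diag(Dinv), d

# Toy usage: two 3-node cliques joined by one weak edge.
A = np.zeros((6, 6))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
np.fill_diagonal(A, 0.0)
A[2, 3] = A[3, 2] = 0.1
K, w = normalized_cut_kernel(A)
print(weighted_kernel_kmeans(K, w, k=2))          # should separate the two cliques
```

Note that each reassignment step uses only the kernel matrix and per-cluster weight sums, which is why the cut objective can be optimized without computing eigenvectors; in the paper this iteration serves as the refinement step of a coarsen-cluster-refine cycle so that it scales to very large graphs.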


Index Terms:
Clustering, Data Mining, Segmentation, Kernel k-means, Spectral Clustering, Graph Partitioning
Citation:
Inderjit S. Dhillon, Yuqiang Guan, Brian Kulis, "Weighted Graph Cuts without Eigenvectors: A Multilevel Approach," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 11, pp. 1944-1957, Nov. 2007, doi:10.1109/TPAMI.2007.1115