Convex and Semi-Nonnegative Matrix Factorizations
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 1, pp. 45-55, January 2010
Chris Ding, University of Texas at Arlington
Tao Li, Florida International University
Michael I. Jordan, University of California, Berkeley
ABSTRACT
We present several new variations on the theme of nonnegative matrix factorization (NMF). Considering factorizations of the form X = FG^T, we focus on algorithms in which G is constrained to be nonnegative while the data matrix X is allowed to have mixed signs, thus extending the applicable range of NMF methods. We also consider algorithms in which the basis vectors of F are constrained to be convex combinations of the data points; this restriction also yields a kernel extension of NMF. We provide algorithms for computing these new factorizations, together with supporting theoretical analysis. We further analyze the relationships between our algorithms and clustering algorithms, and consider the implications for the sparseness of solutions. Finally, we present experimental results that explore the properties of these new methods.
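As a concrete illustration of the semi-NMF variant sketched above, the following is a minimal Python sketch of the alternating updates the paper describes: F is solved by unconstrained least squares, and G is refined by a multiplicative rule built from the elementwise positive and negative parts of the relevant matrices, which keeps G nonnegative even when X has mixed signs. The function name, random initialization, and fixed iteration count are our own simplifications (the paper recommends initializing G from a k-means clustering); treat this as a sketch, not the authors' reference implementation.

import numpy as np

def semi_nmf(X, k, n_iter=200, eps=1e-9, seed=0):
    """Sketch of semi-NMF: X (p x n, mixed signs) ~ F @ G.T,
    with F (p x k) unconstrained and G (n x k) nonnegative."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    G = rng.random((n, k)) + 0.1            # strictly positive start
    pos = lambda A: (np.abs(A) + A) / 2.0   # elementwise positive part
    neg = lambda A: (np.abs(A) - A) / 2.0   # elementwise negative part
    for _ in range(n_iter):
        # F-step: unconstrained least squares, F = X G (G^T G)^{-1}.
        F = X @ G @ np.linalg.pinv(G.T @ G)
        # G-step: multiplicative update; numerator and denominator are
        # nonnegative by construction, so G stays nonnegative.
        XtF, FtF = X.T @ F, F.T @ F
        num = pos(XtF) + G @ neg(FtF)
        den = neg(XtF) + G @ pos(FtF)
        G *= np.sqrt(num / np.maximum(den, eps))
    return F, G

The rows of G can then be read as soft cluster indicators for the columns of X, which is the connection to clustering that the paper analyzes. Convex NMF further restricts F to XW with W nonnegative, so that each basis vector is a convex combination of the data points.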
INDEX TERMS
Nonnegative matrix factorization, singular value decomposition, clustering.
CITATION
Chris Ding, Tao Li, and Michael I. Jordan, "Convex and Semi-Nonnegative Matrix Factorizations," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 1, pp. 45-55, January 2010, doi:10.1109/TPAMI.2008.277