
Bibliographic References  
Robert Jenssen, "Kernel Entropy Component Analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 5, pp. 847-860, May 2010.
Keywords: spectral data transformation, Rényi entropy, Parzen windowing, kernel PCA, clustering, pattern denoising.
[1] R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification. John Wiley & Sons, 2001.
[2] S. Theodoridis and K. Koutroumbas, Pattern Recognition. Academic Press, 1999.
[3] I.T. Jolliffe, Principal Component Analysis. Springer-Verlag, 1986.
[4] H. Hotelling, "Analysis of a Complex of Statistical Variables into Principal Components," J. Educational Psychology, vol. 24, pp. 417-441, 1933.
[5] B. Schölkopf, A.J. Smola, and K.R. Müller, "Nonlinear Component Analysis as a Kernel Eigenvalue Problem," Neural Computation, vol. 10, pp. 1299-1319, 1998.
[6] H. Zha, X. He, C. Ding, H. Simon, and M. Gu, "Spectral Relaxation for K-means Clustering," Advances in Neural Information Processing Systems 14, pp. 1057-1064, MIT Press, 2002.
[7] J. MacQueen, "Some Methods for Classification and Analysis of Multivariate Observations," Proc. Berkeley Symp. Math. Statistics and Probability, pp. 281-297, 1967.
[8] J.T. Kwok and I.W. Tsang, "The Pre-Image Problem in Kernel Methods," IEEE Trans. Neural Networks, vol. 15, no. 6, pp. 1517-1525, 2004.
[9] S. Mika, B. Schölkopf, A. Smola, K.R. Müller, M. Scholz, and G. Rätsch, "Kernel PCA and De-Noising in Feature Space," Advances in Neural Information Processing Systems 11, pp. 536-542, MIT Press, 1999.
[10] B. Schölkopf, S. Mika, C.J.C. Burges, P. Knirsch, K.R. Müller, G. Rätsch, and A.J. Smola, "Input Space versus Feature Space in Kernel-Based Methods," IEEE Trans. Neural Networks, vol. 10, no. 5, pp. 1000-1017, 1999.
[11] M.L. Braun, J.M. Buhmann, and K.R. Müller, "On Relevant Dimensions in Kernel Feature Spaces," J. Machine Learning Research, vol. 9, pp. 1875-1908, 2008.
[12] A.Y. Ng, M. Jordan, and Y. Weiss, "On Spectral Clustering: Analysis and an Algorithm," Advances in Neural Information Processing Systems 14, pp. 849-856, MIT Press, 2002.
[13] M. Belkin and P. Niyogi, "Laplacian Eigenmaps for Dimensionality Reduction and Data Representation," Neural Computation, vol. 15, pp. 1373-1396, 2003.
[14] J. Shi and J. Malik, "Normalized Cuts and Image Segmentation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 8, pp. 888-905, Aug. 2000.
[15] S. Roweis and L. Saul, "Nonlinear Dimensionality Reduction by Locally Linear Embedding," Science, vol. 290, pp. 2323-2326, 2000.
[16] J. Tenenbaum, V. de Silva, and J.C. Langford, "A Global Geometric Framework for Nonlinear Dimensionality Reduction," Science, vol. 290, pp. 2319-2323, 2000.
[17] K.Q. Weinberger and L.K. Saul, "Unsupervised Learning of Image Manifolds by Semidefinite Programming," Int'l J. Computer Vision, vol. 70, no. 1, pp. 77-90, 2006.
[18] L.K. Saul, K.Q. Weinberger, J.H. Ham, F. Sha, and D.D. Lee, "Spectral Methods for Dimensionality Reduction," Semi-Supervised Learning, O. Chapelle, B. Schölkopf, and A. Zien, eds., chapter 1, MIT Press, 2005.
[19] C.J.C. Burges, "Geometric Methods for Feature Extraction and Dimensional Reduction," Data Mining and Knowledge Discovery Handbook: A Complete Guide for Researchers and Practitioners, O. Maimon and L. Rokach, eds., chapter 4, Kluwer Academic Publishers, 2005.
[20] R. Jenssen, T. Eltoft, M. Girolami, and D. Erdogmus, "Kernel Maximum Entropy Data Transformation and an Enhanced Spectral Clustering Algorithm," Advances in Neural Information Processing Systems 19, pp. 633-640, MIT Press, 2007.
[21] R. Jenssen and O.K. Storås, "Kernel ECA Pre-Images for Pattern Denoising," Proc. Scandinavian Conf. Image Analysis, June 2009.
[22] J. Shawe-Taylor and N. Cristianini, Kernel Methods for Pattern Analysis. Cambridge Univ. Press, 2004.
[23] J. Mercer, "Functions of Positive and Negative Type and Their Connection with the Theory of Integral Equations," Philosophical Trans. Royal Soc. London, vol. A, pp. 415-446, 1909.
[24] K.R. Müller, S. Mika, G. Rätsch, K. Tsuda, and B. Schölkopf, "An Introduction to Kernel-Based Learning Algorithms," IEEE Trans. Neural Networks, vol. 12, no. 2, pp. 181-201, Mar. 2001.
[25] C.K.I. Williams, "On a Connection between Kernel PCA and Metric Multidimensional Scaling," Machine Learning, vol. 46, pp. 11-19, 2002.
[26] A. Rényi, "On Measures of Entropy and Information," Selected Papers of Alfréd Rényi, vol. 2, pp. 565-580, Akademiai Kiado, 1976.
[27] E. Parzen, "On the Estimation of a Probability Density Function and the Mode," The Annals of Math. Statistics, vol. 32, pp. 1065-1076, 1962.
[28] B.W. Silverman, Density Estimation for Statistics and Data Analysis. Chapman and Hall, 1986.
[29] M. Girolami, "Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem," Neural Computation, vol. 14, no. 3, pp. 669-688, 2002.
[30] R. Murphy and D. Ada, "UCI Repository of Machine Learning Databases," technical report, Dept. of Computer Science, Univ. of California, Irvine, 1994.
[31] R. Jenssen and T. Eltoft, "A New Information Theoretic Analysis of Sum-of-Squared-Error Kernel Clustering," Neurocomputing, vol. 72, nos. 1-3, pp. 23-31, 2008.