Central Subspace Dimensionality Reduction Using Covariance Operators
Issue No. 04, April 2011 (vol. 33), pp. 657-670
Minyoung Kim , Seoul National University of Science and Technology, Seoul
Vladimir Pavlovic , Rutgers University, Piscataway
ABSTRACT
We consider the task of dimensionality reduction informed by real-valued multivariate labels. The problem is often treated as Dimensionality Reduction for Regression (DRR), whose goal is to find a low-dimensional representation of the input data, the central subspace, that preserves the statistical correlation with the targets. A class of DRR methods exploits the notion of inverse regression (IR) to discover central subspaces. Whereas most existing IR techniques rely on explicit output-space slicing, we propose a novel method, Covariance Operator Inverse Regression (COIR), that generalizes IR to nonlinear input/output spaces without explicit target slicing. COIR's unique properties make DRR applicable to problem domains with high-dimensional output data corrupted by potentially significant amounts of noise. Unlike recent kernel dimensionality reduction methods that employ iterative nonconvex optimization, COIR yields a closed-form solution. We also establish the link between COIR, other DRR techniques, and popular supervised dimensionality reduction methods, including canonical correlation analysis and linear discriminant analysis. We then extend COIR to semi-supervised settings where many of the input points lack labels. We demonstrate the benefits of COIR on several important regression problems, in both fully supervised and semi-supervised settings.
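The page reproduces none of the paper's formulas, so the following is a minimal, hypothetical sketch of the slicing-free inverse-regression idea the abstract describes: estimate the inverse regression curve E[X|Y] by kernel ridge regression of the (whitened) inputs on the outputs, with no output-space slicing, and take the top principal directions of the fitted conditional means. This illustrates the general IR concept only, not the paper's covariance-operator estimator; the function names and the gamma and eps parameters are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def inverse_regression_directions(X, Y, d, gamma=1.0, eps=1e-3):
    """Sketch: slicing-free inverse regression for a central subspace.

    Fits E[X|Y] by kernel ridge regression of the whitened inputs on
    the outputs, then returns the top-d principal directions of the
    fitted conditional means, mapped back to input coordinates.
    Conceptual only -- not the paper's covariance-operator estimator.
    """
    n = X.shape[0]
    Xc = X - X.mean(axis=0)
    # Whiten inputs: inverse-regression directions are defined in the
    # metric of Cov(X), as in classical sliced inverse regression.
    C = np.cov(Xc, rowvar=False)
    w, U = np.linalg.eigh(C)
    Wh = U @ np.diag(w ** -0.5) @ U.T
    Z = Xc @ Wh
    # Kernel ridge fit of each whitened coordinate on Y; row i of M
    # approximates E[X | Y = y_i] without any output slicing.
    Ky = rbf_kernel(Y, Y, gamma)
    M = Ky @ np.linalg.solve(Ky + n * eps * np.eye(n), Z)
    # Principal directions of the estimated inverse-regression curve.
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    B = Wh @ Vt[:d].T                 # back to original coordinates
    return B / np.linalg.norm(B, axis=0)

# Toy check: a one-dimensional central subspace inside 5D inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.tanh(X @ np.array([1.0, -1.0, 0.0, 0.0, 0.0]))[:, None]
y = y + 0.1 * rng.normal(size=(200, 1))
B = inverse_regression_directions(X, y, d=1)
print(B.ravel())  # should align, up to sign, with (1, -1, 0, 0, 0)/sqrt(2)
```

The toy example at the bottom shows the intended use: the recovered direction should align with the true index vector because the regression depends on the inputs only through that one linear combination.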
INDEX TERMS
Dimensionality reduction, supervised learning, kernel methods, regression.
CITATION
Minyoung Kim, Vladimir Pavlovic, "Central Subspace Dimensionality Reduction Using Covariance Operators," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 4, pp. 657-670, April 2011, doi:10.1109/TPAMI.2010.111