Scaling Multidimensional Inference for Structured Gaussian Processes
Issue No. 2 - Feb. 2015 (vol. 37)
pp. 424-436
Elad Gilboa, Preston M. Green Department of Electrical and Systems Engineering, Washington University in St. Louis, 14049 Agusta Dr., Chesterfield
Yunus Saatci, Department of Engineering, University of Cambridge, 47 Consort Avenue, Cambridge CB2 9AE, United Kingdom
John P. Cunningham, Department of Statistics, Columbia University, Room 1026 SSW, MC 4690, 1255 Amsterdam Ave, New York
ABSTRACT
Exact Gaussian process (GP) regression has ${\cal O}(N^{3})$ runtime for data size $N$, making it intractable for large $N$. Many algorithms for improving GP scaling approximate the covariance with lower-rank matrices. Other work has exploited structure inherent in particular covariance functions, including GPs with implied Markov structure, and inputs on a lattice (both enable ${\cal O}(N)$ or ${\cal O}(N \log N)$ runtime). However, these GP advances have not been well extended to the multidimensional input setting, despite the preponderance of multidimensional applications. This paper introduces and tests three novel extensions of structured GPs to multidimensional inputs, for models with additive and multiplicative kernels. First, we present a new method for inference in additive GPs, showing a novel connection between the classic backfitting method and the Bayesian framework. We extend this model using two advances: a variant of projection pursuit regression, and a Laplace approximation for non-Gaussian observations. Lastly, for multiplicative kernel structure, we present a novel method for GPs with inputs on a multidimensional grid. We illustrate the power of these three advances on several data sets, achieving performance equal to or very close to the naive GP at orders of magnitude less cost.
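For the grid-structured (multiplicative kernel) case, the computational idea exploited in this line of work is that a product kernel evaluated on a Cartesian grid is a Kronecker product of small per-dimension kernel matrices, so the ${\cal O}(N^{3})$ solve collapses to per-dimension eigendecompositions plus cheap Kronecker matrix-vector products. The sketch below illustrates that general idea in NumPy; it is not the authors' implementation, and the function names (rbf, kron_mvprod, grid_gp_fit) and parameter choices are illustrative assumptions.

```python
import numpy as np

def rbf(x, ell=1.0):
    """Squared-exponential kernel matrix for a 1-D array of grid coordinates."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def kron_mvprod(As, b):
    """Compute (A_1 kron ... kron A_D) @ b without forming the full matrix."""
    x = b
    for A in As:
        n = A.shape[1]
        X = x.reshape(n, -1)        # expose the axis this factor acts on
        x = (A @ X).T.reshape(-1)   # apply the factor, then rotate axes
    return x

def grid_gp_fit(axes, y, noise=0.1, ell=1.0):
    """Solve (K + noise^2 I) alpha = y for a product kernel on a Cartesian grid.

    axes : list of 1-D arrays, one per input dimension (the grid coordinates)
    y    : observations at every grid point, flattened in row-major order
    """
    Ks = [rbf(a, ell) for a in axes]
    eig = [np.linalg.eigh(K) for K in Ks]       # K_d = Q_d diag(lam_d) Q_d^T
    Qs = [Q for _, Q in eig]
    lam = eig[0][0]
    for l, _ in eig[1:]:
        lam = np.kron(lam, l)                   # eigenvalues of K_1 kron ... kron K_D
    alpha = kron_mvprod([Q.T for Q in Qs], y)   # rotate into the joint eigenbasis
    alpha = alpha / (lam + noise ** 2)          # diagonal solve
    return kron_mvprod(Qs, alpha)               # alpha = (K + noise^2 I)^{-1} y

# Usage: a 30 x 40 grid handled without ever building the 1200 x 1200 kernel.
x1, x2 = np.linspace(0, 1, 30), np.linspace(0, 2, 40)
f = np.sin(4 * x1)[:, None] + np.cos(3 * x2)[None, :]
y = (f + 0.1 * np.random.randn(*f.shape)).ravel()
alpha = grid_gp_fit([x1, x2], y)
mean = kron_mvprod([rbf(x1), rbf(x2)], alpha)   # posterior mean at the training grid
```

Under these assumptions the cost is dominated by the $D$ small eigendecompositions, roughly ${\cal O}(\sum_d G_d^{3})$ plus ${\cal O}(N \sum_d G_d)$ for the Kronecker matrix-vector products, rather than ${\cal O}(N^{3})$ for the naive solve with $N = \prod_d G_d$ grid points.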
INDEX TERMS
Kernel, Additives, Approximation methods, Gaussian processes, Markov processes, Vectors, Runtime, Kronecker matrices, backfitting, projection-pursuit regression
CITATION
Elad Gilboa, Yunus Saatci, John P. Cunningham, "Scaling Multidimensional Inference for Structured Gaussian Processes", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 37, no. 2, pp. 424-436, Feb. 2015, doi: 10.1109/TPAMI.2013.192