Issue No. 12, December 2006 (vol. 28)
pp. 1948-1959
ABSTRACT
Gaussian process classifiers (GPCs) are Bayesian probabilistic kernel classifiers. In GPCs, the probability of belonging to a certain class at an input location is monotonically related to the value of some latent function at that location. Starting from a Gaussian process prior over this latent function, data are used to infer both the posterior over the latent function and the values of the hyperparameters that determine various aspects of the function. Recently, the expectation propagation (EP) approach has been proposed to infer the posterior over the latent function. Based on this work, we present an approximate EM algorithm, the EM-EP algorithm, to learn both the latent function and the hyperparameters. This algorithm is found to converge in practice and provides an efficient Bayesian framework for learning hyperparameters of the kernel. A multiclass extension of the EM-EP algorithm for GPCs is also derived. In experiments, the EM-EP algorithms perform as well as or better than other GPC methods and than Support Vector Machines (SVMs) tuned with cross-validation.
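
For a concrete sense of Bayesian GP classification with learned kernel hyperparameters, the short Python sketch below uses scikit-learn's GaussianProcessClassifier on a toy dataset. This is not the authors' EM-EP implementation: scikit-learn approximates the posterior over the latent function with a Laplace approximation rather than EP, and it learns the kernel hyperparameters by gradient-based maximization of the approximate marginal likelihood rather than by an EM loop. The dataset, kernel choice, and optimizer settings are illustrative assumptions; only the overall structure (approximate the latent-function posterior, then learn kernel hyperparameters from an approximate marginal likelihood) mirrors what the abstract describes.

# Illustrative sketch only: scikit-learn's GaussianProcessClassifier uses a
# Laplace approximation (not EP) to the posterior over the latent function,
# and learns kernel hyperparameters by maximizing the approximate marginal
# likelihood -- analogous in spirit to, but not the same as, the EM-EP
# algorithm of the paper.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from sklearn.model_selection import train_test_split

# Toy binary classification data (assumed for illustration).
X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Squared-exponential (RBF) kernel with a signal-variance factor; both the
# length-scale and the variance are hyperparameters to be learned from data.
kernel = ConstantKernel(1.0, (1e-2, 1e2)) * RBF(length_scale=1.0,
                                                length_scale_bounds=(1e-2, 1e2))

gpc = GaussianProcessClassifier(kernel=kernel, n_restarts_optimizer=5,
                                random_state=0)
gpc.fit(X_train, y_train)

print("Learned kernel:", gpc.kernel_)                      # optimized hyperparameters
print("Log marginal likelihood:", gpc.log_marginal_likelihood_value_)
print("Test accuracy:", gpc.score(X_test, y_test))
print("Class probabilities (first 5 points):")
print(gpc.predict_proba(X_test[:5]))

The learned kernel printed by the script shows the hyperparameter values selected by maximizing the approximate marginal likelihood, which plays the role that the M-step of the EM-EP algorithm plays in the paper.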
INDEX TERMS
Gaussian process classification, Bayesian methods, kernel methods, expectation propagation, EM-EP algorithm.
CITATION
Hyun-Chul Kim, Zoubin Ghahramani, "Bayesian Gaussian Process Classification with the EM-EP Algorithm", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol.28, no. 12, pp. 1948-1959, December 2006, doi:10.1109/TPAMI.2006.238
