Issue No. 02 - Feb. 2015 (vol. 37)
pp. 212-229
Pierpaolo De Blasi , Department of Economics and Statistics, University of Torino, Torino, Italy
Stefano Favaro , Department of Economics and Statistics, University of Torino, Torino, Italy
Antonio Lijoi , Collegio Carlo Alberto, Moncalieri, Italy
Ramses H. Mena , Department of Probability and Statistics, Universidad Nacional Autónoma de México, México, México
Igor Prunster , Department of Economics and Statistics, University of Torino, Torino, Italy
Matteo Ruggiero , Department of Economics and Statistics, University of Torino, Torino, Italy
ABSTRACT
Discrete random probability measures and the exchangeable random partitions they induce are key tools for addressing a variety of estimation and prediction problems in Bayesian inference. Here we focus on the family of Gibbs–type priors, a recent elegant generalization of the Dirichlet and the Pitman–Yor process priors. These random probability measures share properties that are appealing both from a theoretical and an applied point of view: (i) they admit an intuitive predictive characterization justifying their use in terms of a precise assumption on the learning mechanism; (ii) they stand out in terms of mathematical tractability; (iii) they include several interesting special cases besides the Dirichlet and the Pitman–Yor processes. The goal of our paper is to provide a systematic and unified treatment of Gibbs–type priors and highlight their implications for Bayesian nonparametric inference. We deal with their distributional properties, the resulting estimators, frequentist asymptotic validation and the construction of time–dependent versions. Applications, mainly concerning mixture models and species sampling, serve to convey the main ideas. The intuition inherent to this class of priors and the neat results they lead to make one wonder whether it actually represents the most natural generalization of the Dirichlet process.
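To illustrate the predictive characterization in point (i), the following minimal sketch (not taken from the paper; the function name, defaults and parameter values are illustrative assumptions) simulates the sequential allocation rule of the Pitman–Yor process, one of the Gibbs–type priors treated in the paper: after n observations grouped into k clusters, the next observation joins an existing cluster of size n_j with probability proportional to n_j - sigma, or forms a new cluster with probability proportional to theta + k*sigma; setting sigma = 0 recovers the Dirichlet process.

import random

def pitman_yor_crp(n, sigma=0.5, theta=1.0, seed=None):
    """Sample a random partition of n items from the two-parameter
    (Pitman-Yor) Chinese restaurant process, a special case of a
    Gibbs-type prior; sigma=0 recovers the Dirichlet process."""
    rng = random.Random(seed)
    counts = []                      # cluster sizes n_1, ..., n_k
    labels = []                      # cluster label assigned to each item
    for i in range(n):               # i items have already been seated
        k = len(counts)
        # weight of opening a new cluster vs. joining each existing one
        weights = [c - sigma for c in counts] + [theta + k * sigma]
        j = rng.choices(range(k + 1), weights=weights)[0]
        if j == k:
            counts.append(1)         # open a new cluster
        else:
            counts[j] += 1           # join existing cluster j
        labels.append(j)
    return labels, counts

if __name__ == "__main__":
    labels, counts = pitman_yor_crp(100, sigma=0.25, theta=1.0, seed=42)
    print("number of clusters:", len(counts))
    print("cluster sizes:", sorted(counts, reverse=True))

Larger values of sigma make the probability of a new cluster grow with the number of distinct clusters already observed, which is the precise learning-mechanism assumption the abstract alludes to.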
INDEX TERMS
Bayes methods, Educational institutions, Analytical models, Q measurement, Learning systems, Proposals, Computational modeling, Stochastic processes, Nonparametric statistics, species sampling, Bayesian nonparametrics, clustering, consistency, dependent process, discrete nonparametric prior, exchangeable partition probability function, Gibbs–type prior, Pitman–Yor process, mixture model, population genetics, predictive distribution
CITATION
Pierpaolo De Blasi, Stefano Favaro, Antonio Lijoi, Ramses H. Mena, Igor Prunster, Matteo Ruggiero, "Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 37, no. 2, pp. 212-229, Feb. 2015, doi: 10.1109/TPAMI.2013.217