Issue No. 2 - Feb. 2013 (vol. 35)

ISSN: 0162-8828

pp: 398-410

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TPAMI.2012.96

F. Sánchez-Vega , Dept. of Appl. Math. & Stat., Johns Hopkins Univ., Baltimore, MD, USA

J. Eisner , Dept. of Comput. Sci., Johns Hopkins Univ., Baltimore, MD, USA

L. Younes , Dept. of Appl. Math. & Stat., Johns Hopkins Univ., Baltimore, MD, USA

D. Geman , Dept. of Appl. Math. & Stat., Johns Hopkins Univ., Baltimore, MD, USA

ABSTRACT

We present a new framework for learning high-dimensional multivariate probability distributions from estimated marginals. The approach is motivated by compositional models and Bayesian networks, and designed to adapt to small sample sizes. We start with a large, overlapping set of elementary statistical building blocks, or “primitives,” which are low-dimensional marginal distributions learned from data. Each variable may appear in many primitives. Subsets of primitives are combined in a Lego-like fashion to construct a probabilistic graphical model; only a small fraction of the primitives will participate in any valid construction. Since primitives can be precomputed, parameter estimation and structure search are separated. Model complexity is controlled by strong biases; we adapt the primitives to the amount of training data and impose rules which restrict how they may be merged into allowable compositions. The likelihood of the data decomposes into a sum of local gains, one for each primitive in the final structure. We focus on a specific subclass of networks which are binary forests. Structure optimization corresponds to an integer linear program, and the maximizing composition can be computed for reasonably large numbers of variables. Performance is evaluated using both synthetic data and real datasets from natural language processing and computational biology.
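To illustrate the idea of the abstract, the sketch below shows how a decomposable objective (a sum of per-primitive likelihood gains) can be maximized over compositions of precomputed primitives. This is a hypothetical toy, not the paper's algorithm: each primitive is a variable set with a precomputed gain, and compatibility is simplified to "chosen primitives share no variables" (the paper's composition rules for binary forests are richer, and the paper solves the search as an integer linear program rather than by brute force).

```python
# Toy sketch (hypothetical, not the paper's actual procedure): select a
# compatible subset of precomputed primitives maximizing the total
# log-likelihood gain over an independence baseline.
from itertools import combinations

def best_composition(primitives):
    """primitives: list of (frozenset_of_vars, gain).
    Brute-force search over subsets of pairwise-disjoint primitives;
    returns (best_total_gain, chosen_primitives)."""
    best = (0.0, [])
    for r in range(1, len(primitives) + 1):
        for combo in combinations(primitives, r):
            vars_seen = set()
            disjoint = True
            for var_set, _ in combo:
                if vars_seen & var_set:   # overlap -> incompatible here
                    disjoint = False
                    break
                vars_seen |= var_set
            if disjoint:
                total = sum(g for _, g in combo)
                if total > best[0]:
                    best = (total, list(combo))
    return best

# Four candidate primitives over variables x1..x4, with made-up gains.
prims = [
    (frozenset({"x1", "x2"}), 1.2),
    (frozenset({"x2", "x3"}), 1.5),
    (frozenset({"x3", "x4"}), 0.9),
    (frozenset({"x1", "x4"}), 0.4),
]
gain, chosen = best_composition(prims)
# The best compatible composition pairs {x1,x2} with {x3,x4}: 1.2 + 0.9 = 2.1.
```

Because the gains are local and precomputed, the search over structures never revisits parameter estimation; the real method exploits the same separation at scale via integer linear programming.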

INDEX TERMS

Bayesian methods, assembly, computational modeling, probability distributions, object-oriented modeling, connectors, joints, linear programming, graphs and networks, statistical models, machine learning

CITATION

F. Sánchez-Vega, J. Eisner, L. Younes, D. Geman, "Learning Multivariate Distributions by Competitive Assembly of Marginals", *IEEE Transactions on Pattern Analysis & Machine Intelligence*, vol. 35, no. 2, pp. 398-410, Feb. 2013, doi:10.1109/TPAMI.2012.96