Issue No. 06 - June (2010 vol. 32)
Iulian Pruteanu-Malinici, Duke University, Durham
Lu Ren, Duke University, Durham
John Paisley, Duke University, Durham
Eric Wang, Duke University, Durham
Lawrence Carin, Duke University, Durham
We consider the problem of inferring and modeling topics in a sequence of documents with known publication dates. The documents at a given time are each characterized by a topic, and the topics are drawn from a mixture model. The proposed model infers the change in the topic mixture weights as a function of time. The details of this general framework may take different forms, depending on the specifics of the model. For the examples considered here, we examine base measures constructed from independent multinomial-Dirichlet measures for representation of topic-dependent word counts. The form of the hierarchical model allows efficient variational Bayesian inference, of interest for large-scale problems. We demonstrate results and make comparisons to the model when the dynamic character is removed, and also compare to latent Dirichlet allocation (LDA) and Topics over Time (TOT). We consider a database of Neural Information Processing Systems papers, as well as the US Presidential State of the Union addresses from 1790 to 2008.
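To illustrate the generative structure the abstract describes, the following NumPy sketch draws topic-dependent word-count distributions from a Dirichlet base measure and lets the topic mixture weights drift over time. The random-walk dynamic on the logits, and all sizes (`V`, `K`, `T`, `N`), are illustrative assumptions, not the paper's actual prior or inference procedure (the paper uses variational Bayesian inference, which this sketch does not implement).

```python
import numpy as np

rng = np.random.default_rng(0)

V = 50    # vocabulary size (illustrative)
K = 5     # number of topics (illustrative)
T = 10    # number of time steps
N = 100   # words per document

# Topic-dependent word distributions, each drawn from a Dirichlet base measure.
topics = rng.dirichlet(np.full(V, 0.1), size=K)        # shape (K, V)

# Time-varying topic mixture weights: a simple random walk on logits,
# a simplified stand-in for a dynamic prior on the weights.
logits = np.zeros(K)
docs = []
for t in range(T):
    logits = logits + 0.5 * rng.normal(size=K)
    weights = np.exp(logits) / np.exp(logits).sum()    # mixture weights at time t
    z = rng.choice(K, p=weights)                       # one topic per document
    counts = rng.multinomial(N, topics[z])             # multinomial word counts
    docs.append(counts)

docs = np.array(docs)                                  # (T, V) word-count matrix
```

Each row of `docs` is a document's word-count vector; in the paper's setting, inference would recover the topics and the trajectory of the mixture weights from such counts.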
Hierarchical models, variational Bayes, Dirichlet process, text modeling.
J. Paisley, I. Pruteanu-Malinici, L. Ren, L. Carin and E. Wang, "Hierarchical Bayesian Modeling of Topics in Time-Stamped Documents," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 32, no. 6, pp. 996-1011, June 2010.