
Stochastic Variational Inference

Author(s): Hoffman, Matthew D; Blei, David M; Wang, Chong; Paisley, John

Full metadata record
DC Field | Value | Language
dc.contributor.author | Hoffman, Matthew D | -
dc.contributor.author | Blei, David M | -
dc.contributor.author | Wang, Chong | -
dc.contributor.author | Paisley, John | -
dc.identifier.citation | Hoffman, Matthew D., David M. Blei, Chong Wang, and John Paisley. "Stochastic Variational Inference." The Journal of Machine Learning Research 14, no. 4 (2013): 1303-1347. | en_US
dc.description.abstract | We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can easily handle data sets of this size and outperforms traditional variational inference, which can only handle a smaller subset. (We also show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to massive data sets. | en_US
dc.format.extent | 1303 - 1347 | en_US
dc.relation.ispartof | Journal of Machine Learning Research | en_US
dc.rights | Final published version. Article is made available in OAR by the publisher's permission or policy. | en_US
dc.title | Stochastic Variational Inference | en_US
dc.type | Journal Article | en_US
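The abstract's core idea — repeatedly update global variational parameters with noisy natural-gradient steps computed from minibatches — can be sketched on a toy conjugate model. This is a hedged illustration only: the normal-normal model, data sizes, and step-size constants below are my assumptions for a minimal runnable example, not the paper's topic-model experiments.

```python
import numpy as np

# Toy conjugate model (assumed for illustration): x_i ~ N(mu, 1) with a
# N(0, 1) prior on mu, so q(mu) is Gaussian and natural-gradient steps
# have closed form.
rng = np.random.default_rng(0)
N = 100_000
x = rng.normal(2.0, 1.0, size=N)          # synthetic observations

# Global variational parameters of q(mu) in natural form:
# lam[0] = precision, lam[1] = precision * mean.
prior = np.array([1.0, 0.0])              # the N(0, 1) prior
lam = prior.copy()

B, tau, kappa = 100, 1.0, 0.7             # minibatch size, delay, forgetting rate
for t in range(2000):
    batch = rng.choice(x, size=B, replace=False)
    # Intermediate estimate: pretend the minibatch were the whole data set
    # by scaling its sufficient statistics by N / B.
    lam_hat = prior + (N / B) * np.array([float(B), batch.sum()])
    rho = (t + tau) ** (-kappa)           # Robbins-Monro step size
    lam = (1.0 - rho) * lam + rho * lam_hat

post_mean = lam[1] / lam[0]               # approaches the exact posterior mean
```

Each iteration touches only B of the N data points, which is what lets the algorithm scale to corpora like the 3.8M Wikipedia articles mentioned in the abstract; the decreasing step sizes satisfy the usual Robbins-Monro conditions, so the iterates converge to the batch variational solution.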

Files in This Item:
File | Description | Size | Format
StochasticVariationalInference.pdf | | 388.45 kB | Adobe PDF

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.