Stochastic Variational Inference

Author(s): Hoffman, Matthew D.; Blei, David M.; Wang, Chong; Paisley, John

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1zc0j
Abstract: We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can easily handle data sets of this size and outperforms traditional variational inference, which can only handle a smaller subset. (We also show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to massive data sets.
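The abstract describes the algorithm only at a high level; as a rough illustration of its core loop (subsample the data, compute an intermediate estimate of the global variational parameters as if the sampled point were replicated across the full data set, then take a decaying weighted step), the sketch below applies stochastic variational inference to a toy conjugate Gaussian-mean model. The model, variable names, and step-size constants (tau, kappa) are illustrative assumptions, not the paper's topic-model experiments:

import numpy as np

# Hypothetical toy model (not from the paper): x_i ~ N(mu, 1) with prior
# mu ~ N(0, 1). q(mu) is Gaussian, tracked by natural parameters (eta1, eta2).
rng = np.random.default_rng(0)
N = 10_000
data = rng.normal(2.0, 1.0, size=N)           # true mean = 2.0

eta1, eta2 = 0.0, -0.5                        # initialize q(mu) at the prior N(0, 1)
tau, kappa = 1.0, 0.7                         # assumed delay and forgetting rate

for t in range(1, 1001):
    x = data[rng.integers(N)]                 # subsample one data point
    # Intermediate global parameters: pretend x were replicated N times,
    # then apply the closed-form conjugate update.
    eta1_hat = 0.0 + N * x
    eta2_hat = -0.5 - N / 2.0
    rho = (tau + t) ** (-kappa)               # Robbins-Monro step size
    eta1 = (1 - rho) * eta1 + rho * eta1_hat  # noisy natural-gradient step
    eta2 = (1 - rho) * eta2 + rho * eta2_hat

mean = -eta1 / (2 * eta2)                     # recover q(mu)'s mean
var = -1.0 / (2 * eta2)                       # and variance
print(f"q(mu) approx. N({mean:.3f}, {var:.2e})")  # mean should be near 2.0

Because the step sizes satisfy the usual Robbins-Monro conditions, the averaged updates converge to the exact conjugate posterior here; in the paper the same scheme is applied to topic models, where the local step is itself a variational optimization rather than a closed-form lookup.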
Publication Date: 2013
Citation: Hoffman, Matthew D., David M. Blei, Chong Wang, and John Paisley. "Stochastic Variational Inference." The Journal of Machine Learning Research 14, no. 4 (2013): 1303-1347.
ISSN: 1532-4435 (print); 1533-7928 (electronic)
Pages: 1303-1347
Type of Material: Journal Article
Journal/Proceeding Title: Journal of Machine Learning Research
Version: Final published version. Article is made available in OAR by the publisher's permission or policy.

Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.