Variational Bayesian Inference with Stochastic Search

Author(s): Paisley, John; Blei, David M; Jordan, Michael I

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1j52z
Abstract: Mean-field variational inference is a method for approximate Bayesian posterior inference. It approximates a full posterior distribution with a factorized set of distributions by maximizing a lower bound on the marginal likelihood. This requires the ability to integrate a sum of terms in the log joint likelihood using this factorized distribution. Often, not all of these integrals are available in closed form, which is typically handled by introducing a further lower bound. We present an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound. This method uses control variates to reduce the variance of the stochastic search gradient, in which existing lower bounds can play an important role. We demonstrate the approach on two non-conjugate models: logistic regression and an approximation to the hierarchical Dirichlet process (HDP).
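The following is a minimal illustrative sketch (not the authors' code) of the idea the abstract describes: estimating the gradient of the variational lower bound by Monte Carlo using the identity grad_psi E_q[f(w)] = E_q[f(w) grad_psi log q(w)], and reducing its variance with a control variate. It assumes Bayesian logistic regression with a mean-field Gaussian q(w) = N(mu, diag(sigma^2)); the control variate used here is the score function with a fitted per-dimension coefficient, a simplification of the bound-based control variates proposed in the paper. All names (elbo_grad_mu, mu, sigma, the toy data) are illustrative assumptions.

```python
# Sketch of a stochastic-search (score-function) gradient of the ELBO
# with a simple control variate, for Bayesian logistic regression.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N points, D features, binary labels.
N, D = 200, 3
X = rng.normal(size=(N, D))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(N) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

def log_joint(w):
    """log p(y, w): Bernoulli likelihood plus a N(0, I) prior on w."""
    logits = X @ w
    ll = np.sum(y * logits - np.logaddexp(0.0, logits))
    lp = -0.5 * np.sum(w ** 2)
    return ll + lp

def elbo_grad_mu(mu, sigma, S=100):
    """Monte Carlo gradient of the ELBO w.r.t. mu via
    grad_mu E_q[f(w)] = E_q[f(w) * grad_mu log q(w)],
    where f(w) = log p(y, w) - log q(w).  The raw estimator has high
    variance; we subtract a * grad_mu log q(w) (zero mean under q) with a
    per-dimension coefficient a chosen to roughly minimize the variance."""
    eps = rng.normal(size=(S, D))
    w = mu + sigma * eps                       # samples from q
    score = eps / sigma                        # grad_mu log q(w) per sample
    f = np.array([log_joint(wi) for wi in w])
    logq = -0.5 * np.sum(eps ** 2, axis=1) - np.sum(np.log(sigma))  # const dropped
    f = f - logq                               # integrand of the ELBO
    g = f[:, None] * score                     # naive estimator terms
    # Control variate: a = Cov(g, score) / Var(score), per dimension.
    a = (np.mean(g * score, axis=0) - np.mean(g, axis=0) * np.mean(score, axis=0)) \
        / (np.var(score, axis=0) + 1e-12)
    return np.mean(g - a * score, axis=0)

# Stochastic-gradient ascent on mu (sigma held fixed here for brevity).
mu, sigma = np.zeros(D), np.full(D, 0.5)
for t in range(500):
    mu += 0.01 * elbo_grad_mu(mu, sigma)
print("approximate posterior mean:", mu)
```

Without the control variate the same estimator is unbiased but typically far noisier; the paper's contribution is to build control variates from existing analytic lower bounds on the intractable terms, which correlate strongly with the integrand and so remove much more variance than the generic score-function control variate sketched above.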
Publication Date: Jun-2012
Citation: Paisley, John, David M. Blei, and Michael I. Jordan. "Variational Bayesian Inference with Stochastic Search." Proceedings of the 29th International Conference on Machine Learning (2012): pp. 1363–1370.
Pages: 1363–1370
Type of Material: Conference Article
Series/Report no.: ICML '12
Journal/Proceeding Title: Proceedings of the 29th International Conference on Machine Learning
Version: Author's manuscript



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.