# Variational Inference in Nonconjugate Models

## Author(s): Wang, Chong; Blei, David M.

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1dc1f
## Abstract

Mean-field variational methods are widely used for approximate posterior inference in many probabilistic models. In a typical application, mean-field methods approximately compute the posterior with a coordinate-ascent optimization algorithm. When the model is conditionally conjugate, the coordinate updates are easily derived and in closed form. However, many models of interest, like the correlated topic model and Bayesian logistic regression, are nonconjugate. In these models, mean-field methods cannot be directly applied, and practitioners have had to develop variational algorithms on a case-by-case basis. In this paper, we develop two generic methods for nonconjugate models: Laplace variational inference and delta method variational inference. Our methods have several advantages: they allow for easily derived variational algorithms with a wide class of nonconjugate models; they extend and unify some of the existing algorithms that have been derived for specific models; and they work well on real-world data sets. We studied our methods on the correlated topic model, Bayesian logistic regression, and hierarchical Bayesian logistic regression.

## Metadata

- Publication Date: 2013
- Citation: Wang, Chong, and David M. Blei. "Variational Inference in Nonconjugate Models." Journal of Machine Learning Research 14 (2013): 1005-1031.
- ISSN: 1532-4435 (print); 1533-7928 (online)
- Pages: 1005-1031
- Type of Material: Journal Article
- Journal/Proceeding Title: Journal of Machine Learning Research
- Version: Final published version. Article is made available in OAR by the publisher's permission or policy.
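The Laplace variational inference developed in the paper builds on the classical Laplace approximation: fit a Gaussian to the posterior by centering it at the MAP estimate, with covariance equal to the inverse Hessian of the negative log joint at that mode. As a minimal, self-contained sketch of that underlying idea (not the paper's full algorithm), the following applies a basic Laplace approximation to Bayesian logistic regression with a standard-normal prior on the weights; the synthetic data and all names here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data for a toy Bayesian logistic regression (assumption:
# standard-normal prior on the weights; not the paper's exact setup).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([1.5, -2.0])
y = (rng.random(100) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

def neg_log_joint(w):
    # -log p(y, w) up to a constant:
    # 0.5 * w^T w  (prior)  +  sum log(1 + e^{x_i^T w}) - y_i x_i^T w  (likelihood)
    logits = X @ w
    return 0.5 * w @ w + np.sum(np.logaddexp(0.0, logits) - y * logits)

def hessian(w):
    # Hessian of the negative log joint: I + X^T diag(s * (1 - s)) X,
    # where s is the vector of predicted probabilities.
    s = 1.0 / (1.0 + np.exp(-(X @ w)))
    return np.eye(2) + X.T @ (X * (s * (1.0 - s))[:, None])

# Laplace approximation: Gaussian q(w) = N(mu, Sigma) centered at the MAP,
# with covariance given by the inverse Hessian at the mode.
res = minimize(neg_log_joint, np.zeros(2), method="BFGS")
mu = res.x
Sigma = np.linalg.inv(hessian(mu))
```

In the conditionally conjugate setting the analogous variational updates come out in closed form; the appeal of the Laplace-style construction above is that it only needs the gradient and Hessian of the log joint, which is what makes it generic across nonconjugate models.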