Nonparametric variational inference

Author(s): Gershman, Samuel; Hoffman, Matt; Blei, David

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr16491
Abstract: Variational methods are widely used for approximate posterior inference. However, their use is typically limited to families of distributions that enjoy particular conjugacy properties. To circumvent this limitation, we propose a family of variational approximations inspired by nonparametric kernel density estimation. The locations of these kernels and their bandwidth are treated as variational parameters and optimized to improve an approximate lower bound on the marginal likelihood of the data. Using multiple kernels allows the approximation to capture multiple modes of the posterior, unlike most other variational approximations. We demonstrate the efficacy of the nonparametric approximation with a hierarchical logistic regression model and a nonlinear matrix factorization model. We obtain predictive performance as good as or better than more specialized variational methods and sample-based approximations. The method is easy to apply to more general graphical models for which standard variational methods are difficult to derive.
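To make the idea in the abstract concrete, below is a minimal sketch, not the paper's algorithm: the variational family is an equally weighted mixture of N isotropic Gaussian kernels whose locations and shared log-bandwidth are the variational parameters, fit by stochastic gradient descent on a Monte Carlo estimate of the negative ELBO (the paper instead optimizes a deterministic approximate lower bound). The toy bimodal target `log_joint`, the parameter shapes, and the step size are all illustrative assumptions.

```python
# Sketch of a nonparametric (mixture-of-kernels) variational approximation.
# Assumptions: 2-D latent space, 5 kernels, a toy bimodal target, and a
# Monte Carlo ELBO in place of the paper's approximate analytic bound.
import jax
import jax.numpy as jnp
from jax.scipy.special import logsumexp
from jax.scipy.stats import multivariate_normal as mvn

def log_joint(z):
    # Toy unnormalized log posterior: an equal mixture of two 2-D Gaussians,
    # chosen only so the target has two modes.
    lp1 = mvn.logpdf(z, jnp.array([-2.0, 0.0]), jnp.eye(2))
    lp2 = mvn.logpdf(z, jnp.array([2.0, 0.0]), jnp.eye(2))
    return jnp.logaddexp(lp1, lp2) - jnp.log(2.0)

def log_q(z, mus, log_bw):
    # log q(z) for q = (1/N) * sum_n Normal(z; mu_n, sigma^2 I),
    # with kernel locations mus and a shared bandwidth sigma = exp(log_bw).
    sigma2 = jnp.exp(2.0 * log_bw)
    comp = jax.vmap(lambda m: mvn.logpdf(z, m, sigma2 * jnp.eye(2)))(mus)
    return logsumexp(comp) - jnp.log(1.0 * mus.shape[0])

def neg_elbo(params, key, num_samples=64):
    # Unbiased Monte Carlo estimate of -ELBO via reparameterized sampling:
    # draw a component index uniformly (weights are fixed and equal), then
    # add scaled Gaussian noise to that kernel's location.
    mus, log_bw = params
    n, d = mus.shape
    k1, k2 = jax.random.split(key)
    idx = jax.random.randint(k1, (num_samples,), 0, n)
    eps = jax.random.normal(k2, (num_samples, d))
    z = mus[idx] + jnp.exp(log_bw) * eps
    elbo = jnp.mean(jax.vmap(log_joint)(z)
                    - jax.vmap(lambda s: log_q(s, mus, log_bw))(z))
    return -elbo

key = jax.random.PRNGKey(0)
mus0 = jax.random.normal(key, (5, 2))   # 5 kernel locations in 2 dimensions
params = (mus0, jnp.array(0.0))         # (locations, shared log-bandwidth)
grad_fn = jax.jit(jax.grad(neg_elbo))
for _ in range(1000):
    key, sub = jax.random.split(key)
    grads = grad_fn(params, sub)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
```

Because the mixture weights are fixed and equal, gradient steps are free to move each kernel toward a different region of high posterior mass, which is how a multi-kernel approximation can capture several modes where a single-Gaussian variational family cannot.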
Publication Date: 2012
Citation: Gershman, S., Hoffman, M., & Blei, D. (2012). Nonparametric variational inference. In International Conference on Machine Learning.
Type of Material: Conference Article
Journal/Proceeding Title: International Conference on Machine Learning
Version: Final published version. This is an open access article.



Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.