
Reducing reparameterization gradient variance

Author(s): Miller, AC; Foti, NJ; D'Amour, A; Adams, Ryan P

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1kb1s
Full metadata record
DC Field                 Value                                                                         Language
dc.contributor.author    Miller, AC                                                                    -
dc.contributor.author    Foti, NJ                                                                      -
dc.contributor.author    D'Amour, A                                                                    -
dc.contributor.author    Adams, Ryan P                                                                 -
dc.date.accessioned      2019-08-29T17:04:56Z                                                          -
dc.date.available        2019-08-29T17:04:56Z                                                          -
dc.date.issued           2017                                                                          en_US
dc.identifier.citation   Miller, AC, Foti, NJ, D'Amour, A, Adams, RP. (2017). Reducing reparameterization gradient variance. Advances in Neural Information Processing Systems, 2017-December, 3709-3719.   en_US
dc.identifier.uri        http://arks.princeton.edu/ark:/88435/pr1kb1s                                  -
dc.description.abstract  Optimization with noisy gradients has become ubiquitous in statistics and machine learning. Reparameterization gradients, or gradient estimates computed via the "reparameterization trick," represent a class of noisy gradients often used in Monte Carlo variational inference (MCVI). However, when these gradient estimators are too noisy, the optimization procedure can be slow or fail to converge. One way to reduce noise is to generate more samples for the gradient estimate, but this can be computationally expensive. Instead, we view the noisy gradient as a random variable, and form an inexpensive approximation of the generating procedure for the gradient sample. This approximation has high correlation with the noisy gradient by construction, making it a useful control variate for variance reduction. We demonstrate our approach on a non-conjugate hierarchical model and a Bayesian neural net, where our method attained orders-of-magnitude (20-2,000×) reductions in gradient variance, resulting in faster and more stable optimization.   en_US
dc.format.extent         3709 - 3719                                                                   en_US
dc.language.iso          en_US                                                                         en_US
dc.relation.ispartof     Advances in Neural Information Processing Systems                             en_US
dc.rights                Author's manuscript                                                          en_US
dc.title                 Reducing reparameterization gradient variance                                 en_US
dc.type                  Conference Article                                                            en_US
pu.type.symplectic       http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding   en_US
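The control-variate construction described in the abstract — approximate the procedure that generates each gradient sample with a cheap, highly correlated surrogate whose expectation is known in closed form, then subtract the surrogate and add back its mean — can be illustrated with a small sketch. The code below is an illustrative assumption, not the paper's implementation: it uses a one-dimensional Gaussian variational family, a toy integrand f(z) = exp(z), and a first-order Taylor surrogate of the gradient around the variational mean.

```python
import numpy as np

# Toy objective: f(z) = exp(z), so f'(z) = f''(z) = exp(z).
f_prime = np.exp
f_double_prime = np.exp

rng = np.random.default_rng(0)
mu, sigma = 0.0, 0.5
eps = rng.standard_normal(100_000)

# Reparameterization: z = mu + sigma * eps, so d/dmu E[f(z)] is
# estimated by samples of f'(mu + sigma * eps).
g = f_prime(mu + sigma * eps)

# Cheap surrogate: linearize the gradient sample around eps = 0.
# Its mean over eps is known exactly: E[g_tilde] = f'(mu).
g_tilde = f_prime(mu) + f_double_prime(mu) * sigma * eps

# Control-variate estimator: subtract the surrogate, add back its mean.
# Unbiased, but with much of the eps-driven noise cancelled.
g_cv = g - (g_tilde - f_prime(mu))

print("raw gradient variance:", g.var())
print("CV  gradient variance:", g_cv.var())  # substantially smaller
```

Both estimators target the same expectation; only the variance changes. The cancellation is tightest when the variational scale `sigma` is small, since the linear surrogate then tracks the true gradient sample closely — the same intuition behind the much larger (20-2,000×) reductions reported in the abstract.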

Files in This Item:
File                                                Description   Size       Format
Reducing reparameterization gradient variance.pdf   -             656.8 kB   Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.