
PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference

Author(s): Huggins, Jonathan; Adams, Ryan P; Broderick, Tamara

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1qz62
Full metadata record
DC Field | Value | Language
dc.contributor.author | Huggins, Jonathan | -
dc.contributor.author | Adams, Ryan P | -
dc.contributor.author | Broderick, Tamara | -
dc.date.accessioned | 2021-10-08T19:45:43Z | -
dc.date.available | 2021-10-08T19:45:43Z | -
dc.date.issued | 2017 | en_US
dc.identifier.citation | Huggins, Jonathan, Ryan P. Adams, and Tamara Broderick. "PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference." In Advances in Neural Information Processing Systems 30 (2017): pp. 3611-3621. | en_US
dc.identifier.issn | 1049-5258 | -
dc.identifier.uri | https://papers.nips.cc/paper/2017/hash/07811dc6c422334ce36a09ff5cd6fe71-Abstract.html | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1qz62 | -
dc.description.abstract | Generalized linear models (GLMs)—such as logistic regression, Poisson regression, and robust regression—provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy—including on an advertising data set with 40 million data points and 20,000 covariates. | en_US
dc.format.extent | 3611 - 3621 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Advances in Neural Information Processing Systems | en_US
dc.rights | Final published version. Article is made available in OAR by the publisher's permission or policy. | en_US
dc.title | PASS-GLM: polynomial approximate sufficient statistics for scalable Bayesian GLM inference | en_US
dc.type | Conference Article | en_US
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
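The abstract's core idea can be illustrated with a small sketch. For logistic regression with labels y in {-1, +1}, the log-likelihood is a sum of phi(y_n x_n^T theta) with phi(s) = -log(1 + exp(-s)); if phi is replaced by a degree-2 polynomial, the log-likelihood depends on the data only through simple sums, which can be accumulated in one streaming pass and merged across machines without compounding error. The code below is a minimal illustration under stated assumptions: it uses a plain least-squares quadratic fit on a grid as a stand-in for the paper's Chebyshev polynomial construction, and all function names, the interval radius R, and the Gaussian prior variance are illustrative choices, not the paper's.

```python
import numpy as np

def fit_quadratic(R=4.0, n_grid=200):
    """Degree-2 least-squares fit to phi(s) = -log(1 + e^{-s}) on [-R, R].
    (A simple stand-in for the paper's Chebyshev approximation.)"""
    s = np.linspace(-R, R, n_grid)
    phi = -np.log1p(np.exp(-s))
    b2, b1, b0 = np.polyfit(s, phi, 2)   # phi(s) ~ b2*s^2 + b1*s + b0
    return b0, b1, b2

def pass_glm_stats(X, y):
    """One pass over the data. Since y_n^2 = 1 for y in {-1, +1}, the
    quadratic model needs only t = sum_n y_n x_n and S = sum_n x_n x_n^T.
    Both are plain sums, so shards can be computed independently and added."""
    t = X.T @ y
    S = X.T @ X
    return t, S

def gaussian_posterior(t, S, b1, b2, prior_var=100.0):
    """With a Gaussian prior N(0, prior_var * I), the approximate log
    posterior is quadratic in theta, hence Gaussian:
    precision Lam = -2*b2*S + I/prior_var, mean = Lam^{-1} (b1 * t).
    Note b2 < 0 because phi is concave, so Lam is positive definite."""
    d = S.shape[0]
    Lam = -2.0 * b2 * S + np.eye(d) / prior_var
    mean = np.linalg.solve(Lam, b1 * t)
    return mean, Lam

# Tiny synthetic check: recover the sign pattern of a known weight vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
theta_true = np.array([1.5, -2.0, 0.5])
p = 1.0 / (1.0 + np.exp(-X @ theta_true))
y = np.where(rng.random(500) < p, 1.0, -1.0)

b0, b1, b2 = fit_quadratic()
t, S = pass_glm_stats(X, y)
mean, Lam = gaussian_posterior(t, S, b1, b2)
```

The approximate posterior mean here plays the role of the MAP estimate discussed in the abstract; streaming and distributed use amounts to summing the (t, S) statistics from each shard before the final solve.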

Files in This Item:
File | Description | Size | Format
ApproximateStatisticsBayesianGLMInference.pdf |  | 627.54 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.