PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference
Generalized linear models (GLMs) - such as logistic regression, Poisson regression, and robust regression - provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy - including on an advertising data set with 40 million data points and 20,000 covariates.
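To make the quadratic-approximation case mentioned in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it fits a degree-2 polynomial to the logistic log-likelihood mapping phi(s) = -log(1 + exp(-s)) on an interval [-R, R] (a least-squares fit on Chebyshev nodes stands in for the paper's polynomial approximation), accumulates the resulting approximate sufficient statistics in a single pass over the data, and recovers an approximate MAP estimate under a Gaussian prior via one linear solve. The interval R, the prior variance, and all function names are illustrative assumptions.

```python
# Minimal sketch of a PASS-GLM-style quadratic approximation for Bayesian
# logistic regression with labels y in {-1, +1}. Illustrative only; R,
# prior_var, and the fitting recipe are assumptions, not the paper's exact choices.
import numpy as np

def quadratic_coeffs(R=4.0, n_nodes=100):
    """Fit phi(s) = -log(1 + exp(-s)) with a degree-2 polynomial on [-R, R]."""
    s = np.cos(np.pi * (np.arange(n_nodes) + 0.5) / n_nodes) * R  # Chebyshev nodes
    phi = -np.log1p(np.exp(-s))
    b2, b1, b0 = np.polyfit(s, phi, 2)            # coefficients, highest degree first
    return b0, b1, b2

def suff_stats(X, y):
    """One pass over a data shard; X is (n, d), y in {-1, +1}.
    The statistics are sums, so streaming/distributed merging is exact."""
    t1 = X.T @ y                                   # sum_n y_n x_n
    T2 = X.T @ X                                   # sum_n x_n x_n^T  (since y_n^2 = 1)
    return t1, T2

def approx_map(t1, T2, b1, b2, prior_var=10.0):
    """MAP of the approximate posterior under a N(0, prior_var * I) prior.
    The approximate log posterior is quadratic in theta, so the maximizer
    solves (I/prior_var - 2*b2*T2) theta = b1 * t1."""
    d = t1.shape[0]
    A = np.eye(d) / prior_var - 2.0 * b2 * T2      # b2 < 0, so A is positive definite
    return np.linalg.solve(A, b1 * t1)

# Usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
theta_true = rng.normal(size=5)
y = np.where(rng.random(10_000) < 1.0 / (1.0 + np.exp(-X @ theta_true)), 1.0, -1.0)

b0, b1, b2 = quadratic_coeffs()
t1, T2 = suff_stats(X, y)
print(approx_map(t1, T2, b1, b2))
```

Under these assumptions the per-shard statistics (t1, T2) are all that must be communicated, which is what allows the streaming and distributed extensions described in the abstract to avoid compounding approximation error.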
Main Authors: | Huggins, Jonathan H., Broderick, Tamara A |
---|---|
Other Authors: | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
Format: | Article |
Language: | English |
Published: | 2020 |
Online Access: | https://hdl.handle.net/1721.1/128777 |
_version_ | 1811095718281084928 |
---|---|
author | Huggins, Jonathan H. Broderick, Tamara A |
author2 | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |
author_facet | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science Huggins, Jonathan H. Broderick, Tamara A |
author_sort | Huggins, Jonathan H. |
collection | MIT |
description | Generalized linear models (GLMs) - such as logistic regression, Poisson regression, and robust regression - provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy - including on an advertising data set with 40 million data points and 20,000 covariates. |
first_indexed | 2024-09-23T16:25:51Z |
format | Article |
id | mit-1721.1/128777 |
institution | Massachusetts Institute of Technology |
language | English |
last_indexed | 2024-09-23T16:25:51Z |
publishDate | 2020 |
record_format | dspace |
spelling | mit-1721.1/128777 2022-09-29T19:49:42Z PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference Huggins, Jonathan H. Broderick, Tamara A Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory Generalized linear models (GLMs) - such as logistic regression, Poisson regression, and robust regression - provide interpretable models for diverse data types. Probabilistic approaches, particularly Bayesian ones, allow coherent estimates of uncertainty, incorporation of prior information, and sharing of power across experiments via hierarchical models. In practice, however, the approximate Bayesian methods necessary for inference have either failed to scale to large data sets or failed to provide theoretical guarantees on the quality of inference. We propose a new approach based on constructing polynomial approximate sufficient statistics for GLMs (PASS-GLM). We demonstrate that our method admits a simple algorithm as well as trivial streaming and distributed extensions that do not compound error across computations. We provide theoretical guarantees on the quality of point (MAP) estimates, the approximate posterior, and posterior mean and uncertainty estimates. We validate our approach empirically in the case of logistic regression using a quadratic approximation and show competitive performance with stochastic gradient descent, MCMC, and the Laplace approximation in terms of speed and multiple measures of accuracy - including on an advertising data set with 40 million data points and 20,000 covariates. United States. Office of Naval Research (Grant N00014-17-1-2072) United States. Office of Naval Research. Multidisciplinary University Research Initiative (Grant N00014-11-1-0688) 2020-12-10T18:13:44Z 2020-12-10T18:13:44Z 2017-12 2020-12-03T17:53:56Z Article http://purl.org/eprint/type/ConferencePaper 1049-5258 https://hdl.handle.net/1721.1/128777 Huggins, Jonathan H., Ryan P. Adams and Tamara Broderick. “PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference.” Advances in Neural Information Processing Systems, 2017-December (December 2017) © 2017 The Author(s) en https://papers.nips.cc/paper/2017/hash/07811dc6c422334ce36a09ff5cd6fe71-Abstract.html Advances in Neural Information Processing Systems Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. application/pdf Neural Information Processing Systems (NIPS) |
spellingShingle | Huggins, Jonathan H. Broderick, Tamara A PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title | PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title_full | PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title_fullStr | PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title_full_unstemmed | PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title_short | PASS-GLM: Polynomial approximate sufficient statistics for scalable Bayesian GLM inference |
title_sort | pass glm polynomial approximate sufficient statistics for scalable bayesian glm inference |
url | https://hdl.handle.net/1721.1/128777 |
work_keys_str_mv | AT hugginsjonathanh passglmpolynomialapproximatesufficientstatisticsforscalablebayesianglminference AT brodericktamaraa passglmpolynomialapproximatesufficientstatisticsforscalablebayesianglminference |