Streaming, distributed variational inference for Bayesian nonparametrics
This paper presents a methodology for creating streaming, distributed inference algorithms for Bayesian nonparametric (BNP) models. In the proposed framework, processing nodes receive a sequence of data minibatches, compute a variational posterior for each, and make asynchronous streaming updates to a central model.
Main Authors: | Campbell, Trevor David, Straub, Julian, Fisher, John W, How, Jonathan P |
---|---|
Other Authors: | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
Format: | Article |
Language: | en_US |
Published: | Neural Information Processing Systems Foundation, 2016 |
Online Access: | http://hdl.handle.net/1721.1/106134 https://orcid.org/0000-0003-1499-0191 https://orcid.org/0000-0003-2339-1262 https://orcid.org/0000-0003-4844-3495 https://orcid.org/0000-0001-8576-1930 |
_version_ | 1811073391144206336 |
---|---|
author | Campbell, Trevor David Straub, Julian Fisher, John W How, Jonathan P |
author2 | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |
author_facet | Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory Campbell, Trevor David Straub, Julian Fisher, John W How, Jonathan P |
author_sort | Campbell, Trevor David |
collection | MIT |
description | This paper presents a methodology for creating streaming, distributed inference algorithms for Bayesian nonparametric (BNP) models. In the proposed framework, processing nodes receive a sequence of data minibatches, compute a variational posterior for each, and make asynchronous streaming updates to a central model. In contrast to previous algorithms, the proposed framework is truly streaming, distributed, asynchronous, learning-rate-free, and truncation-free. The key challenge in developing the framework, arising from the fact that BNP models do not impose an inherent ordering on their components, is finding the correspondence between minibatch and central BNP posterior components before performing each update. To address this, the paper develops a combinatorial optimization problem over component correspondences and provides an efficient solution technique. The paper concludes with an application of the methodology to the DP mixture model, with experimental results demonstrating its practical scalability and performance. |
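The component-correspondence step described in the abstract can be illustrated with a small sketch. This is not the paper's actual algorithm: the real framework matches variational posterior components (and, being truncation-free, can also create new ones), whereas this toy assumes a fixed, equal number of components and a hypothetical cost given by squared distance between scalar component means, solved by brute-force assignment.

```python
# Illustrative only: match each minibatch component to a central component
# before merging, by minimizing a total matching cost. The cost function
# (squared distance between scalar means) and all names are assumptions
# for illustration, not taken from the paper.
from itertools import permutations

def match_components(central_means, minibatch_means):
    """Return a tuple `assign` where assign[i] is the index of the
    central component matched to minibatch component i, chosen to
    minimize the total squared distance between matched means."""
    def cost(perm):
        return sum((m - central_means[j]) ** 2
                   for m, j in zip(minibatch_means, perm))
    k = len(central_means)
    # Brute force over all k! correspondences; the paper instead gives
    # an efficient solution technique for its optimization problem.
    return min(permutations(range(k)), key=cost)

# Example: the minibatch components are a permuted, noisy copy of the
# central ones, so the matching recovers the permutation.
central = [0.0, 5.0, 10.0]
minibatch = [10.2, 0.1, 4.9]
print(match_components(central, minibatch))  # -> (2, 0, 1)
```

For small component counts this exhaustive search suffices; an assignment solver (e.g. the Hungarian algorithm) makes the same matching tractable at scale.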
first_indexed | 2024-09-23T09:32:21Z |
format | Article |
id | mit-1721.1/106134 |
institution | Massachusetts Institute of Technology |
language | en_US |
last_indexed | 2024-09-23T09:32:21Z |
publishDate | 2016 |
publisher | Neural Information Processing Systems Foundation |
record_format | dspace |
spelling | mit-1721.1/106134 2022-09-30T15:08:58Z Streaming, distributed variational inference for Bayesian nonparametrics Campbell, Trevor David Straub, Julian Fisher, John W How, Jonathan P Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory Massachusetts Institute of Technology. Department of Aeronautics and Astronautics Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science Campbell, Trevor David Straub, Julian Fisher, John W How, Jonathan P This paper presents a methodology for creating streaming, distributed inference algorithms for Bayesian nonparametric (BNP) models. In the proposed framework, processing nodes receive a sequence of data minibatches, compute a variational posterior for each, and make asynchronous streaming updates to a central model. In contrast to previous algorithms, the proposed framework is truly streaming, distributed, asynchronous, learning-rate-free, and truncation-free. The key challenge in developing the framework, arising from the fact that BNP models do not impose an inherent ordering on their components, is finding the correspondence between minibatch and central BNP posterior components before performing each update. To address this, the paper develops a combinatorial optimization problem over component correspondences and provides an efficient solution technique. The paper concludes with an application of the methodology to the DP mixture model, with experimental results demonstrating its practical scalability and performance. United States. Office of Naval Research. Multidisciplinary University Research Initiative (Grant N000141110688) 2016-12-22T21:23:37Z 2016-12-22T21:23:37Z 2015-12 Article http://purl.org/eprint/type/ConferencePaper 1049-5258 http://hdl.handle.net/1721.1/106134 Campbell, Trevor et al. "Streaming, Distributed Variational Inference for Bayesian Nonparametrics" Advances in Neural Information Processing Systems (NIPS 2015).
https://orcid.org/0000-0003-1499-0191 https://orcid.org/0000-0003-2339-1262 https://orcid.org/0000-0003-4844-3495 https://orcid.org/0000-0001-8576-1930 en_US https://papers.nips.cc/paper/5876-streaming-distributed-variational-inference-for-bayesian-nonparametrics Advances in Neural Information Processing Systems (NIPS 2015) Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. application/pdf Neural Information Processing Systems Foundation NIPS |
spellingShingle | Campbell, Trevor David Straub, Julian Fisher, John W How, Jonathan P Streaming, distributed variational inference for Bayesian nonparametrics |
title | Streaming, distributed variational inference for Bayesian nonparametrics |
title_full | Streaming, distributed variational inference for Bayesian nonparametrics |
title_fullStr | Streaming, distributed variational inference for Bayesian nonparametrics |
title_full_unstemmed | Streaming, distributed variational inference for Bayesian nonparametrics |
title_short | Streaming, distributed variational inference for Bayesian nonparametrics |
title_sort | streaming distributed variational inference for bayesian nonparametrics |
url | http://hdl.handle.net/1721.1/106134 https://orcid.org/0000-0003-1499-0191 https://orcid.org/0000-0003-2339-1262 https://orcid.org/0000-0003-4844-3495 https://orcid.org/0000-0001-8576-1930 |
work_keys_str_mv | AT campbelltrevordavid streamingdistributedvariationalinferenceforbayesiannonparametrics AT straubjulian streamingdistributedvariationalinferenceforbayesiannonparametrics AT fisherjohnw streamingdistributedvariationalinferenceforbayesiannonparametrics AT howjonathanp streamingdistributedvariationalinferenceforbayesiannonparametrics |