Distributed Bayesian learning with stochastic natural gradient expectation propagation and the posterior server

Full description

This paper makes two contributions to Bayesian machine learning algorithms. Firstly, we propose stochastic natural gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black box variational algorithm, in that it does not require any simplifying assumptions on the distribution of interest, beyond the existence of some Monte Carlo sampler for estimating the moments of the EP tilted distributions. Further, as opposed to EP, which has no guarantee of convergence, SNEP can be shown to be convergent, even when using Monte Carlo moment estimates. Secondly, we propose a novel architecture for distributed Bayesian learning which we call the posterior server. The posterior server allows scalable and robust Bayesian learning in cases where a data set is stored in a distributed manner across a cluster, with each compute node containing a disjoint subset of data. An independent Monte Carlo sampler is run on each compute node, with direct access only to the local data subset, but which targets an approximation to the global posterior distribution given all data across the whole cluster. This is achieved by using a distributed asynchronous implementation of SNEP to pass messages across the cluster. We demonstrate SNEP and the posterior server on distributed Bayesian learning of logistic regression and neural networks.
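The description specifies the message-passing pattern but no implementation. Below is a minimal, synchronous Python sketch of that pattern for a toy one-dimensional Gaussian model: each worker holds a disjoint data shard and a Gaussian site approximation in natural parameters, the server's global approximation is the prior plus the sum of the sites, and each worker refines its site by Monte Carlo moment matching on its tilted distribution. All names and constants here are illustrative assumptions, and the damped moment-matching update is a simplified stand-in for SNEP's convergent natural-gradient update; the actual posterior server also runs its samplers asynchronously rather than in sweeps.

```python
# Toy, synchronous sketch of the posterior-server pattern (illustrative only,
# not the paper's implementation). Model: y_i ~ N(theta, sigma^2), prior
# theta ~ N(0, 1). Gaussians are stored as (precision, precision * mean).
import numpy as np

rng = np.random.default_rng(0)
sigma, theta_true = 2.0, 1.5
K, n_per = 4, 50                          # compute nodes, shard size
shards = [theta_true + sigma * rng.standard_normal(n_per) for _ in range(K)]

prior = np.array([1.0, 0.0])              # N(0, 1) in natural parameters
sites = [np.zeros(2) for _ in range(K)]   # one site approximation per node
damping, n_samples = 0.5, 4000

def global_nat():
    """Posterior server state: prior plus all site contributions."""
    return prior + sum(sites)

for sweep in range(20):
    for k in range(K):
        cavity = global_nat() - sites[k]          # message from the server
        c_var, c_mean = 1.0 / cavity[0], cavity[1] / cavity[0]
        # Estimate moments of the tilted distribution (cavity times local
        # likelihood) by self-normalised importance sampling, using the
        # cavity as the proposal -- the node's local "Monte Carlo sampler".
        theta = c_mean + np.sqrt(c_var) * rng.standard_normal(n_samples)
        logw = (-0.5 * ((shards[k][None, :] - theta[:, None]) / sigma) ** 2).sum(axis=1)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        m1 = np.sum(w * theta)                    # tilted mean
        var = max(np.sum(w * theta**2) - m1**2, 1e-8)
        tilted = np.array([1.0 / var, m1 / var])  # back to natural parameters
        # Damped site update; sending it back updates the global posterior.
        sites[k] = (1 - damping) * sites[k] + damping * (tilted - cavity)

g = global_nat()
print(f"approximate posterior: mean={g[1] / g[0]:.3f}, sd={np.sqrt(1.0 / g[0]):.3f}")
```

Because the shards are disjoint, the site factors multiply to the full likelihood, so the server's product of prior and sites approximates the global posterior even though no node ever sees another node's data.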

Bibliographic Details
Main Authors: Hasenclever, L; Webb, S; Lienart, T; Vollmer, S; Lakshminarayanan, B; Blundell, C; Teh, Y
Format: Journal article
Published: Journal of Machine Learning Research, 2017