Distributed Bayesian learning with stochastic natural gradient expectation propagation and the posterior server
This paper makes two contributions to Bayesian machine learning algorithms. Firstly, we propose stochastic natural gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black box variational algorithm, in th...
Main Authors: Hasenclever, L; Webb, S; Lienart, T; Vollmer, S; Lakshminarayanan, B; Blundell, C; Teh, Y
Format: Journal article
Published: Journal of Machine Learning Research, 2017
Similar Items
- Bayesian learning via stochastic gradient Langevin dynamics
  by: Welling, M, et al. Published: (2011)
- Consistency and fluctuations for stochastic gradient Langevin dynamics
  by: Teh, YW, et al. Published: (2016)
- Exploration of the (non-)asymptotic bias and variance of stochastic gradient Langevin dynamics
  by: Vollmer, S, et al. Published: (2016)
- A unified stochastic gradient approach to designing Bayesian-optimal experiments
  by: Foster, A, et al. Published: (2020)
- Posterior consistency for Bayesian inverse problems through stability and regression results
  by: Vollmer, S. Published: (2013)