Parallel Local Approximation MCMC for Expensive Models
Main Authors:
Other Authors:
Format: Article
Published: Society for Industrial & Applied Mathematics (SIAM), 2019
Online Access: http://hdl.handle.net/1721.1/120851, https://orcid.org/0000-0002-6882-305X, https://orcid.org/0000-0001-8242-3290
Summary: Performing Bayesian inference via Markov chain Monte Carlo (MCMC) can be exceedingly expensive when posterior evaluations invoke the evaluation of a computationally expensive model, such as a system of PDEs. In recent work [J. Amer. Statist. Assoc., 111 (2016), pp. 1591-1607] we described a framework for constructing and refining local approximations of such models during an MCMC simulation. These posterior-adapted approximations harness regularity of the model to reduce the computational cost of inference while preserving asymptotic exactness of the Markov chain. Here we describe two extensions of that work. First, we prove that samplers running in parallel can collaboratively construct a shared posterior approximation while ensuring ergodicity of each associated chain, providing a novel opportunity for exploiting parallel computation in MCMC. Second, focusing on the Metropolis-adjusted Langevin algorithm, we describe how a proposal distribution can successfully employ gradients and other relevant information extracted from the approximation. We investigate the practical performance of our approach using two challenging inference problems, the first in subsurface hydrology and the second in glaciology. Using local approximations constructed via parallel chains, we successfully reduce the run time needed to characterize the posterior distributions in these problems from days to hours and from months to days, respectively, dramatically improving the tractability of Bayesian inference.
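The summary describes gradient-informed (MALA) proposals driven by a posterior-adapted local approximation rather than the expensive model itself. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: it fits a local quadratic surrogate to nearby log-posterior evaluations and uses the surrogate's gradient in a MALA proposal. The function names, the quadratic fit, and the omission of the paper's on-the-fly refinement and cross-chain sharing are all simplifying assumptions.

```python
# Minimal sketch (assumed names, not the authors' implementation): a MALA step
# whose proposal uses the gradient of a local quadratic surrogate of the
# log-posterior, fitted to previously computed (expensive) evaluations.
import numpy as np


def fit_local_quadratic(X, y, x0):
    """Least-squares quadratic fit of the log-posterior near x0.

    X : (n, d) array of previously evaluated parameter points
    y : (n,)   array of the corresponding log-posterior values
    Returns the surrogate's value and gradient at x0.
    """
    Z = X - x0
    d = X.shape[1]
    # Design matrix: constant, linear, and upper-triangular quadratic terms.
    quad = np.hstack([Z[:, [i]] * Z[:, [j]] for i in range(d) for j in range(i, d)])
    A = np.hstack([np.ones((len(X), 1)), Z, quad])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0], coef[1:1 + d]  # surrogate value and gradient at x0


def mala_step(x, X, y, step, rng):
    """One Metropolis-adjusted Langevin step driven by the surrogate gradient.

    The paper additionally refines the approximation on the fly (triggering new
    model runs when needed) to keep the chain asymptotically exact; that
    refinement logic is omitted here.
    """
    val_x, grad_x = fit_local_quadratic(X, y, x)
    mean_fwd = x + 0.5 * step * grad_x                      # Langevin drift
    prop = mean_fwd + np.sqrt(step) * rng.standard_normal(x.shape)

    val_p, grad_p = fit_local_quadratic(X, y, prop)
    mean_bwd = prop + 0.5 * step * grad_p
    log_q_fwd = -np.sum((prop - mean_fwd) ** 2) / (2 * step)
    log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * step)

    # Metropolis-Hastings correction with the asymmetric Langevin proposal.
    log_alpha = (val_p - val_x) + (log_q_bwd - log_q_fwd)
    return prop if np.log(rng.uniform()) < log_alpha else x
```

In the parallel setting described above, each chain would draw on a shared, growing set of evaluation points (X, y), with refinement rules deciding when the expensive model is actually run; those mechanisms, and the ergodicity guarantees they require, are the subject of the paper itself.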