Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations

We construct a new framework for accelerating Markov chain Monte Carlo in posterior sampling problems where standard methods are limited by the computational cost of the likelihood, or of numerical models embedded therein. Our approach introduces local approximations of these models into the Metropolis-Hastings kernel, borrowing ideas from deterministic approximation theory, optimization, and experimental design. Previous efforts at integrating approximate models into inference typically sacrifice either the sampler's exactness or efficiency; our work seeks to address these limitations by exploiting useful convergence characteristics of local approximations. We prove the ergodicity of our approximate Markov chain, showing that it samples asymptotically from the exact posterior distribution of interest. We describe variations of the algorithm that employ either local polynomial approximations or local Gaussian process regressors. Our theoretical results reinforce the key observation underlying this paper: when the likelihood has some local regularity, the number of model evaluations per MCMC step can be greatly reduced without biasing the Monte Carlo average. Numerical experiments demonstrate multiple order-of-magnitude reductions in the number of forward model evaluations used in representative ODE and PDE inference problems, with both synthetic and real data.
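
The abstract describes a general scheme: run Metropolis-Hastings against a cheap local approximation of an expensive likelihood or forward model, and refine that approximation with occasional true-model evaluations so that the chain remains asymptotically exact. The following minimal Python sketch illustrates that idea only; it is not the authors' implementation, the toy target and all names (expensive_log_post, local_surrogate, refine) are illustrative assumptions, and a decaying random refinement schedule stands in for the paper's error-driven experimental-design criteria.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_log_post(x):
    """Stand-in for a log-posterior whose evaluation requires an expensive model
    (here just a standard Gaussian so the sketch runs instantly)."""
    return -0.5 * float(x @ x)

# Design set: points where the true log-posterior has already been evaluated.
design_X = [rng.normal(size=2) for _ in range(5)]
design_y = [expensive_log_post(x) for x in design_X]

def local_surrogate(x, k=4):
    """Cheap local approximation: a linear least-squares fit through the k design
    points nearest to x."""
    X, y = np.array(design_X), np.array(design_y)
    idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    A = np.hstack([np.ones((k, 1)), X[idx]])            # basis [1, x1, x2]
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return float(coef[0] + coef[1:] @ x)

def refine(x):
    """Pay for one true model evaluation near the chain and grow the design set."""
    design_X.append(np.array(x, dtype=float))
    design_y.append(expensive_log_post(x))

x = np.zeros(2)
samples = []
for t in range(1, 5001):
    prop = x + 0.5 * rng.normal(size=2)
    # Refine with a slowly decaying probability so the approximation (and hence
    # the chain) improves over time; the paper instead draws on experimental
    # design and error indicators to decide when and where to refine.
    if rng.random() < 1.0 / np.sqrt(t):
        refine(prop)
    # Standard Metropolis-Hastings accept/reject, but using the cheap surrogate.
    if np.log(rng.random()) < local_surrogate(prop) - local_surrogate(x):
        x = prop
    samples.append(x.copy())

print("posterior mean estimate:", np.mean(samples, axis=0))
```

In the paper, the role of local_surrogate is played by local polynomial approximations or local Gaussian process regressors fit to nearby model evaluations, and refinement decisions are guided by experimental design rather than a fixed random schedule.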

Bibliographic Details
Main Authors: Conrad, Patrick R., Marzouk, Youssef M., Pillai, Natesh S., Smith, Aaron
Other Authors: Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Format: Article
Language: en_US
Published: American Statistical Association, 2015
Journal: Journal of the American Statistical Association
ISSN: 0162-1459, 1537-274X
DOI: http://dx.doi.org/10.1080/01621459.2015.1096787
Citation: Conrad, Patrick R., Youssef M. Marzouk, Natesh S. Pillai, and Aaron Smith. "Accelerating Asymptotically Exact MCMC for Computationally Intensive Models via Local Approximations." Journal of the American Statistical Association (October 21, 2015).
Funding: United States. Dept. of Energy. Office of Advanced Scientific Computing Research. Scientific Discovery through Advanced Computing Program (Award DE-SC0007099)
License: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
Online Access: http://hdl.handle.net/1721.1/99937
https://orcid.org/0000-0001-8242-3290