Pseudo-marginal Hamiltonian Monte Carlo

Bayesian inference in the presence of an intractable likelihood function is computationally challenging. When following a Markov chain Monte Carlo (MCMC) approach to approximate the posterior distribution in this context, one typically either uses MCMC schemes that target the joint posterior of the parameters and some auxiliary latent variables, or pseudo-marginal Metropolis-Hastings (MH) schemes. The latter mimic an MH algorithm targeting the marginal posterior of the parameters by unbiasedly approximating the intractable likelihood. However, in scenarios where the parameters and auxiliary variables are strongly correlated under the posterior and/or this posterior is multimodal, Gibbs sampling or Hamiltonian Monte Carlo (HMC) will perform poorly, and the pseudo-marginal MH algorithm, like any other MH scheme, will be inefficient for high-dimensional parameters. We propose here an original MCMC algorithm, termed pseudo-marginal HMC (PM-HMC), which combines the advantages of both HMC and pseudo-marginal schemes. Specifically, PM-HMC is controlled by a precision parameter N governing the accuracy of the likelihood approximation and, for any N, it samples the marginal posterior of the parameters. Additionally, as N tends to infinity, its sample trajectories and acceptance probability converge to those of an ideal, but intractable, HMC algorithm which would have access to the intractable likelihood and its gradient. We demonstrate through experiments that PM-HMC can significantly outperform both standard HMC and pseudo-marginal MH schemes.
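To make the pseudo-marginal mechanism described above concrete, the following is a minimal sketch of a pseudo-marginal Metropolis-Hastings step (the baseline scheme the abstract contrasts with), not the PM-HMC algorithm itself. The toy latent-variable model, the standard normal prior, and all function names are illustrative assumptions; the essential ingredients are the unbiased likelihood estimate averaged over N latent draws and the reuse of the stored estimate for the current state.

import numpy as np

def unbiased_likelihood_estimate(theta, y, N, rng):
    # Toy latent-variable model (an assumption for illustration):
    #   x_i ~ N(theta, 1),  y | x_i ~ N(x_i, 1)
    # Averaging p(y | x_i, theta) over N latent draws gives an unbiased
    # estimate of the (here actually tractable) likelihood N(y; theta, 2).
    x = rng.normal(loc=theta, scale=1.0, size=N)
    cond = np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2.0 * np.pi)
    return cond.mean()

def pseudo_marginal_mh(y, n_iters=5000, N=50, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    log_prior = lambda th: -0.5 * th ** 2  # standard normal prior (assumed)

    theta = 0.0
    lik_hat = unbiased_likelihood_estimate(theta, y, N, rng)
    chain = []
    for _ in range(n_iters):
        theta_prop = theta + step * rng.normal()  # symmetric random-walk proposal
        lik_hat_prop = unbiased_likelihood_estimate(theta_prop, y, N, rng)
        # Pseudo-marginal trick: the estimate for the current state is stored
        # and reused, which keeps the marginal posterior of theta invariant.
        log_ratio = (np.log(lik_hat_prop) + log_prior(theta_prop)
                     - np.log(lik_hat) - log_prior(theta))
        if np.log(rng.uniform()) < log_ratio:
            theta, lik_hat = theta_prop, lik_hat_prop
        chain.append(theta)
    return np.array(chain)

samples = pseudo_marginal_mh(y=1.3)
print("posterior mean estimate:", samples.mean())

The PM-HMC method proposed in the paper replaces this random-walk exploration with Hamiltonian dynamics while retaining an unbiased likelihood estimator, so that the marginal posterior of the parameters is still sampled exactly for any N.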


Bibliographic Details
Main Authors: Alenlov, J, Doucet, A, Lindsten, F
Format: Journal article
Language: English
Published in: Journal of Machine Learning Research, 2021
author Alenlov, J
Doucet, A
Lindsten, F
collection OXFORD
description Bayesian inference in the presence of an intractable likelihood function is computationally challenging. When following a Markov chain Monte Carlo (MCMC) approach to approximate the posterior distribution in this context, one typically either uses MCMC schemes that target the joint posterior of the parameters and some auxiliary latent variables, or pseudo-marginal Metropolis-Hastings (MH) schemes. The latter mimic an MH algorithm targeting the marginal posterior of the parameters by unbiasedly approximating the intractable likelihood. However, in scenarios where the parameters and auxiliary variables are strongly correlated under the posterior and/or this posterior is multimodal, Gibbs sampling or Hamiltonian Monte Carlo (HMC) will perform poorly, and the pseudo-marginal MH algorithm, like any other MH scheme, will be inefficient for high-dimensional parameters. We propose here an original MCMC algorithm, termed pseudo-marginal HMC (PM-HMC), which combines the advantages of both HMC and pseudo-marginal schemes. Specifically, PM-HMC is controlled by a precision parameter N governing the accuracy of the likelihood approximation and, for any N, it samples the marginal posterior of the parameters. Additionally, as N tends to infinity, its sample trajectories and acceptance probability converge to those of an ideal, but intractable, HMC algorithm which would have access to the intractable likelihood and its gradient. We demonstrate through experiments that PM-HMC can significantly outperform both standard HMC and pseudo-marginal MH schemes.
format Journal article
id oxford-uuid:8f1ca11c-3f56-4a15-8bbd-fae943c0a2dc
institution University of Oxford
language English
publishDate 2021
publisher Journal of Machine Learning Research
record_format dspace
title Pseudo-marginal Hamiltonian Monte Carlo