Abstract: Bayesian inference in the presence of an intractable likelihood function is computationally
challenging. When following a Markov chain Monte Carlo (MCMC) approach to approximate the posterior distribution in this context, one typically either uses MCMC schemes
which target the joint posterior of the parameters and some auxiliary latent variables, or
pseudo-marginal Metropolis-Hastings (MH) schemes. The latter mimic an MH algorithm
targeting the marginal posterior of the parameters by unbiasedly approximating the intractable likelihood. However, in scenarios where the parameters and auxiliary variables are
strongly correlated under the posterior and/or this posterior is multimodal, Gibbs sampling
or Hamiltonian Monte Carlo (HMC) will perform poorly and the pseudo-marginal MH
algorithm, as any other MH scheme, will be inefficient for high-dimensional parameters. We
propose here an original MCMC algorithm, termed pseudo-marginal HMC (PM-HMC), which combines
the advantages of both HMC and pseudo-marginal schemes. Specifically, the PM-HMC
method relies on a precision parameter N that controls the approximation of the
likelihood and, for any N, it samples the marginal posterior of the parameters. Additionally,
as N tends to infinity, its sample trajectories and acceptance probability converge to those
of an ideal, but intractable, HMC algorithm which would have access to the intractable
likelihood and its gradient. We demonstrate through experiments that PM-HMC can
significantly outperform both standard HMC and pseudo-marginal MH schemes.
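
For reference, the pseudo-marginal MH mechanism mentioned above amounts to running an MH acceptance step with the intractable likelihood replaced by a non-negative unbiased estimator. A standard sketch, with assumed notation (not taken from the paper): $\theta$ denotes the parameters, $y$ the data, $\hat{p}_N(y \mid \theta)$ an unbiased likelihood estimator built from $N$ auxiliary samples, $p(\theta)$ the prior, and $q$ the proposal density:
\[
\alpha(\theta, \theta') \;=\; \min\!\left\{ 1,\;
\frac{\hat{p}_N(y \mid \theta')\, p(\theta')\, q(\theta \mid \theta')}
     {\hat{p}_N(y \mid \theta)\, p(\theta)\, q(\theta' \mid \theta)} \right\}.
\]
Because $\hat{p}_N$ is unbiased, such a chain leaves the exact marginal posterior $p(\theta \mid y)$ invariant for any $N$; this is the exactness property that PM-HMC is designed to retain while also exploiting gradient information in the spirit of HMC.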