Fast MCMC sampling for Markov jump processes and extensions

Markov jump processes (or continuous-time Markov chains) are a simple and important class of continuous-time dynamical systems. In this paper, we tackle the problem of simulating from the posterior distribution over paths in these models, given partial and noisy observations. Our approach is an auxiliary variable Gibbs sampler, and is based on the idea of uniformization. This sets up a Markov chain over paths by alternately sampling a finite set of virtual jump times given the current path, and then sampling a new path given the set of extant and virtual jump times. The first step involves simulating a piecewise-constant inhomogeneous Poisson process, while for the second, we use a standard hidden Markov model forward filtering-backward sampling algorithm. Our method is exact and does not involve approximations like time-discretization. We demonstrate how our sampler extends naturally to MJP-based models like Markov-modulated Poisson processes and continuous-time Bayesian networks, and show significant computational benefits over state-of-the-art MCMC samplers for these models.
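The abstract describes a two-step Gibbs sweep: resample virtual jump times from a piecewise-constant Poisson process, then resample the whole path over the combined jump times with forward filtering-backward sampling. The following Python sketch illustrates one such sweep under simplifying assumptions (finite state space, known rate matrix A, discrete observations at fixed times with emission matrix E, a NumPy Generator rng); all names are illustrative and this is not the authors' code.

```python
import numpy as np


def rao_teh_sweep(jump_times, states, A, T_end, obs_times, obs, E, pi0, rng):
    """One sweep of a uniformization-based Gibbs sampler for an MJP path (sketch).

    jump_times : sorted 1-D array of current jump times in (0, T_end)
    states     : array of length len(jump_times) + 1; states[i] holds on the
                 i-th interval of the current piecewise-constant path
    A          : rate matrix (off-diagonal rates >= 0, rows sum to zero)
    E          : emission matrix, E[s, y] = p(observation y | state s) (an assumption)
    pi0        : initial state distribution
    """
    n = A.shape[0]
    Omega = 2.0 * np.max(-np.diag(A))      # uniformization rate, Omega > max_s |A_ss|
    B = np.eye(n) + A / Omega              # transition matrix of the subordinated chain

    # Step 1: sample virtual jumps from a piecewise-constant Poisson process whose
    # rate on an interval spent in state s is Omega + A[s, s] (>= 0 by choice of Omega).
    grid = np.concatenate(([0.0], jump_times, [T_end]))
    virtual = []
    for i in range(len(grid) - 1):
        rate = Omega + A[states[i], states[i]]
        k = rng.poisson(rate * (grid[i + 1] - grid[i]))
        virtual.append(rng.uniform(grid[i], grid[i + 1], size=k))
    W = np.sort(np.concatenate([jump_times] + virtual))   # extant + virtual jump times

    # Step 2: discrete-time HMM forward filtering over the candidate grid ...
    times = np.concatenate(([0.0], W))
    alpha = np.zeros((len(times), n))
    for i in range(len(times)):
        alpha[i] = pi0 if i == 0 else alpha[i - 1] @ B
        hi = times[i + 1] if i + 1 < len(times) else np.inf
        for t_k, y_k in zip(obs_times, obs):               # fold in observations on this interval
            if times[i] <= t_k < hi:
                alpha[i] *= E[:, y_k]
        alpha[i] /= alpha[i].sum()

    # ... followed by backward sampling of the state at every candidate time.
    new_states = np.empty(len(times), dtype=int)
    new_states[-1] = rng.choice(n, p=alpha[-1])
    for i in range(len(times) - 2, -1, -1):
        w = alpha[i] * B[:, new_states[i + 1]]
        new_states[i] = rng.choice(n, p=w / w.sum())

    # Drop self-transitions: only candidate times where the state changes survive.
    keep = np.flatnonzero(np.diff(new_states) != 0)
    return W[keep], new_states[np.concatenate(([0], keep + 1)).astype(int)]
```

Iterating this sweep targets the posterior over paths given the observations; resampling of the rate matrix and the extensions discussed in the paper (Markov-modulated Poisson processes, continuous-time Bayesian networks) are not shown.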

Bibliographic Details
Main Authors: Rao, V.; Teh, Y.
Format: Journal article
Published: Journal of Machine Learning Research, 2013
Institution: University of Oxford
Record ID: oxford-uuid:2cd273d2-c61e-4c2e-bbb5-69f94d9e1710