Inference on Markov random fields: methods and applications

Full details

Bibliographic details
Main author: Lienart, T
Other authors: Doucet, A
Material type: Thesis
Language: English
Published: 2017
Subjects: Computational statistics; Machine learning; Bayesian statistics
author Lienart, T
author2 Doucet, A
collection OXFORD
description This thesis considers the problem of performing inference on undirected graphical models with continuous state spaces. These models represent conditional independence structures that can appear in the context of Bayesian machine learning. In the thesis, we focus on computational methods and applications. The aim of the thesis is to demonstrate that the factorisation structure corresponding to the conditional independence structure present in high-dimensional models can be exploited to decrease the computational complexity of inference algorithms. First, we consider the smoothing problem on Hidden Markov Models (HMMs) and discuss novel algorithms that have sub-quadratic computational complexity in the number of particles used. We show that they perform on par with existing state-of-the-art algorithms of quadratic complexity. Further, a novel class of rejection-free samplers for graphical models, the Local Bouncy Particle Sampler (LBPS), is explored and applied to a very large instance of the Probabilistic Matrix Factorisation (PMF) problem. We show that the method performs slightly better than Hamiltonian Monte Carlo (HMC) methods; it is also the first practical application of the method to a statistical model with hundreds of thousands of dimensions. In the second part of the thesis, we consider approximate Bayesian inference methods, in particular the Expectation Propagation (EP) algorithm. We show that it can be applied as the backbone of a novel distributed Bayesian inference mechanism. Further, we discuss novel variants of the EP algorithm and show that a specific type of update mechanism, analogous to the mirror descent algorithm, outperforms all existing variants and is robust to Monte Carlo noise. Lastly, we show that EP can be used to provide the Particle Belief Propagation (PBP) algorithm with cheap, adaptive proposals, significantly outperforming classical PBP.
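The Local Bouncy Particle Sampler mentioned in the abstract builds on the global Bouncy Particle Sampler, a rejection-free, piecewise-deterministic MCMC method in which a particle moves along straight lines and occasionally bounces off the gradient of the negative log-density. The sketch below is a minimal illustration of the global sampler on a standard Gaussian target, where bounce times can be simulated exactly; it is not the thesis's local variant and is not tied to the PMF application, and the function name and parameter choices are illustrative assumptions.

```python
# Illustrative sketch (assumption): global Bouncy Particle Sampler on a
# standard multivariate Gaussian target pi(x) proportional to exp(-||x||^2/2).
# The Local BPS of the thesis exploits factor-graph structure; this does not.
import numpy as np

def bps_gaussian(dim=2, T=1000.0, refresh_rate=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim)            # position
    v = rng.standard_normal(dim)            # velocity
    t, events = 0.0, [x.copy()]
    while t < T:
        # Bounce time: first event of a Poisson process with rate
        # lambda(s) = max(0, <v, grad U(x + s v)>) = max(0, a + b s),
        # where U(x) = ||x||^2/2, a = <v, x>, b = ||v||^2 > 0.
        a, b = v @ x, v @ v
        e = rng.exponential()
        if a >= 0.0:
            tau_bounce = (-a + np.sqrt(a * a + 2.0 * b * e)) / b
        else:
            tau_bounce = -a / b + np.sqrt(2.0 * e / b)
        tau_refresh = rng.exponential(1.0 / refresh_rate)
        tau = min(tau_bounce, tau_refresh)

        x = x + tau * v                      # deterministic straight-line move
        t += tau
        if tau_refresh < tau_bounce:
            v = rng.standard_normal(dim)     # refresh velocity
        else:
            g = x                            # grad U(x) for the Gaussian target
            v = v - 2.0 * (v @ g) / (g @ g) * g  # reflect v off grad U
        events.append(x.copy())
    return np.array(events)

if __name__ == "__main__":
    skeleton = bps_gaussian()
    print(skeleton.shape)                    # positions at event times
```

Estimates of expectations under the target are formed by averaging along the continuous piecewise-linear trajectory (or by discretising it at fixed time intervals), not over the event points alone. The local variant exploits the factorisation of the graphical model so that each bounce involves the gradient of only one factor, which is what makes the approach practical on large models such as PMF.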
format Thesis
id oxford-uuid:3095b14c-98fb-4bda-affc-a1fa1708f628
institution University of Oxford
language English
publishDate 2017
spelling Inference on Markov random fields: methods and applications. Thesis, http://purl.org/coar/resource_type/c_db06, uuid:3095b14c-98fb-4bda-affc-a1fa1708f628. Computational statistics; Machine learning; Bayesian statistics. English. ORA Deposit, 2017. Lienart, T; Doucet, A; Caron, F; Teh, Y; Turner, R.
title Inference on Markov random fields: methods and applications
topic Computational statistics
Machine learning
Bayesian statistics