Extended Variational Message Passing for Automated Approximate Bayesian Inference
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximating Bayesian inference in factorized probabilistic models that consist of conjugate exponential-family distributions. Automating Bayesian inference is important because many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic and possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in complex models relies on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities, in appropriate cases, by importance sampling and the Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia language in the probabilistic programming package *ForneyLab.jl* and show by a number of examples that EVMP renders an almost universal inference engine for factorized probabilistic models.
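As a rough illustration of the approximation the abstract describes (estimating the expectation of a statistic of a hidden variable when no closed form exists), the Julia sketch below approximates E[T(x)] by self-normalized importance sampling; a Laplace approximation would instead replace the target with a Gaussian centered at its mode. This is not code from ForneyLab.jl, and the function name and example target are hypothetical.

```julia
# Minimal sketch, not ForneyLab.jl code: approximate E_p[T(x)] for an
# intractable (possibly unnormalized) target density p by self-normalized
# importance sampling with a tractable proposal distribution q.
using Distributions, Random

function importance_expectation(T, logp, q; n = 10_000, rng = Random.default_rng())
    xs   = rand(rng, q, n)              # draw n samples from the proposal q
    logw = logp.(xs) .- logpdf.(q, xs)  # unnormalized log importance weights
    w    = exp.(logw .- maximum(logw))  # subtract max for numerical stability
    w  ./= sum(w)                       # self-normalize the weights
    return sum(w .* T.(xs))             # weighted Monte Carlo estimate of E_p[T(x)]
end

# Hypothetical example: E[x^2] under an unnormalized N(1, 1) target,
# using a wider Normal(0, 2) proposal (σ = 2). Exact value: Var + Mean^2 = 2.
logp(x) = -0.5 * (x - 1.0)^2
est = importance_expectation(x -> x^2, logp, Normal(0.0, 2.0))  # est ≈ 2
```

In EVMP, as the abstract states, such sampled or Laplace-based expectation estimates stand in for the closed-form moments that standard VMP requires at nonlinear mappings and non-conjugate factor pairs.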
Main Authors: | Semih Akbayrak, Ivan Bocharov, Bert de Vries |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2021-06-01 |
Series: | Entropy |
Subjects: | Bayesian inference; variational inference; factor graphs; variational message passing; probabilistic programming |
Online Access: | https://www.mdpi.com/1099-4300/23/7/815 |
---|---|
author | Semih Akbayrak; Ivan Bocharov; Bert de Vries
collection | DOAJ |
first_indexed | 2024-03-10T10:01:47Z |
format | Article |
id | doaj.art-9c31a773ddde4ca7859d81135f6607fb |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-03-10T10:01:47Z |
publishDate | 2021-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | Entropy, vol. 23, no. 7, art. 815, MDPI AG, 2021-06-01, doi:10.3390/e23070815. All three authors: Department of Electrical Engineering, Eindhoven University of Technology, P.O. Box 513, 5600MB Eindhoven, The Netherlands
title | Extended Variational Message Passing for Automated Approximate Bayesian Inference |
topic | Bayesian inference; variational inference; factor graphs; variational message passing; probabilistic programming
url | https://www.mdpi.com/1099-4300/23/7/815 |