The Connection between Bayesian Inference and Information Theory for Model Selection, Information Gain and Experimental Design
We show a link between Bayesian inference and information theory that is useful for model selection, assessment of information entropy and experimental design. We align Bayesian model evidence (BME) with relative entropy and cross entropy in order to simplify computations using prior-based (Monte Carlo) or posterior-based (Markov chain Monte Carlo) BME estimates. On the one hand, we demonstrate how Bayesian model selection can profit from information theory to estimate BME values via posterior-based techniques. To do so, we use various assumptions, including relations to several information criteria. On the other hand, we demonstrate how relative entropy can profit from BME to assess information entropy during Bayesian updating and to assess utility in Bayesian experimental design. Specifically, we emphasize that relative entropy can be computed from both prior-based and posterior-based sampling techniques while avoiding unnecessary multidimensional integration. Prior-based computation does not require any assumptions, whereas posterior-based estimates require at least one assumption. We illustrate the performance of the discussed estimates of BME, information entropy and experiment utility using a transparent, non-linear example. The multivariate Gaussian posterior estimate involves the fewest assumptions and shows the best performance for estimating BME, information entropy and experiment utility from posterior-based sampling.
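The link between BME and relative entropy described in the abstract can be read through the standard identity ln BME = E_posterior[ln p(y|θ)] − D_KL(posterior || prior), which lets the Kullback–Leibler divergence be estimated from samples without explicit multidimensional integration. The sketch below is not the paper's own example; it only illustrates the idea under assumed ingredients (a one-dimensional Gaussian prior, a hypothetical non-linear forward model, Gaussian noise, and importance resampling as a simple stand-in for MCMC).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (not from the paper): Gaussian prior, non-linear model, Gaussian noise.
def forward_model(theta):
    return theta + 0.5 * theta**2  # illustrative non-linear model

def log_likelihood(theta, y_obs, sigma=0.5):
    # Gaussian measurement error with standard deviation sigma
    return -0.5 * ((y_obs - forward_model(theta)) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

y_obs = 1.3  # synthetic observation

# Prior-based (Monte Carlo) BME estimate: BME is approximated by the mean likelihood over prior samples.
theta_prior = rng.normal(0.0, 1.0, size=100_000)
log_lik = log_likelihood(theta_prior, y_obs)
log_bme = np.log(np.mean(np.exp(log_lik)))

# Posterior samples via importance resampling (a simple stand-in for posterior-based MCMC sampling).
weights = np.exp(log_lik - log_lik.max())
theta_post = rng.choice(theta_prior, size=20_000, replace=True, p=weights / weights.sum())

# Relative entropy from the identity  ln BME = E_post[ln p(y|theta)] - D_KL(post || prior),
# i.e. D_KL = E_post[ln p(y|theta)] - ln BME; no explicit integration over theta is needed.
d_kl = np.mean(log_likelihood(theta_post, y_obs)) - log_bme

print(f"log BME (prior-based Monte Carlo): {log_bme:.4f}")
print(f"relative entropy D_KL(posterior || prior): {d_kl:.4f}")
```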
Main Authors: | Sergey Oladyshkin, Wolfgang Nowak |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-11-01 |
Series: | Entropy |
Subjects: | model evidence; entropy; model selection; information entropy; Bayesian experimental design; Kullback–Leibler divergence; Markov chain Monte Carlo; Monte Carlo |
Online Access: | https://www.mdpi.com/1099-4300/21/11/1081 |
---|---|
author | Sergey Oladyshkin; Wolfgang Nowak
collection | DOAJ |
id | doaj.art-0d2864c75eee4175b16c3de3a31d7cb0 |
institution | Directory Open Access Journal |
issn | 1099-4300 |
spelling | Entropy, Vol. 21, Issue 11, Article 1081 (2019-11-01), MDPI AG; ISSN 1099-4300; DOI 10.3390/e21111081. Sergey Oladyshkin and Wolfgang Nowak: Department of Stochastic Simulation and Safety Research for Hydrosystems, Institute for Modelling Hydraulic and Environmental Systems/SC SimTech, University of Stuttgart, Pfaffenwaldring 5a, 70569 Stuttgart, Germany.