Computing Entropies with Nested Sampling
The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated...
Main Author: | Brendon J. Brewer |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2017-08-01 |
Series: | Entropy |
Subjects: | information theory; entropy; mutual information; Monte Carlo; nested sampling; Bayesian inference |
Online Access: | https://www.mdpi.com/1099-4300/19/8/422 |
_version_ | 1817996184055185408 |
---|---|
author | Brendon J. Brewer |
collection | DOAJ |
description | The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario. |
first_indexed | 2024-04-14T02:18:25Z |
format | Article |
id | doaj.art-1964ebe184414250b93a86c3b769460a |
institution | Directory Open Access Journal |
issn | 1099-4300 |
language | English |
last_indexed | 2024-04-14T02:18:25Z |
publishDate | 2017-08-01 |
publisher | MDPI AG |
record_format | Article |
series | Entropy |
spelling | doaj.art-1964ebe184414250b93a86c3b769460a (indexed 2022-12-22T02:18:06Z); Computing Entropies with Nested Sampling; Brendon J. Brewer, Department of Statistics, The University of Auckland, Auckland 1142, New Zealand; Entropy, vol. 19, no. 8, art. 422 (2017-08-01); MDPI AG; ISSN 1099-4300; doi:10.3390/e19080422; English |
title | Computing Entropies with Nested Sampling |
topic | information theory; entropy; mutual information; Monte Carlo; nested sampling; Bayesian inference |
url | https://www.mdpi.com/1099-4300/19/8/422 |
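The approach summarized in the description field can be illustrated with a minimal sketch. This is not the paper's algorithm (which targets distributions whose densities cannot be evaluated); it is the standard Nested Sampling recipe applied to a setting like example (i): a narrow Gaussian "likelihood" under a uniform prior, where the log-evidence and the information H = KL(posterior ‖ prior) are known analytically. The prior bounds, `SIGMA`, the live-point count, and the iteration budget are arbitrary illustrative choices.

```python
import math
import random

random.seed(1)

# Uniform prior on [-5, 5]; narrow Gaussian "likelihood".
SIGMA = 0.1

def log_like(x):
    return -0.5 * (x / SIGMA) ** 2 - 0.5 * math.log(2 * math.pi * SIGMA ** 2)

N_LIVE = 500
live = [random.uniform(-5.0, 5.0) for _ in range(N_LIVE)]
log_L, log_w = [], []

for i in range(3500):
    # Discard the worst live point; record its likelihood and weight.
    worst = min(range(N_LIVE), key=lambda j: log_like(live[j]))
    threshold = log_like(live[worst])
    log_L.append(threshold)
    # Prior volume shrinks geometrically: X_i ~ exp(-i / N_LIVE),
    # and the weight is the shell volume w_i = X_i - X_{i+1}.
    log_w.append(math.log(math.exp(-i / N_LIVE) - math.exp(-(i + 1) / N_LIVE)))
    # Replace it with a prior draw above the threshold (plain rejection
    # sampling; fine for a sketch, far too slow for hard problems).
    while True:
        x_new = random.uniform(-5.0, 5.0)
        if log_like(x_new) > threshold:
            live[worst] = x_new
            break

# Evidence: log Z = logsumexp(log_L + log_w), computed stably by hand.
m = max(a + b for a, b in zip(log_L, log_w))
logZ = m + math.log(sum(math.exp(a + b - m) for a, b in zip(log_L, log_w)))

# Information H = sum_i p_i * log(L_i / Z), with posterior weights
# p_i = w_i * L_i / Z.  Analytic targets for comparison:
#   log Z = log(1/10),  H = log(10) - 0.5*log(2*pi*e*SIGMA^2).
H = sum(math.exp(a + b - logZ) * (a - logZ) for a, b in zip(log_L, log_w))
print(logZ, H)
```

The run should land within Monte Carlo error (roughly sqrt(H / N_LIVE) in log Z) of the analytic values log Z = −log 10 ≈ −2.30 and H ≈ 3.19 nats; H here is the prior-to-posterior KL divergence that Nested Sampling produces as a by-product, one of the entropy-like quantities the paper generalizes.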