Computing Accurate Probabilistic Estimates of One-D Entropy from Equiprobable Random Samples

We develop a simple Quantile Spacing (QS) method for accurate probabilistic estimation of one-dimensional entropy from equiprobable random samples, and compare it with the popular Bin-Counting (BC) and Kernel Density (KD) methods. In contrast to BC, which uses equal-width bins with varying probability mass, the QS method uses estimates of the quantiles that divide the support of the data-generating probability density function (pdf) into equal-probability-mass intervals. Whereas BC and KD each require optimal tuning of a hyper-parameter whose value varies with sample size and shape of the pdf, QS requires only specification of the number of quantiles to be used. Results indicate, for the class of distributions tested, that the optimal number of quantiles is a fixed fraction of the sample size (empirically determined to be ~0.25–0.35), and that this value is relatively insensitive to distributional form or sample size. This provides a clear advantage over BC and KD, since hyper-parameter tuning is not required. Further, unlike KD, there is no need to select an appropriate kernel type, so QS is applicable to pdfs of arbitrary shape, including those with discontinuous slope and/or magnitude. Bootstrapping is used to approximate the sampling-variability distribution of the resulting entropy estimate, and is shown to accurately reflect the true uncertainty. For the four distributional forms studied (Gaussian, Log-Normal, Exponential, and Bimodal Gaussian Mixture), expected estimation bias is less than 1% and uncertainty is low even for samples with as few as 100 data points; in contrast, the small-sample bias can be as large as −10% for KD and as large as −50% for BC. We speculate that estimating quantile locations, rather than bin probabilities, makes more efficient use of the information in the data to approximate the underlying shape of an unknown data-generating pdf.

Bibliographic Details
Main Authors: Hoshin V. Gupta, Mohammad Reza Ehsani, Tirthankar Roy, Maria A. Sans-Fuentes, Uwe Ehret, Ali Behrangi
Author Affiliations: Hydrology and Atmospheric Sciences, The University of Arizona, Tucson, AZ 85721, USA (Gupta, Ehsani, Behrangi); Civil and Environmental Engineering, University of Nebraska-Lincoln, Omaha, NE 68182, USA (Roy); GIDP Statistics and Data Science, The University of Arizona, Tucson, AZ 85721, USA (Sans-Fuentes); Institute of Water and River Basin Management, Karlsruhe Institute of Technology (KIT), 76131 Karlsruhe, Germany (Ehret)
Format: Article
Language: English
Published: MDPI AG, 2021-06-01
Series: Entropy, Vol. 23, No. 6, Article 740
ISSN: 1099-4300
DOI: 10.3390/e23060740
Subjects: entropy; estimation; quantile spacing; accuracy; uncertainty; bootstrap
Online Access: https://www.mdpi.com/1099-4300/23/6/740
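
For readers who want to experiment with the idea described in the abstract, below is a minimal Python sketch of an equal-probability-mass spacing estimator with bootstrap resampling. It is written from the abstract alone, not from the paper itself, so the function name (`qs_entropy`), the default quantile fraction, and the handling of the bootstrap are illustrative assumptions; consult the article at the link above for the authors' actual algorithm.

```python
import numpy as np

def qs_entropy(sample, alpha=0.3, n_boot=500, seed=None):
    """Illustrative Quantile Spacing (QS) style entropy estimate, in nats.

    The support of the unknown pdf is split into equal-probability-mass
    bins using empirical quantiles; with n_bins bins of mass 1/n_bins and
    width w_i, the density in bin i is ~1/(n_bins * w_i), so
    H ~= mean_i[log(n_bins * w_i)]. The number of quantiles is taken as
    a fixed fraction `alpha` of the sample size (the abstract reports
    ~0.25-0.35 as a good default). Bootstrapping approximates the
    sampling variability of the estimate.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(sample, dtype=float)
    n_bins = max(int(alpha * x.size), 2)
    probs = np.linspace(0.0, 1.0, n_bins + 1)  # edges of equal-mass bins

    estimates = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)  # bootstrap resample
        widths = np.diff(np.quantile(xb, probs))       # quantile spacings
        widths = widths[widths > 0]                    # guard against ties
        estimates[b] = np.mean(np.log(n_bins * widths))
    return estimates.mean(), estimates.std()

# Example: standard Gaussian, true entropy = 0.5*ln(2*pi*e) ~= 1.4189 nats
x = np.random.default_rng(42).normal(size=1000)
h, s = qs_entropy(x, alpha=0.3, seed=0)
print(f"QS-style estimate: {h:.3f} +/- {s:.3f} nats")
```

The bootstrap mean and standard deviation stand in here for the sampling-variability distribution discussed in the abstract; a fuller treatment would return the whole set of bootstrap estimates so that percentile intervals can be read off directly.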