A Review of Shannon and Differential Entropy Rate Estimation


Bibliographic Details
Main Authors: Andrew Feutrill, Matthew Roughan
Format: Article
Language: English
Published: MDPI AG, 2021-08-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/23/8/1046
Collection: DOAJ (Directory of Open Access Journals)
Description: In this paper, we present a review of Shannon and differential entropy rate estimation techniques. Entropy rate, the average information gain per observation from a stochastic process, measures the uncertainty and complexity of the process. We discuss the estimation of entropy rate from empirical data, and review both parametric and non-parametric techniques. For parametric estimation, we consider a range of assumptions on the properties of the process, focussing in particular on Markov and Gaussian assumptions. Non-parametric estimation relies on limit theorems that connect the entropy rate to quantities computed from observations; to discuss these, we introduce the relevant theory and practical implementations of estimators of this type.
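To illustrate the parametric approach under a Markov assumption that the review discusses, the following is a minimal sketch (not taken from the paper; the two-state chain, its transition probabilities, and all variable names are illustrative assumptions): it simulates a first-order Markov chain, estimates the transition matrix from bigram counts, and computes the plug-in entropy rate H = -Σᵢ πᵢ Σⱼ Pᵢⱼ log₂ Pᵢⱼ in bits per symbol.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth two-state Markov chain (assumed purely for illustration).
P_true = np.array([[0.9, 0.1],
                   [0.4, 0.6]])

# Simulate a sample path of the chain.
n = 50_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.choice(2, p=P_true[x[t - 1]])

# Estimate the transition matrix from bigram (pair) counts.
counts = np.zeros((2, 2))
for a, b in zip(x[:-1], x[1:]):
    counts[a, b] += 1
P_hat = counts / counts.sum(axis=1, keepdims=True)

# Stationary distribution: left eigenvector of P_hat for eigenvalue 1.
w, v = np.linalg.eig(P_hat.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Plug-in entropy rate estimate (bits per symbol).
H_hat = -np.sum(pi[:, None] * P_hat * np.log2(P_hat))
```

For this chain the analytic entropy rate is about 0.57 bits/symbol, and the plug-in estimate converges to it as the sample grows; non-parametric estimators instead avoid assuming the Markov model and rely on the limit theorems the review surveys.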
ISSN: 1099-4300
Author Affiliations:
Andrew Feutrill: CSIRO/Data61, 13 Kintore Avenue, Adelaide, SA 5000, Australia
Matthew Roughan: School of Mathematical Sciences, The University of Adelaide, Adelaide, SA 5005, Australia
Citation: Entropy 2021, 23(8), 1046; DOI: 10.3390/e23081046
Keywords: entropy rate; estimation; parametric; non-parametric