Empirical Estimation of Information Measures: A Literature Guide
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasi...
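The survey concerns estimating information measures from samples. As a point of reference only (not drawn from the article itself), the simplest such estimator for discrete data is the plug-in estimator, which evaluates the entropy of the empirical distribution; a minimal sketch is given below, with all function names and parameters being illustrative assumptions.

```python
import numpy as np

def plugin_entropy(samples, base=2):
    """Plug-in (empirical-frequency) estimate of Shannon entropy.

    Counts symbol frequencies in the sample and returns the entropy of the
    resulting empirical distribution, in the given logarithm base.
    """
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Example: estimate the entropy of a fair coin from 10,000 tosses
rng = np.random.default_rng(0)
tosses = rng.integers(0, 2, size=10_000)
print(plugin_entropy(tosses))  # close to 1 bit
```

The plug-in estimator is known to be biased downward for small samples; much of the literature the article surveys is devoted to bias-corrected and minimax-optimal alternatives.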
| Main Author: | Sergio Verdú |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2019-07-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/21/8/720 |
Similar Items
- Mutual Information between Order Book Layers
  by: Daniel Libman, et al.
  Published: (2022-02-01)
- Error Exponents and *α*-Mutual Information
  by: Sergio Verdú
  Published: (2021-02-01)
- Conditional Rényi Divergence Saddlepoint and the Maximization of *α*-Mutual Information
  by: Changxiao Cai, et al.
  Published: (2019-10-01)
- On Generalized Schürmann Entropy Estimators
  by: Peter Grassberger
  Published: (2022-05-01)
- Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning
  by: Chenguang Lu
  Published: (2023-05-01)