Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage
A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First, an equation is deduced that links the mutual information to the entropy of a suitable random vector with uniformly distributed components. When d = 2 this equation reduc...
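In the d = 2 case the identity described above reduces to the known fact that the mutual information of a pair equals the negative differential entropy of its copula, i.e. of the pair obtained by pushing each coordinate through its marginal CDF. A minimal sketch of an estimator built on this identity is shown below; it rank-transforms the sample to approximately uniform margins and plugs the result into a Kozachenko-Leonenko k-nearest-neighbour entropy estimate. The function names and parameter choices here are our own illustration, not the implementation from the paper.

```python
import numpy as np
from scipy.special import digamma, gamma


def rank_to_uniform(x):
    """Map a 1-d sample to (approximately) Uniform(0,1) via empirical ranks."""
    n = len(x)
    ranks = np.argsort(np.argsort(x))
    return (ranks + 1) / (n + 1)


def kl_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN differential entropy estimate, in nats."""
    n, d = samples.shape
    # Brute-force pairwise distances: chosen for clarity, not efficiency.
    diffs = samples[:, None, :] - samples[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)
    eps = np.sort(dists, axis=1)[:, k - 1]  # distance to the k-th neighbour
    log_c_d = (d / 2) * np.log(np.pi) - np.log(gamma(d / 2 + 1))  # unit-ball volume
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))


def mi_via_linkage(x, y, k=3):
    """Estimate I(X;Y) as minus the entropy of the rank-transformed pair."""
    u = np.column_stack([rank_to_uniform(x), rank_to_uniform(y)])
    return -kl_entropy(u, k=k)


rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)
y = x + rng.normal(size=n)   # correlated pair, population rho = 1/sqrt(2)
z = rng.normal(size=n)       # independent of x
mi_corr = mi_via_linkage(x, y)   # population value -0.5*log(1-rho^2) ~ 0.347 nats
mi_indep = mi_via_linkage(x, z)  # population value 0
print(mi_corr, mi_indep)
```

Because the rank transform fixes both margins to the uniform distribution, no marginal densities need to be estimated; all the statistical work goes into the single entropy estimate of the bivariate uniform-margin sample, which is what makes the approach binless.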
Main Authors: Maria Teresa Giraudo, Laura Sacerdote, Roberta Sirovich
Format: Article
Language: English
Published: MDPI AG, 2013-11-01
Series: Entropy
Subjects:
Online Access: http://www.mdpi.com/1099-4300/15/12/5154
Similar Items
- Mutual Information between Order Book Layers
  by: Daniel Libman, et al.
  Published: (2022-02-01)
- Information Entropy Suggests Stronger Nonlinear Associations between Hydro-Meteorological Variables and ENSO
  by: Tue M. Vu, et al.
  Published: (2018-01-01)
- Error Exponents and α-Mutual Information
  by: Sergio Verdú
  Published: (2021-02-01)
- Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning
  by: Chenguang Lu
  Published: (2023-05-01)
- Empirical Estimation of Information Measures: A Literature Guide
  by: Sergio Verdú
  Published: (2019-07-01)