Application of offset estimator of differential entropy and mutual information with multivariate data
Numerical estimators of differential entropy and mutual information can be slow to converge as sample size increases. The offset Kozachenko–Leonenko (KLo) method described here is an offset version of the Kozachenko–Leonenko estimator that can markedly improve convergence. Its use is illustrated...
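For context on the estimator the abstract refers to, below is a minimal sketch of the *classical* (non-offset) Kozachenko–Leonenko k-nearest-neighbour entropy estimator, H ≈ ψ(N) − ψ(k) + log V_d + (d/N) Σ log ε_i, where ε_i is the distance from sample i to its k-th neighbour and V_d is the unit-ball volume. This is the well-known baseline, not the paper's KLo variant; the brute-force distance computation and the function name `kl_entropy` are illustrative choices, not from the article.

```python
import numpy as np
from math import pi, lgamma, log

def kl_entropy(x, k=3):
    """Classical Kozachenko–Leonenko kNN estimator of differential entropy.

    x : (N, d) array of continuous samples; k : neighbour order.
    Returns the entropy estimate in nats. (Illustrative sketch, not the
    offset KLo estimator of the article.)
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (fine for modest N;
    # a k-d tree would be used in practice).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)           # exclude self-distances
    eps = np.sort(dist, axis=1)[:, k - 1]    # distance to k-th neighbour

    # Digamma at a positive integer m: psi(m) = -gamma + sum_{j<m} 1/j
    euler_gamma = 0.5772156649015329
    def psi(m):
        return -euler_gamma + np.sum(1.0 / np.arange(1, m))

    # log volume of the unit d-ball: V_d = pi^{d/2} / Gamma(d/2 + 1)
    log_vd = (d / 2) * log(pi) - lgamma(d / 2 + 1)

    return psi(n) - psi(k) + log_vd + d * np.mean(np.log(eps))
```

As a sanity check, for N standard-normal samples in one dimension the estimate should approach the true entropy 0.5·log(2πe) ≈ 1.419 nats as N grows.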
Main Authors: Iván Marín-Franch, Martín Sanz-Sabater, David H. Foster, Emanuele Frontoni
Format: Article
Language: English
Published: Cambridge University Press, 2022-01-01
Series: Experimental Results
Online Access: https://www.cambridge.org/core/product/identifier/S2516712X22000144/type/journal_article
Similar Items
- Estimating Simultaneous Equation Models through an Entropy-Based Incremental Variational Bayes Learning Algorithm
  by: Rocío Hernández-Sanjaime, et al. Published: (2021-03-01)
- Mutual Information between Order Book Layers
  by: Daniel Libman, et al. Published: (2022-02-01)
- Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage
  by: Maria Teresa Giraudo, et al. Published: (2013-11-01)
- MATLAB tool for probability density assessment and nonparametric estimation
  by: Jenny Farmer, et al. Published: (2022-06-01)
- Nonparametric Estimation of Information-Based Measures of Statistical Dispersion
  by: Ondrej Pokora, et al. Published: (2012-07-01)