Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage

A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First, an equation is derived that links the mutual information to the entropy of a suitable random vector with uniformly distributed components. When d = 2, this equation reduces to the well-known connection between mutual information and the entropy of the copula function associated with the original random variables. Hence, the problem of estimating the mutual information of the original random vector reduces to estimating the entropy of a random vector obtained through a multidimensional transformation. The proposed estimator is a two-step method: first estimate the transformation and obtain the transformed sample, then estimate its entropy. The properties of the new estimator are discussed through simulation examples, and its performance is compared with that of the best estimators in the literature. Its precision converges to values of the same order of magnitude as that of the best estimator tested. However, the new estimator remains unbiased even for larger dimensions and smaller sample sizes, where the other tested estimators show a bias.
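The two-step procedure described in the abstract can be sketched for the bivariate case. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper estimates the transformation with kernel methods, whereas this sketch approximates it with empirical ranks (the empirical copula sample) and estimates the entropy with a Kozachenko-Leonenko k-nearest-neighbour estimator; all function names are hypothetical.

```python
import math
import numpy as np

def _digamma(x):
    """Digamma via recurrence plus asymptotic series (avoids a scipy dependency)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def rank_transform(x):
    """Step 1: empirical probability-integral transform of each marginal to (0, 1)."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    return ranks / (n + 1.0)

def kl_entropy(z, k=3):
    """Step 2: Kozachenko-Leonenko k-NN differential entropy estimate (max-norm)."""
    n, d = z.shape
    dists = np.max(np.abs(z[:, None, :] - z[None, :, :]), axis=-1)
    np.fill_diagonal(dists, np.inf)
    eps = np.sort(dists, axis=1)[:, k - 1]  # distance to the k-th nearest neighbour
    # With the max-norm the unit-ball volume constant is 1 and the ball edge is 2*eps.
    return _digamma(n) - _digamma(k) + d * np.mean(np.log(2.0 * eps))

def mutual_information(x, k=3):
    """MI of sample x (n rows, d columns) as minus the entropy of its copula sample."""
    u = rank_transform(np.asarray(x, dtype=float))
    return -kl_entropy(u, k=k)
```

For bivariate Gaussian data with correlation rho, the true mutual information is -0.5*log(1 - rho^2), roughly 0.51 for rho = 0.8, so for a couple of thousand samples the estimate should land in that neighbourhood (up to the boundary bias of the k-NN entropy estimator on the unit square).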

Bibliographic Details
Main Authors: Maria Teresa Giraudo, Laura Sacerdote, Roberta Sirovich
Format: Article
Language: English
Published: MDPI AG, 2013-11-01
Series: Entropy
Subjects: information measures; mutual information; entropy; copula function; linkage function; kernel method; binless estimator
Online Access: http://www.mdpi.com/1099-4300/15/12/5154
Affiliations: Department of Mathematics, University of Torino, Via Carlo Alberto 10, Torino 10123, Italy (all three authors)
ISSN: 1099-4300
DOI: 10.3390/e15125154
Citation: Entropy 2013, 15(12), 5154-5177