Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions


Bibliographic Details
Main Authors: Goldfeld, Ziv; Greenewald, Kristjan; Weed, Jonathan; Polyanskiy, Yury
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Format: Article
Language: English
Published: Institute of Electrical and Electronics Engineers (IEEE) 2021
Online Access: https://hdl.handle.net/1721.1/137042
_version_ 1811071971422633984
author Goldfeld, Ziv
Greenewald, Kristjan
Weed, Jonathan
Polyanskiy, Yury
author2 Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
author_facet Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Goldfeld, Ziv
Greenewald, Kristjan
Weed, Jonathan
Polyanskiy, Yury
author_sort Goldfeld, Ziv
collection MIT
description © 2019 IEEE. This paper establishes the optimality of the plug-in estimator for the problem of differential entropy estimation under Gaussian convolutions. Specifically, we consider the estimation of the differential entropy h(X + Z), where X and Z are independent d-dimensional random variables with Z \sim \mathcal{N}(0, \sigma^2 \mathrm{I}_d). The distribution of X is unknown and belongs to some nonparametric class, but n independent and identically distributed samples from it are available. We first show that, despite the regularizing effect of noise, any good estimator (within an additive gap) for this problem must have a sample complexity that is exponential in d. We then analyze the absolute-error risk of the plug-in estimator and show that it converges as \frac{c^d}{\sqrt{n}}, thus attaining the parametric estimation rate. This implies the optimality of the plug-in estimator for the considered problem. We provide numerical results comparing the performance of the plug-in estimator to general-purpose (unstructured) differential entropy estimators, based on kernel density estimation (KDE) or k-nearest neighbors (kNN) techniques, applied to samples of X + Z. These results reveal a significant empirical superiority of the plug-in estimator over state-of-the-art KDE- and kNN-based methods.
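Method sketch: the plug-in estimator referenced in the abstract plugs the empirical measure P̂_n of the n samples into the entropy functional, i.e. it computes h(P̂_n * N(0, σ²I_d)), the differential entropy of the Gaussian mixture (1/n) Σ_{i=1}^n N(X_i, σ²I_d). Below is a minimal Python sketch, assuming this mixture entropy is evaluated by Monte Carlo integration; the function name plug_in_entropy and the parameter n_mc are illustrative choices, not taken from the paper.

    # Plug-in differential entropy estimate h(P_hat * N(0, sigma^2 I_d)),
    # evaluated by Monte Carlo: h(Y) = -E[log q(Y)] for Y drawn from the
    # Gaussian mixture q = (1/n) * sum_i N(X_i, sigma^2 I_d).
    import numpy as np
    from scipy.special import logsumexp

    def plug_in_entropy(samples, sigma, n_mc=4000, seed=None):
        """Entropy (in nats) of the empirical measure convolved with N(0, sigma^2 I_d)."""
        rng = np.random.default_rng(seed)
        n, d = samples.shape
        # Draw Y = X_J + Z with J uniform over the samples and Z ~ N(0, sigma^2 I):
        # Y is distributed exactly according to the mixture q.
        idx = rng.integers(n, size=n_mc)
        y = samples[idx] + sigma * rng.standard_normal((n_mc, d))
        # log q(y) = logsumexp_i(-||y - X_i||^2 / (2 sigma^2)) - log n - (d/2) log(2 pi sigma^2)
        sq_dists = ((y[:, None, :] - samples[None, :, :]) ** 2).sum(axis=2)
        log_q = (logsumexp(-sq_dists / (2 * sigma**2), axis=1)
                 - np.log(n) - 0.5 * d * np.log(2 * np.pi * sigma**2))
        return -log_q.mean()

    # Sanity check: for X ~ N(0, I_d), X + Z ~ N(0, (1 + sigma^2) I_d), so
    # h(X + Z) = (d/2) * log(2 * pi * e * (1 + sigma^2)) in closed form.
    rng = np.random.default_rng(0)
    d, sigma = 2, 1.0
    x = rng.standard_normal((1000, d))
    print(plug_in_entropy(x, sigma, seed=1))                    # Monte Carlo plug-in estimate
    print(0.5 * d * np.log(2 * np.pi * np.e * (1 + sigma**2)))  # ground truth, about 3.53 nats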
first_indexed 2024-09-23T08:58:51Z
format Article
id mit-1721.1/137042
institution Massachusetts Institute of Technology
language English
last_indexed 2024-09-23T08:58:51Z
publishDate 2021
publisher Institute of Electrical and Electronics Engineers (IEEE)
record_format dspace
spelling mit-1721.1/137042 2022-09-30T12:35:30Z Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions Goldfeld, Ziv Greenewald, Kristjan Weed, Jonathan Polyanskiy, Yury Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science © 2019 IEEE. This paper establishes the optimality of the plug-in estimator for the problem of differential entropy estimation under Gaussian convolutions. Specifically, we consider the estimation of the differential entropy h(X + Z), where X and Z are independent d-dimensional random variables with Z \sim \mathcal{N}(0, \sigma^2 \mathrm{I}_d). The distribution of X is unknown and belongs to some nonparametric class, but n independent and identically distributed samples from it are available. We first show that, despite the regularizing effect of noise, any good estimator (within an additive gap) for this problem must have a sample complexity that is exponential in d. We then analyze the absolute-error risk of the plug-in estimator and show that it converges as \frac{c^d}{\sqrt{n}}, thus attaining the parametric estimation rate. This implies the optimality of the plug-in estimator for the considered problem. We provide numerical results comparing the performance of the plug-in estimator to general-purpose (unstructured) differential entropy estimators, based on kernel density estimation (KDE) or k-nearest neighbors (kNN) techniques, applied to samples of X + Z. These results reveal a significant empirical superiority of the plug-in estimator over state-of-the-art KDE- and kNN-based methods. 2021-11-01T18:45:11Z 2021-11-01T18:45:11Z 2019-09 2021-04-15T16:05:35Z Article http://purl.org/eprint/type/ConferencePaper https://hdl.handle.net/1721.1/137042 Goldfeld, Ziv, Greenewald, Kristjan, Weed, Jonathan and Polyanskiy, Yury. 2019. "Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions." IEEE International Symposium on Information Theory - Proceedings, 2019-July. en http://dx.doi.org/10.1109/ISIT.2019.8849414 IEEE International Symposium on Information Theory - Proceedings Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/ application/pdf Institute of Electrical and Electronics Engineers (IEEE) MIT web domain
spellingShingle Goldfeld, Ziv
Greenewald, Kristjan
Weed, Jonathan
Polyanskiy, Yury
Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title_full Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title_fullStr Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title_full_unstemmed Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title_short Optimality of the Plug-in Estimator for Differential Entropy Estimation under Gaussian Convolutions
title_sort optimality of the plug in estimator for differential entropy estimation under gaussian convolutions
url https://hdl.handle.net/1721.1/137042
work_keys_str_mv AT goldfeldziv optimalityofthepluginestimatorfordifferentialentropyestimationundergaussianconvolutions
AT greenewaldkristjan optimalityofthepluginestimatorfordifferentialentropyestimationundergaussianconvolutions
AT weedjonathan optimalityofthepluginestimatorfordifferentialentropyestimationundergaussianconvolutions
AT polyanskiyyury optimalityofthepluginestimatorfordifferentialentropyestimationundergaussianconvolutions