A Note on the W-S Lower Bound of the MEE Estimation

Bibliographic Details
Main Authors: Badong Chen, Guangmin Wang, Nanning Zheng, Jose C. Principe
Format: Article
Language: English
Published: MDPI AG 2014-02-01
Series: Entropy
Subjects:
Online Access: http://www.mdpi.com/1099-4300/16/2/814
Description
Summary: Minimum error entropy (MEE) estimation is concerned with estimating a certain random variable (the unknown variable) from another random variable (the observation) so that the entropy of the estimation error is minimized. This estimation method may outperform the well-known minimum mean square error (MMSE) estimation, especially in non-Gaussian situations. There is an important performance bound on MEE estimation, namely the W-S lower bound, which is computed as the conditional entropy of the unknown variable given the observation. Although this bound has been known in the literature for a considerable time, it has received little study so far. In this paper, we reexamine the W-S lower bound. Some basic properties of the W-S lower bound are presented, and the characterization of the Gaussian distribution using the W-S lower bound is investigated.
ISSN: 1099-4300
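
The lower bound described in the summary admits a short derivation; the sketch below relies only on standard properties of differential entropy and assumes the notation X for the unknown variable, Y for the observation, g for an arbitrary measurable estimator, e = X - g(Y) for the estimation error, and H(·) for differential entropy (these symbols are introduced here for illustration, not taken from the record):

\[
H(e) \;=\; H\bigl(X - g(Y)\bigr) \;\ge\; H\bigl(X - g(Y) \mid Y\bigr) \;=\; H(X \mid Y).
\]

The inequality holds because conditioning cannot increase differential entropy, and the final equality holds because, conditioned on Y, the term g(Y) is a fixed shift, which leaves conditional differential entropy unchanged. Hence the error entropy of any estimator of X based on Y is bounded below by the conditional entropy H(X | Y), which is the W-S lower bound the article studies.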