Where Does Minimum Error Entropy Outperform Minimum Mean Square Error? A New and Closer Look

The past decade has seen rapid application of information theoretic learning (ITL) criteria to robust signal processing and machine learning problems. The ITL literature generally reports that, under non-Gaussian assumptions, especially when the data are corrupted by heavy-tailed or multi-modal non-Gaussian distributions, information theoretic criteria such as minimum error entropy (MEE) outperform second-order statistical ones. The objective of this research is to investigate this claimed advantage of the MEE criterion over minimum mean square error (MSE). Having found that MEE- and MSE-based methods can perform similarly in non-Gaussian environments under particular conditions, we argue that a precise demarcation is needed between this occasional similarity and occasional outperformance. Based on our theoretical findings, we propose a sharper touchstone for when MEE outperforms MSE.
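As background for the two criteria contrasted in the abstract, here is a minimal sketch (an illustration, not the authors' implementation): MSE minimizes the average squared error, while MEE minimizes a kernel (Parzen) estimate of Renyi's quadratic error entropy, which is equivalent to maximizing the so-called information potential of the errors. The kernel bandwidth `sigma` is an assumed free parameter.

```python
import numpy as np

def mse_loss(e):
    """Mean square error criterion: average of squared errors."""
    return np.mean(e ** 2)

def mee_loss(e, sigma=1.0):
    """Minimum error entropy criterion via Renyi's quadratic entropy.

    Uses a Parzen/Gaussian kernel estimate of the error density;
    minimizing this entropy is equivalent to maximizing the
    information potential V(e), the mean Gaussian kernel evaluated
    over all pairwise error differences."""
    d = e[:, None] - e[None, :]                 # pairwise error differences
    k = np.exp(-d ** 2 / (4 * sigma ** 2))      # Gaussian kernel of bandwidth sigma*sqrt(2)
    v = k.mean() / (2 * sigma * np.sqrt(np.pi)) # empirical information potential
    return -np.log(v)                           # Renyi quadratic entropy estimate

# Compare the criteria on Gaussian vs. heavy-tailed (impulsive) errors:
rng = np.random.default_rng(0)
e_gauss = rng.normal(0.0, 1.0, 500)
e_heavy = rng.standard_t(df=1.5, size=500)      # heavy-tailed Student-t errors
print("MSE :", mse_loss(e_gauss), mse_loss(e_heavy))
print("MEE :", mee_loss(e_gauss), mee_loss(e_heavy))
```

MSE is dominated by a few large outliers in the heavy-tailed case, whereas the kernel in the MEE estimate bounds each pair's contribution, which is the intuition behind the robustness claims the paper scrutinizes.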

Bibliographic Details
Main Authors: Ahmad Reza Heravi, Ghosheh Abed Hodtani
Format: Article
Language: English
Published: IEEE 2018-01-01
Series: IEEE Access
Subjects: Entropy; mean square error methods; machine learning algorithms; information theoretic learning; Kullback-Leibler divergence
Online Access: https://ieeexplore.ieee.org/document/8264748/
collection DOAJ
description The past decade has seen a rapid application of information theoretic learning (ITL) criteria in robust signal processing and machine learning problems. Generally, in ITL's literature, it is seen that, under non-Gaussian assumptions, especially when the data are corrupted by heavy-tailed or multi-modal non-Gaussian distributions, information theoretic criteria [such as minimum error entropy (MEE)] outperform second order statistical ones. The objective of this research is to investigate this better performance of MEE criterion against that of minimum mean square error. Having found similar results for MEEand MSE-based methods, in the non-Gaussian environment under particular conditions, we need a precise demarcation between this occasional similarity and occasional outperformance. Based on the theoretic findings, we reveal a better touchstone for the outperformance of MEE versus MSE.
id doaj.art-ff2df474e3d641948e643bac8a543876
issn 2169-3536
orcid Ahmad Reza Heravi: https://orcid.org/0000-0002-2481-5867
affiliation Electrical Engineering Department, Ferdowsi University of Mashhad, Mashhad, Iran (both authors)
doi 10.1109/ACCESS.2018.2792329
volume 6
pages 5856-5864
topic Entropy
mean square error methods
machine learning algorithms
information theoretic learning
Kullback-Leibler divergence