Where Does Minimum Error Entropy Outperform Minimum Mean Square Error? A New and Closer Look
The past decade has seen rapid growth in the application of information theoretic learning (ITL) criteria to robust signal processing and machine learning problems. In the ITL literature, it is generally observed that, under non-Gaussian assumptions, especially when the data are corrupted by heavy-tailed or mul...
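As background to the abstract's claim, below is a minimal sketch of the minimum error entropy (MEE) criterion it refers to, assuming the common Gaussian Parzen-window estimate of Renyi's quadratic entropy of the errors; the function names, the kernel width `sigma`, and the Laplacian-outlier comparison are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np

def mee_loss(errors, sigma=1.0):
    """Empirical error-entropy loss (negative log information potential).

    Renyi's quadratic entropy of the errors is estimated with a Gaussian
    Parzen window: V = (1/N^2) * sum_ij G(e_i - e_j), where G has width
    sqrt(2)*sigma. The kernel's normalization constant is dropped, since
    it only shifts the loss and does not change the minimizer.
    """
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]              # pairwise error differences
    kernel = np.exp(-diffs**2 / (4 * sigma**2))  # unnormalized Gaussian kernel
    v = kernel.mean()                            # information potential estimate
    return -np.log(v)                            # quadratic Renyi entropy (up to a constant)

def mse_loss(errors):
    e = np.asarray(errors, dtype=float)
    return np.mean(e**2)

# Heavy-tailed (Laplacian) outliers dominate the squared-error average,
# while the entropy estimate changes comparatively little.
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 0.1, size=200)
corrupted = np.concatenate([clean, rng.laplace(0.0, 5.0, size=10)])
print("MSE  clean vs corrupted:", mse_loss(clean), mse_loss(corrupted))
print("MEE  clean vs corrupted:", mee_loss(clean), mee_loss(corrupted))
```

Because each outlier enters the information potential only through near-zero kernel values, minimizing the error entropy concentrates the bulk of the error distribution rather than chasing extreme residuals, which is the intuition behind MEE's robustness to heavy-tailed noise.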
Main Authors: Ahmad Reza Heravi, Ghosheh Abed Hodtani
Format: Article
Language: English
Published: IEEE, 2018-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8264748/
Similar Items

- Permissible Area Analyses of Measurement Errors with Required Fault Diagnosability Performance
  by: Dong-Nian Jiang, et al.
  Published: (2019-11-01)
- Entropy and the Kullback–Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation
  by: Marco Scutari
  Published: (2024-01-01)
- Correcting the Bias of the Root Mean Squared Error of Approximation Under Missing Data
  by: Cailey E. Fitzgerald, et al.
  Published: (2021-09-01)
- Relative Entropy and Minimum-Variance Pricing Kernel in Asset Pricing Model Evaluation
  by: Javier Rojo-Suárez, et al.
  Published: (2020-06-01)
- Entropy, Carnot Cycle, and Information Theory
  by: Mario Martinelli
  Published: (2018-12-01)