Discriminant Analysis under f-Divergence Measures
In statistical inference, information-theoretic performance limits can often be expressed in terms of a statistical divergence between the underlying statistical models (e.g., in binary hypothesis testing, the error probability is related to the total variation distance between the statistical models)…
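The relation the abstract alludes to can be made concrete with a standard identity; the following is background taken as given, not quoted from the article itself. For equiprobable hypotheses H_0: X ~ P_0 and H_1: X ~ P_1, the minimum (Bayes) error probability is determined by the total variation distance:

```latex
% Standard Bayes-error identity (assumed background, not from this record):
% for equiprobable binary hypotheses H_0: X ~ P_0 and H_1: X ~ P_1,
\[
  P_e^{*} \;=\; \frac{1}{2}\Bigl(1 - d_{\mathrm{TV}}(P_0, P_1)\Bigr),
  \qquad
  d_{\mathrm{TV}}(P_0, P_1) \;=\; \frac{1}{2}\sum_{x}\bigl|P_0(x) - P_1(x)\bigr|,
\]
% so a larger total variation distance between the models means an easier
% detection problem. Total variation is itself an f-divergence, obtained
% from D_f(P_0 \| P_1) = \sum_x P_1(x)\, f\!\bigl(P_0(x)/P_1(x)\bigr)
% with f(t) = |t - 1| / 2.
```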
Main Authors: Anmol Dwivedi, Sihui Wang, Ali Tajer
Format: Article
Language: English
Published: MDPI AG, 2022-01-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/24/2/188
Similar Items
- Inequalities for Jensen–Sharma–Mittal and Jeffreys–Sharma–Mittal Type f-Divergences
  by: Paweł A. Kluza
  Published: (2021-12-01)
- Some f-Divergence Measures Related to Jensen's One
  by: Sever Dragomir
  Published: (2023-12-01)
- Optimum Achievable Rates in Two Random Number Generation Problems with f-Divergences Using Smooth Rényi Entropy
  by: Ryo Nomura, et al.
  Published: (2024-09-01)
- Thurstonian Scaling for Sensory Discrimination Methods
  by: Jian Bi, et al.
  Published: (2025-01-01)
- F-Divergences and Cost Function Locality in Generative Modelling with Quantum Circuits
  by: Chiara Leadbeater, et al.
  Published: (2021-09-01)