Revisiting Chernoff Information with Likelihood Ratio Exponential Families
The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the di...
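The abstract's definition, Chernoff information as the maximally skewed Bhattacharyya distance, can be illustrated numerically for discrete distributions. The sketch below is illustrative only and not taken from the article; the function names and the grid-search strategy are assumptions. It computes C(p, q) = max over α in (0, 1) of B_α(p, q), where B_α(p, q) = −log Σ_i p_i^α q_i^(1−α) is the α-skewed Bhattacharyya distance.

```python
import math

def skewed_bhattacharyya(p, q, alpha):
    """alpha-skewed Bhattacharyya distance:
    B_alpha(p, q) = -log sum_i p_i^alpha * q_i^(1 - alpha)."""
    return -math.log(sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q)))

def chernoff_information(p, q, grid=1000):
    """Chernoff information as the maximally skewed Bhattacharyya
    distance, C(p, q) = max_{alpha in (0, 1)} B_alpha(p, q).
    Simple grid search (illustrative); returns (value, optimal alpha)."""
    return max(
        (skewed_bhattacharyya(p, q, a / grid), a / grid)
        for a in range(1, grid)
    )

# Two discrete distributions on three outcomes; since q is p reversed,
# the optimal skew parameter alpha* is 1/2 by symmetry.
p = [0.6, 0.3, 0.1]
q = [0.1, 0.3, 0.6]
value, alpha_star = chernoff_information(p, q)
```

In practice a bounded one-dimensional optimizer would replace the grid search, since B_α is concave in α on (0, 1) for fixed p and q; the maximizer α* is the Chernoff exponent.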
| Main Author: | Frank Nielsen |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2022-10-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/24/10/1400 |
Similar Items
- Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding
  by: Wentao Huang, et al.
  Published: (2019-03-01)
- Calibrating the Attack to Sensitivity in Differentially Private Mechanisms
  by: Ayşe Ünsal, et al.
  Published: (2022-10-01)
- Performance Analysis of Lecturers in the Mathematics Department of FMIPA UNUD Using the Chernoff Faces Method
  by: GUSTI AYU MADE ARNA PUTRI, et al.
  Published: (2012-09-01)
- Utilizing Chernoff Faces in Modeling Responses in the Evaluation of Trimester Scheme Implementation
  by: Rosie C. Lopez-Conde, et al.
  Published: (2022-01-01)
- A Portrait of the People's Welfare in Bali Province Using the Chernoff Faces Method
  by: I WAYAN WIDHI DIRGANTARA, et al.
  Published: (2013-08-01)