Revisiting Chernoff Information with Likelihood Ratio Exponential Families

The Chernoff information between two probability measures is a statistical divergence measuring their deviation defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the di...
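The abstract defines the Chernoff information as the maximally skewed Bhattacharyya distance, i.e. C(p, q) = max over alpha in (0, 1) of -log of the integral of p(x)^alpha q(x)^(1-alpha). As an illustrative sketch only (not code from the article; the function names, the quadrature routine, and the example Gaussian parameters are assumptions), the following Python snippet computes this quantity numerically for two univariate Gaussians.

```python
# Minimal numerical sketch (illustrative, not from the article):
# Chernoff information C(p, q) = max_{alpha in (0,1)} B_alpha(p, q), where
# B_alpha(p, q) = -log \int p(x)^alpha q(x)^(1-alpha) dx is the alpha-skewed
# Bhattacharyya distance. Shown here for two example univariate Gaussians.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def skewed_bhattacharyya(p_pdf, q_pdf, alpha):
    """Skewed Bhattacharyya distance B_alpha(p, q) via numerical quadrature."""
    coeff, _ = quad(lambda x: p_pdf(x) ** alpha * q_pdf(x) ** (1.0 - alpha),
                    -np.inf, np.inf)
    return -np.log(coeff)

def chernoff_information(p_pdf, q_pdf):
    """Maximize B_alpha over alpha in (0, 1); returns (C(p, q), alpha*)."""
    res = minimize_scalar(lambda a: -skewed_bhattacharyya(p_pdf, q_pdf, a),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun, res.x

if __name__ == "__main__":
    p, q = norm(0.0, 1.0), norm(2.0, 1.5)   # two example Gaussian densities
    chernoff, alpha_star = chernoff_information(p.pdf, q.pdf)
    print(f"Chernoff information ~ {chernoff:.4f} at alpha* ~ {alpha_star:.4f}")
```

The maximizing skew parameter alpha* is the one at which the two alpha-geometric mixtures of p and q coincide in divergence, which is why a one-dimensional bounded search over alpha suffices in this sketch.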

Bibliographic Details
Main Author: Frank Nielsen
Format: Article
Language: English
Published: MDPI AG, 2022-10-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/24/10/1400