On a Variational Definition for the Jensen-Shannon Symmetrization of Distances Based on the Information Radius

Bibliographic Details
Main Author: Frank Nielsen
Format: Article
Language: English
Published: MDPI AG 2021-04-01
Series: Entropy
Online Access: https://www.mdpi.com/1099-4300/23/4/464
Description
Summary: We generalize the Jensen-Shannon divergence and the Jensen-Shannon diversity index by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson’s information radius. The variational definition applies to any distance and yields a new way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to belong to prescribed families of probability measures, we obtain relative Jensen-Shannon divergences and their equivalent Jensen-Shannon symmetrizations of distances, which generalize the concept of information projections. Finally, we touch upon applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization tasks for probability measures, including statistical mixtures.
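
For context, a brief sketch of the variational form the summary alludes to, stated with standard definitions; the notation JS_D for the generalized symmetrization below is illustrative rather than necessarily the paper's own. The classical Jensen-Shannon divergence admits the variational characterization

\[
\mathrm{JS}(p, q) \;=\; \min_{c} \, \tfrac{1}{2}\bigl(\mathrm{KL}(p : c) + \mathrm{KL}(q : c)\bigr),
\]

with the minimum attained at the arithmetic mixture \(c^\ast = \tfrac{1}{2}(p + q)\). Sibson’s information radius extends this to \(n\) distributions \(p_1, \dots, p_n\) with weights \(w_i\):

\[
R(p_1, \dots, p_n) \;=\; \min_{c} \, \sum_{i=1}^{n} w_i \, \mathrm{KL}(p_i : c),
\]

minimized by the mixture \(\sum_i w_i p_i\). Replacing KL with an arbitrary distance \(D\) (and, in the paper, the arithmetic mean with a generic mean) then yields a Jensen-Shannon symmetrization of \(D\):

\[
\mathrm{JS}_D(p, q) \;:=\; \min_{c} \, \tfrac{1}{2}\bigl(D(p : c) + D(q : c)\bigr).
\]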
ISSN: 1099-4300