On the Jensen–Shannon Symmetrization of Distances Relying on Abstract Means
The Jensen–Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback–Leibler divergence, which measures the total Kullback–Leibler divergence to the average mixture distribution. However, the Jensen–Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen–Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen–Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback–Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen–Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen–Shannon divergences are touched upon.
Main Author: | Frank Nielsen (Sony Computer Science Laboratories, Takanawa Muse Bldg., 3-14-13, Higashigotanda, Shinagawa-ku, Tokyo 141-0022, Japan) |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2019-05-01 |
Series: | Entropy |
ISSN: | 1099-4300 |
DOI: | 10.3390/e21050485 |
Subjects: | Jensen–Shannon divergence; Jeffreys divergence; resistor average distance; Bhattacharyya distance; f-divergence; Jensen/Burbea–Rao divergence; Bregman divergence; abstract weighted mean; quasi-arithmetic mean; mixture family; statistical M-mixture; exponential family; Gaussian family; Cauchy scale family; clustering |
Online Access: | https://www.mdpi.com/1099-4300/21/5/485 |
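For quick reference, the following LaTeX sketch spells out the objects the abstract mentions. The classical Jensen–Shannon divergence is standard; the abstract-mean generalization is written in one plausible convention (the weighting, the normalization, and the placement of the mixture in the second Kullback–Leibler slot are assumptions here, not quoted from the paper).

```latex
% Classical Jensen–Shannon divergence: the total Kullback–Leibler divergence
% to the arithmetic mixture; it is symmetric and bounded by log 2.
\mathrm{JS}(p,q) = \tfrac{1}{2}\,\mathrm{KL}\!\Big(p : \tfrac{p+q}{2}\Big)
                 + \tfrac{1}{2}\,\mathrm{KL}\!\Big(q : \tfrac{p+q}{2}\Big) \le \log 2.

% Assumed abstract-mean generalization: replace the arithmetic mean by a
% weighted abstract mean M_\alpha and normalize the resulting M-mixture.
(pq)^{M}_{\alpha}(x) = \frac{M_\alpha\big(p(x),q(x)\big)}
                            {\int M_\alpha\big(p(t),q(t)\big)\,\mathrm{d}t},
\qquad
\mathrm{JS}^{M}_{\alpha}(p:q) = (1-\alpha)\,\mathrm{KL}\big(p : (pq)^{M}_{\alpha}\big)
                              + \alpha\,\mathrm{KL}\big(q : (pq)^{M}_{\alpha}\big).

% Why the geometric mean suits an exponential family
% p_\theta(x) = \exp(\langle\theta, t(x)\rangle - F(\theta)):
% the normalized geometric mixture stays inside the family,
\big(p_{\theta_1}\, p_{\theta_2}\big)^{G}_{\alpha} = p_{(1-\alpha)\theta_1 + \alpha\theta_2},
% so each KL term becomes a Bregman divergence of the log-normalizer F,
% \mathrm{KL}(p_{\theta} : p_{\theta'}) = B_F(\theta' : \theta),
% which is what makes closed-form expressions possible.
```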
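Below is a minimal numerical sketch in Python of the contrast the abstract draws for Gaussians, under the same assumed conventions (α = 1/2, mixture in the second KL slot): the geometric JS divergence evaluates in closed form through Bregman divergences on natural parameters, while the ordinary JS divergence must be integrated numerically. The function names, parametrization, and grid-integration scheme are illustrative choices, not the paper's code.

```python
import numpy as np

# Univariate Gaussian N(mu, sigma^2) as an exponential family:
#   p_theta(x) = exp(theta1*x + theta2*x^2 - F(theta)),
#   theta = (mu/sigma^2, -1/(2*sigma^2)),
#   F(theta) = -theta1^2/(4*theta2) + (1/2)*log(-pi/theta2).

def natural(mu, sigma2):
    return np.array([mu / sigma2, -0.5 / sigma2])

def F(theta):
    t1, t2 = theta
    return -t1 ** 2 / (4 * t2) + 0.5 * np.log(-np.pi / t2)

def gradF(theta):
    t1, t2 = theta
    mu, sigma2 = -t1 / (2 * t2), -0.5 / t2
    return np.array([mu, mu ** 2 + sigma2])        # (E[x], E[x^2])

def bregman(tp, tq):
    # B_F(tp : tq); recall KL(p_{tq} : p_{tp}) = B_F(tp : tq).
    return F(tp) - F(tq) - gradF(tq) @ (tp - tq)

def geometric_jsd(mu1, s1, mu2, s2):
    # The normalized geometric mixture of two same-family densities stays in
    # the family with the averaged natural parameter, so both KL terms reduce
    # to Bregman divergences: a closed-form evaluation.
    ta, tb = natural(mu1, s1), natural(mu2, s2)
    tm = 0.5 * (ta + tb)
    return 0.5 * (bregman(tm, ta) + bregman(tm, tb))

def classical_jsd(mu1, s1, mu2, s2, lo=-50.0, hi=50.0, n=200_001):
    # The ordinary JSD has no closed form for Gaussians: integrate on a grid.
    x = np.linspace(lo, hi, n)
    pdf = lambda mu, s2: np.exp(-(x - mu) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)
    p, q = pdf(mu1, s1), pdf(mu2, s2)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.trapz(a * np.log(np.maximum(a, 1e-300) / np.maximum(b, 1e-300)), x)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

print(geometric_jsd(0.0, 1.0, 1.0, 2.0))  # exact, via Bregman divergences
print(classical_jsd(0.0, 1.0, 1.0, 2.0))  # numeric, bounded above by log 2
```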