On the Jensen-Shannon Symmetrization of Distances Relying on Abstract Means
- PMID: 33267199
- PMCID: PMC7514974
- DOI: 10.3390/e21050485
Abstract
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
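The closed-form tractability described above can be illustrated for univariate Gaussians. The sketch below is an assumption-laden illustration (not the paper's reference implementation): it takes the weighted geometric mixture of two Gaussians, which is again a Gaussian whose precision and precision-weighted mean interpolate linearly, and defines the skew geometric JS divergence as (1-α)·KL(p:g) + α·KL(q:g) with g the normalized geometric mixture. The function names are hypothetical.

```python
import math

def kl_gauss(m1, s1, m2, s2):
    # Standard closed form for KL(N(m1, s1^2) : N(m2, s2^2)).
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def geometric_mixture(m1, s1, m2, s2, alpha=0.5):
    # Normalized weighted geometric mean of two Gaussian densities is a
    # Gaussian: precisions and precision-weighted means mix linearly.
    p1, p2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2      # precisions
    pg = (1 - alpha) * p1 + alpha * p2          # mixed precision
    mg = ((1 - alpha) * p1 * m1 + alpha * p2 * m2) / pg
    return mg, 1.0 / math.sqrt(pg)

def geometric_js(m1, s1, m2, s2, alpha=0.5):
    # Skew geometric Jensen-Shannon divergence (assumed definition):
    # (1 - alpha) * KL(p : g) + alpha * KL(q : g), g the geometric mixture.
    mg, sg = geometric_mixture(m1, s1, m2, s2, alpha)
    return ((1 - alpha) * kl_gauss(m1, s1, mg, sg)
            + alpha * kl_gauss(m2, s2, mg, sg))
```

For α = 1/2 the quantity is symmetric in its two arguments and vanishes when the two Gaussians coincide, mirroring the behaviour of the ordinary Jensen-Shannon divergence while staying in closed form.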
Keywords: Bhattacharyya distance; Bregman divergence; Cauchy scale family; Gaussian family; Jeffreys divergence; Jensen/Burbea–Rao divergence; Jensen–Shannon divergence; abstract weighted mean; clustering; exponential family; f-divergence; mixture family; quasi-arithmetic mean; resistor average distance; statistical M-mixture.
Conflict of interest statement
The author declares no conflict of interest.
