On the Jensen-Shannon Symmetrization of Distances Relying on Abstract Means
- PMID: 33267199
- PMCID: PMC7514974
- DOI: 10.3390/e21050485
Abstract
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
Keywords: Bhattacharyya distance; Bregman divergence; Cauchy scale family; Gaussian family; Jeffreys divergence; Jensen/Burbea–Rao divergence; Jensen–Shannon divergence; abstract weighted mean; clustering; exponential family; f-divergence; mixture family; quasi-arithmetic mean; resistor average distance; statistical M-mixture.
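The abstract's first closed-form result can be sketched concretely: for two densities of the same exponential family with natural parameters θ_p, θ_q and log-normalizer F, the geometric Jensen-Shannon divergence reduces to a skew Jensen (Burbea–Rao) divergence, JS^G_α(p:q) = (1−α)F(θ_p) + αF(θ_q) − F((1−α)θ_p + αθ_q). The minimal Python sketch below (not the paper's code; function names are illustrative) instantiates this for univariate Gaussians, whose ordinary Jensen-Shannon divergence admits no closed form.

```python
import math

def gaussian_natural(mu, sigma2):
    """Natural parameters (theta1, theta2) of the Gaussian N(mu, sigma2)."""
    return (mu / sigma2, -1.0 / (2.0 * sigma2))

def gaussian_log_normalizer(theta):
    """Log-normalizer F(theta) of the univariate Gaussian exponential family."""
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * math.log(-math.pi / t2)

def geometric_jsd(mu1, sigma2_1, mu2, sigma2_2, alpha=0.5):
    """Closed-form geometric JSD between two Gaussians: the weighted geometric
    mixture stays inside the family (its natural parameter is the weighted
    average), so the divergence is a skew Jensen divergence on F."""
    tp = gaussian_natural(mu1, sigma2_1)
    tq = gaussian_natural(mu2, sigma2_2)
    tmix = tuple((1 - alpha) * a + alpha * b for a, b in zip(tp, tq))
    F = gaussian_log_normalizer
    return (1 - alpha) * F(tp) + alpha * F(tq) - F(tmix)
```

Since F is strictly convex, the value is non-negative and vanishes only when the two parameter vectors coincide; for α = 1/2 the expression is symmetric in its arguments, matching the symmetrization property claimed in the abstract.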
Conflict of interest statement
The author declares no conflict of interest.