On the Jensen-Shannon Symmetrization of Distances Relying on Abstract Means

Frank Nielsen. Entropy (Basel). 2019 May 11;21(5):485. doi: 10.3390/e21050485.

Abstract

The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence: it measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means, which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any distance using parameter mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report two closed-form formulas for (i) the geometric Jensen-Shannon divergence between probability densities of the same exponential family; and (ii) the geometric JS-symmetrization of the reverse Kullback-Leibler divergence between probability densities of the same exponential family. As a second illustrative example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
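To make the abstract's construction concrete, the following is a minimal Python sketch of the skewed geometric Jensen-Shannon divergence between univariate Gaussians. It assumes the natural definition JS^G_alpha(p:q) = (1 - alpha) KL(p : G_alpha(p,q)) + alpha KL(q : G_alpha(p,q)), where G_alpha(p,q) is the normalized weighted geometric mean p^(1-alpha) q^alpha; the function names are illustrative, not from the paper. The closed form exists because exponential families are closed under normalized geometric mixtures (natural parameters interpolate linearly), so the geometric mean of two Gaussians is again a Gaussian and each Kullback-Leibler term reduces to the textbook Gaussian formula.

    import math

    def kl_gauss(mu1, var1, mu2, var2):
        # Closed-form KL(N(mu1, var1) : N(mu2, var2)) between univariate Gaussians.
        return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

    def geometric_mean_gauss(mu1, var1, mu2, var2, alpha=0.5):
        # Normalized weighted geometric mean p^(1-alpha) * q^alpha of two Gaussians.
        # It is again a Gaussian: precisions (1/var) and precision-weighted means
        # interpolate linearly in the natural parameters of the exponential family.
        prec = (1.0 - alpha) / var1 + alpha / var2
        var_g = 1.0 / prec
        mu_g = var_g * ((1.0 - alpha) * mu1 / var1 + alpha * mu2 / var2)
        return mu_g, var_g

    def geometric_js_gauss(mu1, var1, mu2, var2, alpha=0.5):
        # Skewed geometric Jensen-Shannon divergence, evaluated in closed form:
        # (1 - alpha) * KL(p : G_alpha) + alpha * KL(q : G_alpha).
        mu_g, var_g = geometric_mean_gauss(mu1, var1, mu2, var2, alpha)
        return ((1.0 - alpha) * kl_gauss(mu1, var1, mu_g, var_g)
                + alpha * kl_gauss(mu2, var2, mu_g, var_g))

    # Example: an exact value where the ordinary JSD would need numerical integration.
    print(geometric_js_gauss(0.0, 1.0, 1.0, 2.0))

By contrast, the ordinary Jensen-Shannon divergence replaces the geometric mean with the arithmetic mixture (p + q)/2, which leaves the Gaussian family and is why no closed form is available in that case.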

Keywords: Bhattacharyya distance; Bregman divergence; Cauchy scale family; Gaussian family; Jeffreys divergence; Jensen/Burbea–Rao divergence; Jensen–Shannon divergence; abstract weighted mean; clustering; exponential family; f-divergence; mixture family; quasi-arithmetic mean; resistor average distance; statistical M-mixture.

Conflict of interest statement

The author declares no conflict of interest.
