Estimating mutual information
- PMID: 15244698
- DOI: 10.1103/PhysRevE.69.066138
Erratum in
- Phys Rev E Stat Nonlin Soft Matter Phys. 2011 Jan;83(1 Pt 1):019903
Abstract
We present two classes of improved estimators for mutual information M(X,Y), from samples of random points distributed according to some joint probability density mu(x,y). In contrast to conventional estimators based on binning, they are based on entropy estimates from k-nearest-neighbor distances. This means that they are data efficient (with k=1 we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to nonuniformity of the density at the smallest resolved scale, typically giving systematic errors that scale as functions of k/N for N points. Numerically, we find that both families become exact for independent distributions, i.e., the estimator M(X,Y) vanishes (up to statistical fluctuations) if mu(x,y) = mu(x)mu(y). This holds for all tested marginal distributions and for all dimensions of x and y. In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
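For readers who want to try the method, below is a minimal NumPy/SciPy sketch of the paper's first estimator ("algorithm 1"), I(X,Y) = psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>, where psi is the digamma function and n_x, n_y count each point's neighbors, in the two marginal spaces, that fall strictly within its k-nearest-neighbor distance in the joint space (max-norm). The function name and the use of scipy.spatial.cKDTree are our choices for illustration, not part of the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """KSG estimator ("algorithm 1") of I(X;Y) in nats.

    x, y : arrays of shape (N, d_x) and (N, d_y), one sample per row.
    k    : number of nearest neighbors (k=1 resolves the finest scales,
           at the price of larger statistical fluctuations).
    """
    n = len(x)
    xy = np.hstack((x, y))

    # Distance to the k-th neighbor in the joint space under the max-norm;
    # the query returns each point itself first, hence k + 1.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbors strictly inside eps_i (np.nextafter shrinks
    # each radius to the next smaller float), excluding the point itself.
    r = np.nextafter(eps, 0)
    nx = cKDTree(x).query_ball_point(x, r, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, r, p=np.inf, return_length=True) - 1

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

A convenient sanity check: for bivariate Gaussians with correlation rho the exact value is I = -0.5 ln(1 - rho^2), and for rho = 0 the estimate should vanish up to statistical fluctuations, exactly as the abstract claims for independent distributions.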
Similar articles
- Minimax mutual information approach for independent component analysis. Neural Comput. 2004 Jun;16(6):1235-52. doi: 10.1162/089976604773717595. PMID: 15130248
- Least-dependent-component analysis based on mutual information. Phys Rev E Stat Nonlin Soft Matter Phys. 2004 Dec;70(6 Pt 2):066123. doi: 10.1103/PhysRevE.70.066123. Epub 2004 Dec 13. PMID: 15697450
- Nonparametric k-nearest-neighbor entropy estimator. Phys Rev E. 2016 Jan;93(1):013310. doi: 10.1103/PhysRevE.93.013310. Epub 2016 Jan 21. PMID: 26871193
- Bias reduction in the estimation of mutual information. Phys Rev E Stat Nonlin Soft Matter Phys. 2014 Nov;90(5-1):052714. doi: 10.1103/PhysRevE.90.052714. Epub 2014 Nov 17. PMID: 25493823
- The properties of bio-energy transport and influence of structure nonuniformity and temperature of systems on energy transport along polypeptide chains. Prog Biophys Mol Biol. 2012 Jan;108(1-2):1-46. doi: 10.1016/j.pbiomolbio.2011.09.005. Epub 2011 Sep 17. PMID: 21951575. Review.
Cited by
- Information Bottleneck Driven Deep Video Compression-IBOpenDVCW. Entropy (Basel). 2024 Sep 30;26(10):836. doi: 10.3390/e26100836. PMID: 39451913. Free PMC article.
- Allosteric pathways in imidazole glycerol phosphate synthase. Proc Natl Acad Sci U S A. 2012 May 29;109(22):E1428-36. doi: 10.1073/pnas.1120536109. Epub 2012 May 14. PMID: 22586084. Free PMC article.
- Carving Nature at Its Joints: A Comparison of CEMI Field Theory with Integrated Information Theory and Global Workspace Theory. Entropy (Basel). 2023 Dec 8;25(12):1635. doi: 10.3390/e25121635. PMID: 38136515. Free PMC article.
- Nonlinear optical encoding enabled by recurrent linear scattering. Nat Photonics. 2024;18(10):1067-1075. doi: 10.1038/s41566-024-01493-0. Epub 2024 Jul 31. PMID: 39372105. Free PMC article.
- Mutual information rate and bounds for it. PLoS One. 2012;7(10):e46745. doi: 10.1371/journal.pone.0046745. Epub 2012 Oct 24. PMID: 23112809. Free PMC article.