Information processing in echo state networks at the edge of chaos
- PMID: 22147532
- DOI: 10.1007/s12064-011-0146-8
Abstract
We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between a stable and an unstable dynamical regime, the so-called edge of chaos. The reasons for this maximized performance, however, are not completely understood. We adopt an information-theoretic framework and are, for the first time, able to quantify the computational capabilities between elements of these networks directly as they undergo the phase transition to chaos. Specifically, we present evidence that both information transfer and storage in the recurrent layer are maximized close to this phase transition, providing an explanation for why guiding the recurrent layer toward the edge of chaos is computationally useful. As a consequence, our study suggests self-organized, input-driven ways of improving performance in recurrent neural networks. Moreover, the networks we study share important features with biological systems, such as feedback connections and online computation on input streams. A key example is the cerebral cortex, which has also been shown to operate close to the edge of chaos. Consequently, the behavior of model systems as studied here is likely to shed light on why biological systems are tuned into this specific regime.
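For orientation, the sketch below illustrates the kind of echo state network the abstract refers to: a sparse random recurrent layer whose weight matrix is rescaled so its spectral radius sits near 1, a common heuristic proxy for placing the reservoir near the ordered/chaotic boundary. This is a minimal illustration, not the authors' implementation; the reservoir size, sparsity, input scaling, and the exact spectral radius are assumed values chosen only for the example.

```python
import numpy as np

def make_reservoir(n_units=200, density=0.1, spectral_radius=0.95, seed=0):
    """Build a sparse random recurrent weight matrix and rescale it so its
    spectral radius sits just below 1 -- a common heuristic for placing an
    echo state network near the edge of chaos (illustrative values only)."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_units, n_units))
    W *= rng.random((n_units, n_units)) < density        # sparsify the coupling
    rho = np.max(np.abs(np.linalg.eigvals(W)))            # current spectral radius
    return W * (spectral_radius / rho)                    # rescale toward the edge

def run_reservoir(W, inputs, w_in_scale=0.5, seed=1):
    """Drive the reservoir with a 1-D input stream using the standard
    tanh state update and return the state trajectory."""
    rng = np.random.default_rng(seed)
    n_units = W.shape[0]
    w_in = w_in_scale * rng.standard_normal(n_units)      # random input weights
    x = np.zeros(n_units)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)                     # recurrent + input drive
        states.append(x.copy())
    return np.array(states)

if __name__ == "__main__":
    u = np.random.default_rng(2).standard_normal(500)     # random input stream
    W = make_reservoir(spectral_radius=0.95)               # near the stability boundary
    X = run_reservoir(W, u)
    print(X.shape)                                         # (500, 200) state matrix
```

In studies of this kind, performance and information-theoretic measures such as transfer entropy and active information storage are typically evaluated while a control parameter (here the spectral radius) is swept through the transition to chaos; the code above only generates the state trajectories on which such measures would be computed.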
Similar articles
- Real-time computation at the edge of chaos in recurrent neural networks. Neural Comput. 2004 Jul;16(7):1413-36. doi: 10.1162/089976604323057443. PMID: 15165396
- Coherent chaos in a recurrent neural network with structured connectivity. PLoS Comput Biol. 2018 Dec 13;14(12):e1006309. doi: 10.1371/journal.pcbi.1006309. PMID: 30543634. Free PMC article.
- Dynamics and Information Import in Recurrent Neural Networks. Front Comput Neurosci. 2022 Apr 27;16:876315. doi: 10.3389/fncom.2022.876315. PMID: 35573264. Free PMC article.
- How critical is brain criticality? Trends Neurosci. 2022 Nov;45(11):820-837. doi: 10.1016/j.tins.2022.08.007. PMID: 36096888. Review.
- Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons. J Physiol Paris. 2007 Jan-May;101(1-3):136-48. doi: 10.1016/j.jphysparis.2007.10.003. PMID: 18042357. Review.
Cited by
- Functional advantages of Lévy walks emerging near a critical point. Proc Natl Acad Sci U S A. 2020 Sep 29;117(39):24336-24344. doi: 10.1073/pnas.2001548117. PMID: 32929032. Free PMC article.
- Information theoretic approaches for inference of biological networks from continuous-valued data. BMC Syst Biol. 2016 Sep 6;10(1):89. doi: 10.1186/s12918-016-0331-y. PMID: 27599566. Free PMC article.
- Theory of Gating in Recurrent Neural Networks. Phys Rev X. 2022 Jan-Mar;12(1):011011. doi: 10.1103/physrevx.12.011011. PMID: 36545030. Free PMC article.
- On the Criticality of Adaptive Boolean Network Robots. Entropy (Basel). 2022 Sep 27;24(10):1368. doi: 10.3390/e24101368. PMID: 37420388. Free PMC article.
- Collective dynamics and long-range order in thermal neuristor networks. Nat Commun. 2024 Aug 14;15(1):6986. doi: 10.1038/s41467-024-51254-4. PMID: 39143044. Free PMC article.