Initialization and self-organized optimization of recurrent neural network connectivity
- PMID: 20357891
- PMCID: PMC2801534
- DOI: 10.2976/1.3240502
Abstract
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. Networks in RC have a sparsely and randomly connected fixed hidden layer, and only the output connections are trained. RC networks have recently received increased attention as a mathematical model for generic neural microcircuits, used to investigate and explain computations in neocortical columns. When applied to specific tasks, however, their fixed random connectivity leads to significant variation in performance. Few problem-specific optimization procedures are known; such procedures would be important for engineering applications, but also for understanding how networks in biology are shaped to be optimally adapted to the requirements of their environment. We study a general network initialization method using permutation matrices and derive a new unsupervised learning rule based on intrinsic plasticity (IP). The IP-based learning uses only local information, and its aim is to improve network performance in a self-organized way. Using three different benchmarks, we show that networks whose reservoir connectivity is given by permutation matrices have much more persistent memory than networks initialized by the other methods, while remaining able to perform highly nonlinear mappings. We also show that IP based on sigmoid transfer functions is limited with respect to the output distributions that can be achieved.
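The two ingredients of the abstract can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: it builds a reservoir weight matrix from a random permutation matrix (every unit has exactly one incoming and one outgoing connection, and scaling the matrix fixes its spectral radius, since all eigenvalues of a permutation matrix lie on the unit circle) and then applies a per-neuron intrinsic-plasticity update of the Triesch type for sigmoid units, which adapts each neuron's gain and bias toward an exponential output distribution. All concrete numbers (reservoir size 100, spectral radius 0.95, learning rate, target mean) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100  # reservoir size (illustrative choice)

# --- Permutation-matrix initialization ---
# W[i, perm[i]] = 0.95 gives each unit exactly one incoming and one
# outgoing connection; eigenvalues of a permutation matrix lie on the
# unit circle, so the spectral radius of W is exactly 0.95.
perm = rng.permutation(N)
W = np.zeros((N, N))
W[np.arange(N), perm] = 0.95

W_in = rng.uniform(-0.1, 0.1, size=N)  # input weights (assumed range)

# --- Intrinsic plasticity for sigmoid units (Triesch-style rule) ---
# Each neuron adapts its gain a and bias b using only its own net
# input and output, driving the output distribution toward an
# exponential with mean mu.
a = np.ones(N)
b = np.zeros(N)
eta, mu = 1e-3, 0.2  # learning rate and target mean (assumed values)

x = np.zeros(N)
for t in range(500):
    u = rng.uniform(-1.0, 1.0)                 # random drive signal
    net = W @ x + W_in * u                     # per-neuron net input
    x = 1.0 / (1.0 + np.exp(-(a * net + b)))   # sigmoid activation
    # Local IP updates (no information from other neurons is used):
    db = eta * (1.0 - (2.0 + 1.0 / mu) * x + (x ** 2) / mu)
    da = eta / a + db * net
    a += da
    b += db
```

In the RC setting only a linear readout from the states `x` would be trained (e.g. by ridge regression); the reservoir matrix `W` itself stays fixed, and IP only reshapes the transfer functions.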