From statistical inference to a differential learning rule for stochastic neural networks
- PMID: 30443331
- PMCID: PMC6227809
- DOI: 10.1098/rsfs.2018.0033
Abstract
Stochastic neural networks are a prototypical computational device able to build a probabilistic representation of an ensemble of external stimuli. Building on the relationship between inference and learning, we derive a synaptic plasticity rule that relies only on delayed activity correlations and that shows a number of remarkable features. Our delayed-correlations matching (DCM) rule satisfies some basic requirements for biological feasibility: finite and noisy afferent signals, Dale's principle and asymmetry of synaptic connections, and locality of the weight update computations. Nevertheless, the DCM rule can store an extensive number of patterns (scaling with network size) as attractors of a stochastic recurrent neural network, and it handles general scenarios without requiring any modification: correlated patterns, a broad range of architectures (with or without hidden neuronal states) and one-shot learning with the palimpsest property, all while avoiding the proliferation of spurious attractors. When hidden units are present, our learning rule can be employed to construct Boltzmann machine-like generative models, exploiting the addition of hidden neurons in feature extraction and classification tasks.
Keywords: associative memory; attractor networks; learning.
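The abstract describes a plasticity rule that matches delayed activity correlations so that stored patterns become attractors of a stochastic recurrent network. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' exact algorithm: it assumes parallel Glauber dynamics on ±1 units and a Boltzmann-machine-style contrast between one-step-delayed correlations in a "clamped" phase (the next state pinned to the pattern) and a "free" phase (the network running on its own). All names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20      # number of neurons
P = 3       # number of patterns to store
eta = 0.05  # learning rate
T = 200     # time steps per sampling phase
beta = 2.0  # inverse temperature of the stochastic units

patterns = rng.choice([-1, 1], size=(P, N))
W = np.zeros((N, N))  # synaptic weights; diagonal kept at zero below

def step(s, W):
    """One parallel Glauber update: each unit fires with sigmoidal probability."""
    h = W @ s
    p_on = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    return np.where(rng.random(N) < p_on, 1, -1)

def delayed_corr(W, s0, clamp=None, T=T):
    """Average one-step-delayed correlation <s_i(t+1) s_j(t)> along a trajectory."""
    C = np.zeros((N, N))
    s = s0.copy()
    for _ in range(T):
        s_next = clamp if clamp is not None else step(s, W)
        C += np.outer(s_next, s)
        s = s_next
    return C / T

for epoch in range(100):
    for mu in range(P):
        xi = patterns[mu]
        C_plus = delayed_corr(W, xi, clamp=xi)   # clamped phase: target correlations
        C_minus = delayed_corr(W, xi)            # free phase: model correlations
        W += eta * (C_plus - C_minus)            # match delayed correlations
        np.fill_diagonal(W, 0.0)                 # no self-couplings

# retrieval check: start from a corrupted pattern, relax, measure overlap
xi = patterns[0]
s = np.where(rng.random(N) < 0.1, -xi, xi)  # flip ~10% of the bits
for _ in range(50):
    s = step(s, W)
overlap = (s @ xi) / N
print(f"overlap with stored pattern: {overlap:.2f}")
```

The update vanishes exactly when the network's own delayed correlations around a pattern match the clamped ones, i.e. when the pattern is a (stochastic) fixed point, which is the sense in which a rule of this kind "matches" delayed correlations. Note that the weights produced this way are generally asymmetric, consistent with the biological constraints the abstract lists.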
Conflict of interest statement
We declare we have no competing interests.