Synaptic plasticity as Bayesian inference
- PMID: 33707754
- PMCID: PMC7617048
- DOI: 10.1038/s41593-021-00809-5
Abstract
Learning, especially rapid learning, is critical for survival. However, learning is hard; a large number of synaptic weights must be set based on noisy, often ambiguous, sensory information. In such a high-noise regime, keeping track of probability distributions over weights is the optimal strategy. Here we hypothesize that synapses adopt that strategy; in essence, when they estimate weights, they include error bars. They then use that uncertainty to adjust their learning rates, with more uncertain weights having higher learning rates. We also make a second, independent, hypothesis: synapses communicate their uncertainty by linking it to variability in postsynaptic potential size, with more uncertainty leading to more variability. These two hypotheses cast synaptic plasticity as a problem of Bayesian inference, and thus provide a normative view of learning. They generalize known learning rules, offer an explanation for the large variability in the size of postsynaptic potentials, and make falsifiable experimental predictions.
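The core idea of uncertainty-dependent learning rates can be illustrated with a minimal sketch. The code below is an assumption-laden toy model, not the paper's derivation: it treats a single synaptic weight as a Gaussian belief (mean plus variance, i.e. "error bars") and applies a Kalman-style update in which the learning rate is set by the weight's uncertainty relative to the observation noise, so more uncertain weights learn faster and uncertainty shrinks with each observation.

```python
def bayesian_synapse_update(mu, var, error, obs_var):
    """One Bayesian (Kalman-style) update of a scalar synaptic weight.

    mu, var  -- current belief about the weight: mean and variance
    error    -- observed prediction error driving the update
    obs_var  -- variance of the observation noise

    The learning rate grows with the weight's own uncertainty `var`,
    matching the hypothesis that uncertain weights learn faster.
    """
    lr = var / (var + obs_var)   # uncertainty-dependent learning rate
    mu_new = mu + lr * error     # shift the mean toward the observation
    var_new = (1.0 - lr) * var   # uncertainty shrinks after each update
    return mu_new, var_new


# An uncertain synapse (var = 1.0) moves much further on the same error
# than a confident one (var = 0.1).
mu_a, var_a = bayesian_synapse_update(0.0, 1.0, 1.0, obs_var=1.0)
mu_b, var_b = bayesian_synapse_update(0.0, 0.1, 1.0, obs_var=1.0)
```

Under this toy model, the second hypothesis (uncertainty communicated through postsynaptic potential variability) would correspond to sampling the effective weight on each transmission with a standard deviation that grows with `var`; that sampling step is omitted here for brevity.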
Conflict of interest statement
The authors declare no competing interests.