A sparse quantized hopfield network for online-continual memory
- PMID: 38697981
- PMCID: PMC11065890
- DOI: 10.1038/s41467-024-46976-4
Abstract
An important difference between brains and deep neural networks is the way they learn. Nervous systems learn online, from a stream of noisy data points presented in a non-independent and identically distributed (non-i.i.d.) way. Further, synaptic plasticity in the brain depends only on information local to each synapse. Deep networks, on the other hand, typically use non-local learning algorithms and are trained in an offline, noise-free, i.i.d. setting. Understanding how neural networks learn under the same constraints as the brain is an open problem for neuroscience and neuromorphic computing, and no standard approach to it has yet been established. In this paper, we propose that discrete graphical models trained with an online maximum a posteriori learning algorithm could provide such an approach. We implement this kind of model in a neural network called the Sparse Quantized Hopfield Network. We show that our model outperforms state-of-the-art neural networks on associative memory tasks, outperforms these networks in online, continual settings, learns efficiently from noisy inputs, and beats baselines on an episodic memory task.
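The abstract does not give the model's equations, but the core idea it builds on (an associative memory trained with purely local, online updates) can be illustrated with a classic binary Hopfield network and a Hebbian learning rule. This is a generic sketch, not the paper's sparse quantized variant: each weight update uses only the activities of its two endpoint units (local plasticity), and patterns are absorbed one at a time (online).

```python
import numpy as np

def train_online(patterns, n):
    # Hebbian rule: W[i, j] accumulates p[i] * p[j], information local
    # to the synapse connecting units i and j. Patterns arrive one at
    # a time, so no batch of data is ever needed.
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p) / n
    np.fill_diagonal(W, 0)  # no self-connections
    return W

def recall(W, state, steps=20):
    # Iterate the network dynamics to settle from a noisy cue
    # toward a stored pattern (a fixed point of the dynamics).
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1  # break ties consistently
    return state

rng = np.random.default_rng(0)
n = 100
patterns = [rng.choice([-1, 1], size=n) for _ in range(3)]
W = train_online(patterns, n)

# Corrupt 10% of the bits of a stored pattern and try to recover it.
cue = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
cue[flip] *= -1
out = recall(W, cue)
```

With only a few stored patterns relative to the network size, `out` typically recovers `patterns[0]` exactly; capacity and noise robustness are precisely what the paper's sparse, quantized formulation aims to improve over this classical baseline.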
© 2024. The Author(s).
Conflict of interest statement
The authors declare no competing interests.
Grants and funding
- FA9550-19-1-0306/United States Department of Defense | United States Air Force | AFMC | Air Force Office of Scientific Research (AF Office of Scientific Research)
- ID 2024633/National Science Foundation (NSF)