In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks
- PMID: 38785630
- PMCID: PMC11119823
- DOI: 10.3390/e26050381
Abstract
Uncovering the mechanisms behind long-term memory is one of the most fascinating open problems in neuroscience and artificial intelligence. Artificial associative memory networks have been used to formalize important aspects of biological memory. Generative diffusion models are a class of generative machine learning techniques that have shown great performance in many tasks. Like associative memory systems, these networks define a dynamical system that converges to a set of target states. In this work, we show that generative diffusion models can be interpreted as energy-based models and that, when trained on discrete patterns, their energy function is (asymptotically) identical to that of modern Hopfield networks. This equivalence allows us to interpret the supervised training of diffusion models as a synaptic learning process that encodes the associative dynamics of a modern Hopfield network in the weight structure of a deep neural network. Leveraging this connection, we formulate a generalized framework for understanding the formation of long-term memory, in which creative generation and memory recall can be seen as parts of a unified continuum.
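As a rough sketch of the stated asymptotic equivalence (under simplifying assumptions, not the paper's full derivation): suppose the data distribution is a uniform mixture of N delta functions on equal-norm patterns \(\xi_1, \dots, \xi_N\) and the forward diffusion adds isotropic Gaussian noise with scale \(\sigma_t\). The smoothed density at time t is then a Gaussian mixture,
\[ p_t(x) = \frac{1}{N}\sum_{i=1}^{N}\mathcal{N}\!\left(x;\,\xi_i,\,\sigma_t^{2} I\right), \]
so the associated energy is
\[ E_t(x) = -\log p_t(x) = \frac{\lVert x\rVert^{2}}{2\sigma_t^{2}} \;-\; \log\sum_{i=1}^{N}\exp\!\left(\frac{x\cdot\xi_i}{\sigma_t^{2}}\right) \;+\; \mathrm{const}, \]
which has the log-sum-exp form of a modern Hopfield network energy with inverse temperature \(\beta = 1/\sigma_t^{2}\). Since the trained denoiser approximates the score \(\nabla_x \log p_t(x) = -\nabla_x E_t(x)\), the reverse diffusion dynamics descend this energy toward the stored patterns as \(\sigma_t \to 0\), which is what makes diffusion sampling resemble associative recall.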
Keywords: associative memory networks; generative diffusion models; Hopfield networks.
Conflict of interest statement
The authors declare no conflict of interest.
Similar articles
- Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models. Proc Mach Learn Res. 2022 Jul;162:15561-15583. PMID: 36751405. Free PMC article.
- Associative memory realized by a reconfigurable memristive Hopfield neural network. Nat Commun. 2015 Jun 25;6:7522. doi: 10.1038/ncomms8522. PMID: 26108993.
- Neural associative memory with optimal Bayesian learning. Neural Comput. 2011 Jun;23(6):1393-451. doi: 10.1162/NECO_a_00127. Epub 2011 Mar 11. PMID: 21395440.
- Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks. Entropy (Basel). 2020 Dec 29;23(1):34. doi: 10.3390/e23010034. PMID: 33383716. Free PMC article. Review.
- The memory systems of the human brain and generative artificial intelligence. Heliyon. 2024 May 24;10(11):e31965. doi: 10.1016/j.heliyon.2024.e31965. eCollection 2024 Jun 15. PMID: 38841455. Free PMC article. Review.
Cited by
- The Statistical Thermodynamics of Generative Diffusion Models: Phase Transitions, Symmetry Breaking, and Critical Instability. Entropy (Basel). 2025 Mar 11;27(3):291. doi: 10.3390/e27030291. PMID: 40149215. Free PMC article.
- Explosive neural networks via higher-order interactions in curved statistical manifolds. Nat Commun. 2025 Jul 24;16(1):6511. doi: 10.1038/s41467-025-61475-w. PMID: 40707463. Free PMC article.