Entropy (Basel). 2024 Apr 29;26(5):381. doi: 10.3390/e26050381.

In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks


Luca Ambrogioni.

Abstract

Uncovering the mechanisms behind long-term memory is one of the most fascinating open problems in neuroscience and artificial intelligence. Artificial associative memory networks have been used to formalize important aspects of biological memory. Generative diffusion models are a class of generative machine learning techniques that have shown great performance in many tasks. Like associative memory systems, these networks define a dynamical system that converges to a set of target states. In this work, we show that generative diffusion models can be interpreted as energy-based models and that, when trained on discrete patterns, their energy function is (asymptotically) identical to that of modern Hopfield networks. This equivalence allows us to interpret the supervised training of diffusion models as a synaptic learning process that encodes the associative dynamics of a modern Hopfield network in the weight structure of a deep neural network. Leveraging this connection, we formulate a generalized framework for understanding the formation of long-term memory, where creative generation and memory recall can be seen as parts of a unified continuum.
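As an illustrative sketch (not the paper's implementation), the modern Hopfield retrieval dynamics referenced in the abstract can be written as a softmax-weighted update over stored patterns; the specific patterns and the inverse temperature `beta` below are illustrative choices:

```python
import numpy as np

def modern_hopfield_update(x, patterns, beta):
    """One retrieval step: a softmax-weighted combination of stored patterns."""
    sims = beta * (patterns @ x)      # similarity of the state to each pattern
    w = np.exp(sims - sims.max())     # numerically stable softmax weights
    w /= w.sum()
    return patterns.T @ w

# Four distinct binary patterns in a five-dimensional space.
patterns = np.array([
    [ 1.,  1.,  1.,  1.,  1.],
    [ 1., -1.,  1., -1.,  1.],
    [-1.,  1., -1.,  1., -1.],
    [-1., -1., -1., -1., -1.],
])

# A noisy cue near the first pattern converges to it under iteration.
x = patterns[0] + np.array([0.3, -0.2, 0.1, 0.25, -0.15])
for _ in range(10):
    x = modern_hopfield_update(x, patterns, beta=8.0)
```

With a sufficiently large `beta`, a single update already lands essentially on the nearest stored pattern, which is the point-attractor behaviour the paper relates to the fixed points of diffusion-model denoising.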

Keywords: associative memory networks; generative diffusion models; Hopfield networks.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Visualization of different kinds of energy landscapes and gradient vector fields corresponding to different forms of memory (in a two-dimensional space): (a) classical point-like memory; (b) extended localized memory; (c) non-localized (semantic) memory structure. The color denotes the probability density of the learned distribution, while the lines represent the integral trajectories of the vector field induced by the score function.
Figure 2
Qualitative analysis of the (marginal) denoising trajectories of a binary associative memory problem with four patterns in a five-dimensional space. (a) Comparison between the denoising trajectories of diffusion models and modern Hopfield updates. The diffusion curves are integrated using the Euler method with 2000 steps, and the trajectories are overlaid on four modern Hopfield updates. (b) Comparison between exact and learned deterministic denoising trajectories. The colors identify individual trajectories.
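A minimal stand-in for the exact denoising trajectories described in the caption can be obtained from the closed-form score of a Gaussian mixture centred on four binary patterns in five dimensions; the noise schedule, step size, and starting cue below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

# Four binary patterns in a five-dimensional space, as in the figure.
patterns = np.array([
    [ 1.,  1.,  1.,  1.,  1.],
    [ 1., -1.,  1., -1.,  1.],
    [-1.,  1., -1.,  1., -1.],
    [-1., -1., -1., -1., -1.],
])

def score(x, patterns, sigma):
    """Exact score of a Gaussian mixture centred on the stored patterns:
    grad log p_sigma(x) = sum_i w_i(x) (y_i - x) / sigma^2,
    with softmax weights w_i over negative squared distances."""
    d2 = ((patterns - x) ** 2).sum(axis=1)
    w = np.exp(-(d2 - d2.min()) / (2 * sigma**2))
    w /= w.sum()
    return (w @ (patterns - x)) / sigma**2

# Euler steps along the score with a decreasing noise level: a crude
# deterministic denoiser that relaxes a cue onto the nearest pattern.
x = np.array([0.5, 0.1, -0.2, 0.3, 0.0])   # arbitrary noisy start
for sigma in np.linspace(2.0, 0.05, 2000):
    x = x + 0.01 * sigma**2 * score(x, patterns, sigma)
```

As the noise level shrinks, the softmax weights concentrate on the nearest pattern and the trajectory terminates at it, mirroring the convergence of the modern Hopfield updates shown in panel (a).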
Figure 3
(a) Median error of the exact diffusion model as a function of the dimensionality. (b) Capacity of diffusion models and Hopfield networks on a log scale. The shaded area denotes the estimated 95% intervals.


