Review
. 2020 Dec 29;23(1):34.
doi: 10.3390/e23010034.

Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks


Chiara Marullo et al. Entropy (Basel). 2020.

Abstract

The Hopfield model and the Boltzmann machine are among the most popular examples of neural networks. The latter, widely used for classification and feature detection, can efficiently learn a generative model from observed data and constitutes a benchmark for statistical learning. The former, designed to mimic the retrieval phase of an artificial associative memory, lies between two paradigmatic statistical-mechanics models, namely the Curie-Weiss and the Sherrington-Kirkpatrick models, which are recovered as the limiting cases of one and many stored memories, respectively. Interestingly, the Boltzmann machine and the Hopfield network, viewed as two cognitive processes (learning and information retrieval), are two sides of the same coin: it is possible to map the one exactly onto the other. We inspect this equivalence by retracing the most representative steps of the research in this field.
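The exact mapping mentioned above can be illustrated numerically: integrating out Gaussian hidden units from a hybrid restricted Boltzmann machine reproduces, configuration by configuration, the Boltzmann weight of a Hopfield network with Hebbian couplings. The sketch below is not taken from the paper; the variable names and the sqrt(beta/N) rescaling of the couplings are choices made here, following the standard Gaussian (Hubbard-Stratonovich) identity.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, beta = 6, 2, 0.7           # visible units, patterns/hidden units, inverse temperature
xi = rng.choice([-1, 1], size=(P, N))  # stored binary patterns

def hopfield_weight(sigma):
    # Hopfield Boltzmann factor with Hebbian couplings:
    # H(sigma) = -(1/2N) * sum_mu (xi_mu . sigma)^2, weight = exp(-beta * H)
    m = xi @ sigma
    return np.exp(beta * (m @ m) / (2 * N))

def rbm_marginal_weight(sigma, n_grid=4001, z_max=10.0):
    # Numerically integrate out the Gaussian hidden units z_mu of a hybrid RBM
    # whose interaction term is sqrt(beta/N) * sum_{i,mu} xi[mu,i]*sigma_i*z_mu,
    # with a standard-Gaussian prior exp(-z_mu^2/2)/sqrt(2*pi) on each z_mu.
    z = np.linspace(-z_max, z_max, n_grid)
    dz = z[1] - z[0]
    m = xi @ sigma
    w = 1.0
    for mu in range(P):
        integrand = np.exp(-0.5 * z**2 + np.sqrt(beta / N) * m[mu] * z)
        w *= integrand.sum() * dz / np.sqrt(2 * np.pi)
    return w

# The two weights agree for every visible configuration, up to quadrature error,
# because each one-dimensional Gaussian integral yields exp(beta * m_mu^2 / (2N)).
sigma = rng.choice([-1, 1], size=N)
print(hopfield_weight(sigma), rbm_marginal_weight(sigma))
```

Since the equality holds weight by weight, the two models define the same Gibbs measure over the visible spins, which is the content of the equivalence reviewed in the paper.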

Keywords: Boltzmann machine; Hopfield model; statistical mechanics of disordered systems.


Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
Schematic representation of the equivalence between a two-layer HRBM (left) and a Hopfield network (right). Note that the size of the visible layer (here N=6) and of the hidden layer (here P=3) in the former correspond, respectively, to the size and to the number of stored patterns in the latter.
Figure 2
Phase diagram of a generalized RBM for varying pattern, hidden and visible unit priors as found in [12]. In all plots, the yellow region represents the ergodic phase, the red region represents the spin-glass phase, and the blue region represents the retrieval phase. First row: the visible units are Boolean and Ωz=1; in this case the retrieval region approaches the line α=0, T∈[0,Ωz] as δ→0. Second row: the visible units are Boolean and δ=1; the retrieval region approaches the line T=0, α∈[0,αc(δ)] as Ωz→0. Third row: δ=1, Ωz=1, and the soft visible units are regularized with a spherical constraint; as Ωσ→0 the retrieval region approaches low load values.
Figure 3
Examples of typical graphs obtained for different values of c and α; in all cases the size is N=4000. Left: the under-percolated regime (α=0.4, c=1). Middle: the percolation threshold (α=c=1). Right: the over-percolated regime (α=0.1, c=5). Note that isolated nodes are not depicted.
Figure 4
The phase diagram is depicted for different choices of t, namely, from left to right, t=0, 1, 1000. Note that, as t grows, the retrieval region (blue) and the ergodic region (yellow) become progressively larger, while the spin-glass region (red) shrinks and eventually collapses as t→∞. A change in the concavity of the critical line separating the ergodic region and the spin-glass region is also observed.

References

    1. Amit D.J. Modeling Brain Function: The World of Attractor Neural Networks. Cambridge University Press; Cambridge, UK: 1992.
    2. Coolen A.C.C., Kühn R., Sollich P. Theory of Neural Information Processing Systems. Oxford University Press; Oxford, UK: 2005.
    3. Hebb D.O. The Organization of Behavior: A Neuropsychological Theory. Lawrence Erlbaum; Mahwah, NJ, USA: 2002.
    4. Decelle A., Furtlehner C. Restricted Boltzmann Machine: Recent advances and mean-field theory. arXiv. 2020. arXiv:2011.11307.
    5. Ackley D.H., Hinton G.E., Sejnowski T.J. A learning algorithm for Boltzmann machines. Cogn. Sci. 1985;9:147–169. doi: 10.1207/s15516709cog0901_7.
