Sci Rep. 2017 Apr 19;7:45672.
doi: 10.1038/srep45672.

Quantum Enhanced Inference in Markov Logic Networks


Peter Wittek et al. Sci Rep. 2017.

Abstract

Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited both at the first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages and theoretical limitations, and assess their appeal for implementation. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.
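As context for the classical baseline mentioned in the abstract (grounding a first-order template into a Markov network and running MCMC Gibbs sampling), the following minimal Python sketch grounds a toy MLN and estimates ground-atom marginals with a single-site Gibbs sampler. The predicates, formulas, and weights are illustrative assumptions (the familiar smokers-and-friends example), not the paper's model or code.

# Minimal sketch, assuming a standard MLN with clausal ground formulas.
import math
import random

# Hypothetical domain and predicates: Smokes(x), Cancer(x), Friends(x, y).
domain = ["A", "B"]
atoms = [f"Smokes({x})" for x in domain] + \
        [f"Cancer({x})" for x in domain] + \
        [f"Friends({x},{y})" for x in domain for y in domain]
index = {a: i for i, a in enumerate(atoms)}

# Ground formulas as (weight, clause); a clause is a list of
# (atom_index, is_positive) literals and is satisfied if any literal holds.
# Formula 1: Smokes(x) => Cancer(x), weight w1
# Formula 2: Friends(x,y) ^ Smokes(x) => Smokes(y), weight w2
w1, w2 = 1.5, 1.1
ground_clauses = []
for x in domain:
    ground_clauses.append((w1, [(index[f"Smokes({x})"], False),
                                (index[f"Cancer({x})"], True)]))
for x in domain:
    for y in domain:
        ground_clauses.append((w2, [(index[f"Friends({x},{y})"], False),
                                    (index[f"Smokes({x})"], False),
                                    (index[f"Smokes({y})"], True)]))

def weight_sum(state):
    """Sum of weights of satisfied ground clauses (the MLN log-potential)."""
    return sum(w for w, clause in ground_clauses
               if any(state[i] == pos for i, pos in clause))

def gibbs_marginals(n_samples=10000, burn_in=1000, seed=0):
    """Single-site Gibbs sampler: resample each atom from its conditional."""
    rng = random.Random(seed)
    state = [rng.random() < 0.5 for _ in atoms]
    counts = [0] * len(atoms)
    for step in range(burn_in + n_samples):
        for i in range(len(atoms)):
            state[i] = True
            log_p1 = weight_sum(state)
            state[i] = False
            log_p0 = weight_sum(state)
            p_true = 1.0 / (1.0 + math.exp(log_p0 - log_p1))
            state[i] = rng.random() < p_true
        if step >= burn_in:
            for i, v in enumerate(state):
                counts[i] += v
    return {a: counts[i] / n_samples for a, i in index.items()}

if __name__ == "__main__":
    for atom, p in gibbs_marginals().items():
        print(f"P({atom}) ~ {p:.3f}")

The sampler mixes slowly on strongly coupled networks, which is the bottleneck the quantum state-preparation and measurement schemes discussed in the paper aim to address.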


Conflict of interest statement

The authors declare no competing financial interests.

Figures

Figure 1. An example of a first-order knowledge base, a matching MLN, and the corresponding concepts of a thermal state and a local Hamiltonian.
The knowledge base has only two formulas, and the variables range over a finite domain of two elements, {A, B}. Grounding out all formulas in all possible ways, we obtain the MLN of maximal size (i.e., lifted inference is not used). The maximum of the absolute values of the weights w1 and w2 defines the inverse temperature β in the thermal state. Since all ground atoms are binary valued, the local space is ℂ², and thus the thermal state e^(−βH)/Z acts on (ℂ²)^⊗n, where n is the total number of nodes.
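For reference, a sketch of the correspondence the caption describes, written under the standard MLN conventions; the exact normalization of β and H used in the paper is an assumption here.

% Assumed standard MLN form; the paper's normalization may differ.
\begin{align*}
  P(X = x) &= \frac{1}{Z}\exp\!\Big(\sum_i w_i\, n_i(x)\Big)
  && \text{$n_i(x)$: number of true groundings of formula $F_i$ in $x$}\\
  \rho &= \frac{e^{-\beta H}}{\operatorname{Tr} e^{-\beta H}},
  \quad H = -\frac{1}{\beta}\sum_i w_i\, \hat{n}_i,
  \quad \beta = \max_i |w_i|
  && \text{the same distribution viewed as a thermal state of a diagonal $H$}
\end{align*}
% Each binary ground atom contributes a local space $\mathbb{C}^2$, so $\rho$
% acts on $(\mathbb{C}^2)^{\otimes n}$ for $n$ ground atoms.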

