Nat Commun. 2024 Jul 6;15(1):5676. doi: 10.1038/s41467-024-49877-8.

Shadows of quantum machine learning


Sofiene Jerbi et al. Nat Commun.

Abstract

Quantum machine learning is often highlighted as one of the most promising practical applications for which quantum computers could provide a computational advantage. However, a major obstacle to the widespread use of quantum machine learning models in practice is that these models, even once trained, still require access to a quantum computer to be evaluated on new data. To solve this issue, we introduce a class of quantum models where quantum resources are required only during training, while the deployment of the trained model is classical. Specifically, the training phase of our models ends with the generation of a 'shadow model', from which classical deployment becomes possible. We prove that: (i) this class of models is universal for classically deployed quantum machine learning; (ii) it does have restricted learning capacities compared to 'fully quantum' models, but nonetheless (iii) it achieves a provable learning advantage over fully classical learners, contingent on widely believed assumptions in complexity theory. These results provide compelling evidence that quantum machine learning can confer learning advantages across a substantially broader range of scenarios, where quantum computers are employed exclusively during the training phase. By enabling classical deployment, our approach facilitates the implementation of quantum machine learning models in various practical contexts.


Conflict of interest statement

The authors declare no competing interests.

Figures

Fig. 1
Fig. 1. Quantum and shadow models.
(left) Conventional quantum models can be expressed as inner products between a data-encoding quantum state ρ(x) and a parametrized observable O(θ). The resulting linear model fθ(x)=Tr[ρ(x)O(θ)] naturally corresponds to a quantum computation, depicted here. (middle) We define flipped models fθ(x)=Tr[ρ(θ)O(x)] as quantum linear models where the roles of the quantum state ρ(θ) and the observable O(x) are flipped compared to conventional models. (right) Flipped models are associated with natural shadow models: one can use techniques from shadow tomography to construct a classical representation ρ^(θ) of the parametrized state ρ(θ) (during the shadowing phase), such that, for encoding observables O(x) that are classically representable (e.g., linear combinations of Pauli observables), ρ^(θ) can be used by a classical algorithm to evaluate the model fθ(x) on new input data (during the evaluation phase). More generally, a shadow model is defined by (i) a shadowing phase, where a (bit-string) advice ω(θ) is generated by the evaluation of multiple quantum circuits W1(θ), …, WM(θ), and (ii) an evaluation phase, where this advice is used by a classical algorithm A, along with new input data x, to evaluate their labels f~θ(x). In the Section “General shadow models”, we show that under this general definition, all shadow models are shadows of flipped models.
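The flipped-model idea in this caption can be illustrated with a toy single-qubit example (a sketch under our own assumptions, not the paper's construction, and with exact linear algebra standing in for hardware measurements): because fθ(x) = Tr[ρ(θ)O(x)] is linear in O(x), estimating the Pauli expectations of ρ(θ) once suffices to evaluate the model classically for any input x whose encoding observable is a known Pauli combination.

```python
import numpy as np

# Toy illustration of a 1-qubit "flipped model" f_theta(x) = Tr[rho(theta) O(x)],
# where O(x) is a classically known linear combination of Pauli observables.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rho(theta):
    """Parametrized state |psi(theta)><psi(theta)| with |psi> = Ry(theta)|0>."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    return np.outer(psi, psi.conj())

def O(x):
    """Data-encoding observable: a classically representable Pauli combination."""
    return np.cos(x) * Z + np.sin(x) * X

# Shadowing phase: estimate the Pauli expectations of rho(theta) once.
# (On hardware this step would use shadow tomography instead of exact traces.)
theta = 0.7
advice = {"Z": np.trace(rho(theta) @ Z).real,
          "X": np.trace(rho(theta) @ X).real}

# Evaluation phase: purely classical, for any new input x.
def f(x):
    return np.cos(x) * advice["Z"] + np.sin(x) * advice["X"]

x = 1.3
exact = np.trace(rho(theta) @ O(x)).real
assert abs(f(x) - exact) < 1e-12  # classical evaluation matches the quantum model
```

Linearity is what makes the classical evaluation phase work: the advice ω(θ) here is just the pair of Pauli expectations, and any O(x) in their span can be handled without further quantum access.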
Fig. 2
Fig. 2. Separations between classical, shadow, and quantum models.
Under the assumption that the discrete cube root (DCR) cannot be computed classically in polynomial time, we have a separation between shadow models (captured by the class BPP/qgenpoly) and classical models (in BPP). Under the assumption that there exist functions that can be computed in quantum polynomial time but not in classical polynomial time with the help of advice (i.e., BQP ⊄ P/poly), we have a separation between quantum models (universal for BQP) and shadow models (BPP/qgenpoly). A candidate function for this separation is the discrete logarithm (DLP).
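The DCR assumption can be made concrete with a toy example (illustrative small primes, chosen by us; real instances use cryptographic sizes): cubing modulo N = pq is easy, but inverting the cube map is believed to require the factorization of N classically, while a quantum computer could recover it efficiently via factoring.

```python
# Discrete cube root (DCR): computing y = x^3 mod N is easy, but recovering
# x from y is believed classically hard without the factorization of N.
p, q = 1019, 1013          # toy primes with p-1 and q-1 coprime to 3
N = p * q
phi = (p - 1) * (q - 1)

# Knowing the factorization gives the inverse exponent d with 3*d = 1 mod phi,
# which makes cube-root extraction trivial.
d = pow(3, -1, phi)        # modular inverse (Python 3.8+)

x = 123456 % N
y = pow(x, 3, N)           # easy direction: cubing
assert pow(y, d, N) == x   # easy only with d, i.e., with the factorization
```

The separation argument uses exactly this asymmetry: the advice generated during the quantum training phase plays the role of the trapdoor information, which no polynomial-time classical learner is believed to be able to produce on its own.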
