Sci Rep. 2025 Sep 29;15(1):33654. doi: 10.1038/s41598-025-17769-6.

A comparative analysis and noise robustness evaluation in quantum neural networks

Tasnim Ahmed et al.

Abstract

In current noisy intermediate-scale quantum (NISQ) devices, hybrid quantum neural networks (HQNNs) offer a promising solution, combining the strengths of classical machine learning with quantum computing capabilities. However, the performance of these networks can be significantly affected by the quantum noise inherent in NISQ devices. In this paper, we conduct an extensive comparative analysis of various HQNN algorithms, namely the Quantum Convolutional Neural Network (QCNN), the Quanvolutional Neural Network (QuanNN), and Quantum Transfer Learning (QTL), for image classification tasks. We evaluate the performance of each algorithm across quantum circuits with different entangling structures, variations in layer count, and optimal placement in the architecture. Subsequently, we select the highest-performing architectures and assess their robustness to noise by introducing quantum gate noise through Phase Flip, Bit Flip, Phase Damping, Amplitude Damping, and Depolarization channels. Our results reveal that the top-performing models exhibit varying resilience to different noise channels. However, in most scenarios, the QuanNN demonstrates greater robustness across various quantum noise channels, consistently outperforming other models. This highlights the importance of tailoring model selection to specific noise environments in NISQ devices.
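As an illustrative, non-authoritative sketch of how the five gate-noise channels named above can be injected into a small variational circuit, the snippet below assumes a PennyLane-style mixed-state simulator; the circuit size, layer template, and noise placement are assumptions for illustration, not the authors' code.

```python
# Minimal sketch (not the paper's implementation): applying the five noise
# channels studied in the paper to a small variational circuit, assuming a
# PennyLane density-matrix simulator. All sizes and placements are illustrative.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.mixed", wires=n_qubits)  # mixed-state (noisy) simulator

# Noise channels evaluated in the paper, mapped to PennyLane operations.
NOISE_CHANNELS = {
    "bit_flip": qml.BitFlip,
    "phase_flip": qml.PhaseFlip,
    "phase_damping": qml.PhaseDamping,
    "amplitude_damping": qml.AmplitudeDamping,
    "depolarizing": qml.DepolarizingChannel,
}

@qml.qnode(dev)
def noisy_circuit(weights, noise_name, p):
    """Variational circuit with a noise channel applied to every qubit."""
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    channel = NOISE_CHANNELS[noise_name]
    for w in range(n_qubits):
        channel(p, wires=w)  # gate noise with probability/strength p
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

# Example: evaluate the circuit under Bit Flip noise with p = 0.1.
shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.random(shape)
print(noisy_circuit(weights, "bit_flip", 0.1))
```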


Conflict of interest statement

Declarations. Competing interests: The authors declare no competing interests.

Figures

Fig. 1
Motivational case study. (a) Two variants of HQNN (QCNN and QuanNN) yield different performance with the same underlying architecture of quantum layers; (b) the differing effects of Bit Flip noise on the performance of QCNN and QuanNN highlight the unique noise sensitivities of different HQNNs.
Fig. 2
Architecture overview of the selected HQNN algorithms. Each model uses a classical fully connected layer to transform quantum circuit measurements into classification probabilities. In the QCNN, classical convolutional and pooling layers downsize the image to match the qubit count of the circuit.
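As a hedged illustration of the hybrid pattern described in Fig. 2 (quantum circuit measurements followed by a classical fully connected layer that yields class probabilities), the sketch below uses PennyLane's TorchLayer; the qubit count, embedding, templates, and class count are illustrative assumptions rather than the paper's architecture.

```python
# Minimal sketch (an assumption, not the paper's code): quantum expectation
# values fed to a classical fully connected layer producing class probabilities.
import torch
import pennylane as qml

n_qubits, n_classes = 4, 10
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(inputs, weights):
    qml.AngleEmbedding(inputs, wires=range(n_qubits))            # encode features
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))  # trainable layers
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

weight_shapes = {"weights": (2, n_qubits, 3)}  # two variational layers
quantum_layer = qml.qnn.TorchLayer(circuit, weight_shapes)

# Classical head: fully connected layer mapping measurements to class scores.
model = torch.nn.Sequential(
    quantum_layer,
    torch.nn.Linear(n_qubits, n_classes),
    torch.nn.Softmax(dim=-1),
)

x = torch.rand(8, n_qubits)    # a batch of 8 pre-processed feature vectors
print(model(x).shape)          # torch.Size([8, 10])
```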
Fig. 3
Our methodology. A comprehensive comparative analysis of three HQNN variants is performed with different configurations of quantum layers, differing mainly in the degree of entanglement, the rotation gates, and the number of layers (depth of the quantum layers). The odd and even depths in the strongly entangling configuration denote how the layer is repeated as the number of layers is increased. Based on the obtained results, the best-performing models with their best configurations are shortlisted and then trained under different types of quantum errors/noise across a wide range of probabilities for each noise type. A comparative analysis of the ideal and noisy scenarios is then performed to test the noise robustness of the different HQNN variants. The evaluation metrics for all experiments are training and validation accuracy.
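The layer-configuration sweep described in Fig. 3 can be prototyped as in the following minimal sketch, assuming PennyLane's entangling-layer templates; the specific templates and depths are chosen for illustration and are not taken from the paper.

```python
# Minimal sketch of a layer-configuration sweep over entanglement structure and
# depth, assuming PennyLane templates. Templates and depths are illustrative.
import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

def make_circuit(template):
    """Build a QNode for a given quantum-layer template."""
    @qml.qnode(dev)
    def circuit(weights):
        template(weights, wires=range(n_qubits))
        return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]
    return circuit

# Sweep over templates (degree of entanglement) and depths (number of layers).
for template in (qml.BasicEntanglerLayers, qml.StronglyEntanglingLayers):
    for depth in (1, 2, 3, 4):
        shape = template.shape(n_layers=depth, n_wires=n_qubits)
        weights = np.random.random(shape)
        out = make_circuit(template)(weights)
        print(template.__name__, "depth", depth, "->", out)
```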
Fig. 4
Comparative analysis of QCNN, QuanNN, and QTL in noise-free settings. The HQNN models are tested with different configurations of quantum layers. QuanNN is the best HQNN variant, both in raw performance and in robustness to variations in the quantum layers. The second-best model is QCNN, which is sensitive to the degree of entanglement in the underlying quantum layers (the more entanglement, the better). QTL consistently performs worst among the three variants, regardless of the quantum layer configuration.
Fig. 5
Comparison of the 8-qubit QCNN model's performance in noise-free settings and under the Amplitude Damping, Bit Flip, Depolarization, Phase Damping, and Phase Flip channels with probabilities of 0.1, 0.5, and 1.0.
Fig. 6
Comparison of QCNN performance in noise-free settings and under the Amplitude Damping channel with different probabilities.
Fig. 7
Comparison of QCNN performance in noise-free settings and under the Bit Flip channel with different probabilities.
Fig. 8
Comparison of QCNN performance in noise-free settings and under the Depolarization channel with different probabilities.
Fig. 9
Comparison of QCNN performance in noise-free settings and under the Phase Damping channel with different probabilities.
Fig. 10
Comparison of QCNN performance in noise-free settings and under the Phase Flip channel with different probabilities.
Fig. 11
Comparison of QuanNN performance in noise-free settings and under the Amplitude Damping channel with different probabilities.
Fig. 12
Comparison of QuanNN performance in noise-free settings and under the Bit Flip channel with different probabilities.
Fig. 13
Comparison of QuanNN performance in noise-free settings and under the Depolarization channel with different probabilities.
Fig. 14
Comparison of QuanNN performance in noise-free settings and under the Phase Damping channel with different probabilities.
Fig. 15
Comparison of QuanNN performance in noise-free settings and under the Phase Flip channel with different probabilities.
