Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities
- PMID: 36991750
- PMCID: PMC10053242
- DOI: 10.3390/s23063037
Abstract
Spiking neural networks (SNNs) are a topic of rapidly growing interest. They resemble biological neural networks in the brain more closely than their second-generation counterparts, artificial neural networks (ANNs), and have the potential to be more energy efficient than ANNs on event-driven neuromorphic hardware. This could drastically reduce the maintenance costs of neural network models, as their energy consumption would be far lower than that of the regular deep learning models hosted in the cloud today. However, such hardware is not yet widely available. On standard computer architectures, built mainly around central processing units (CPUs) and graphics processing units (GPUs), ANNs have the upper hand in execution speed because they use simpler models of neurons and of the connections between them. ANNs also generally lead in terms of learning algorithms: SNNs do not yet reach the performance of their second-generation counterparts on typical machine learning benchmark tasks, such as classification. In this paper, we review existing learning algorithms for spiking neural networks, divide them into categories by type, and assess their computational complexity.
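The abstract contrasts the event-driven dynamics of spiking neurons with the simpler neuron models used in ANNs. As an illustration only (not code from the paper), the leaky integrate-and-fire (LIF) model, the most widely used spiking neuron, can be sketched as below; all parameter names and values are illustrative assumptions:

```python
import numpy as np

def lif_simulate(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_threshold=1.0, r=1.0):
    """Simulate one leaky integrate-and-fire neuron with Euler integration.

    Membrane dynamics: tau * dV/dt = -(V - v_rest) + R * I(t).
    A spike is emitted when V crosses v_threshold; V then resets to v_reset.
    """
    v = v_rest
    spikes, trace = [], []
    for i_t in input_current:
        # One Euler step of the membrane equation
        v += (dt / tau) * (-(v - v_rest) + r * i_t)
        if v >= v_threshold:
            spikes.append(1)   # binary spike event, not a continuous activation
            v = v_reset
        else:
            spikes.append(0)
        trace.append(v)
    return np.array(spikes), np.array(trace)

# A constant supra-threshold input drives regular spiking
spikes, trace = lif_simulate(np.full(200, 1.5))
print(spikes.sum())  # number of emitted spikes over 200 time steps
```

The key difference from an ANN unit is visible here: the neuron carries state (the membrane potential) across time steps and communicates through discrete spike events, which is what makes event-driven neuromorphic execution possible but also complicates gradient-based learning.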
Keywords: computational complexity; hardware; learning algorithms; spiking neural networks.
Conflict of interest statement
The authors declare no conflict of interest.
Similar articles
- Rethinking the performance comparison between SNNS and ANNS. Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. PMID: 31586857
- Advancements in Algorithms and Neuromorphic Hardware for Spiking Neural Networks. Neural Comput. 2022 May 19;34(6):1289-1328. doi: 10.1162/neco_a_01499. PMID: 35534005. Review.
- Deep learning in spiking neural networks. Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. PMID: 30682710. Review.
- Design Space Exploration of Hardware Spiking Neurons for Embedded Artificial Intelligence. Neural Netw. 2020 Jan;121:366-386. doi: 10.1016/j.neunet.2019.09.024. PMID: 31593842
- Event-driven implementation of deep spiking convolutional neural networks for supervised classification using the SpiNNaker neuromorphic platform. Neural Netw. 2020 Jan;121:319-328. doi: 10.1016/j.neunet.2019.09.008. PMID: 31590013
Cited by
- Developing a brain inspired multilobar neural networks architecture for rapidly and accurately estimating concrete compressive strength. Sci Rep. 2025 Jan 15;15(1):1989. doi: 10.1038/s41598-024-84325-z. PMID: 39814764. Free PMC article.
- The computational power of the human brain. Front Cell Neurosci. 2023 Aug 7;17:1220030. doi: 10.3389/fncel.2023.1220030. PMID: 37608987. Free PMC article. Review.
- Brain-inspired biomimetic robot control: a review. Front Neurorobot. 2024 Aug 19;18:1395617. doi: 10.3389/fnbot.2024.1395617. PMID: 39224906. Free PMC article. Review.
- A Novel Robotic Controller Using Neural Engineering Framework-Based Spiking Neural Networks. Sensors (Basel). 2024 Jan 12;24(2):491. doi: 10.3390/s24020491. PMID: 38257584. Free PMC article.
- Moiré synaptic transistor with room-temperature neuromorphic functionality. Nature. 2023 Dec;624(7992):551-556. doi: 10.1038/s41586-023-06791-1. PMID: 38123805