Roadmap on emerging hardware and technology for machine learning
- PMID: 32679577
- PMCID: PMC11411818
- DOI: 10.1088/1361-6528/aba70f
Abstract
Recent progress in artificial intelligence is largely attributed to the rapid development of machine learning, especially in algorithms and neural network models. However, it is the performance of the hardware, and in particular the energy efficiency of the computing system, that sets the fundamental limit on the capability of machine learning. Data-centric computing requires a revolution in hardware systems, since traditional digital computers based on transistors and the von Neumann architecture were not purposely designed for neuromorphic computing. A hardware platform based on emerging devices and new architectures is the hope for future computing with dramatically improved throughput and energy efficiency. Building such a system, nevertheless, faces a number of challenges, including materials selection, device optimization, circuit fabrication and system integration, to name a few. The aim of this Roadmap is to present a snapshot of emerging hardware technologies that are potentially beneficial for machine learning, providing Nanotechnology readers with a perspective on the challenges and opportunities in this burgeoning field.