Review

Nat Commun. 2024 Mar 6;15(1):2056. doi: 10.1038/s41467-024-45187-1.

Emerging opportunities and challenges for the future of reservoir computing

Min Yan et al.

Erratum in
Abstract

Reservoir computing originated in the early 2000s; its core idea is to use dynamical systems as reservoirs (nonlinear generalizations of standard bases) to adaptively learn spatiotemporal features and hidden patterns in complex time series. After pioneering works showed its potential for higher-precision prediction in chaotic systems, reservoir computing attracted great interest and many follow-ups in the nonlinear dynamics and complex systems community. Substantially more research is needed, however, to unlock its full capabilities as a fast, lightweight, and significantly more interpretable learning framework for temporal dynamical systems. This Perspective elucidates the parallel progress of mathematical theory, algorithm design, and experimental realizations of reservoir computing; identifies emerging opportunities as well as existing challenges for large-scale industrial adoption; and offers a few ideas and viewpoints on how some of those challenges might be resolved through joint efforts by academic and industrial researchers across multiple disciplines.
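The core idea described above can be made concrete with a minimal echo state network, one common realization of reservoir computing: a fixed random dynamical system expands the input into nonlinear features, and only a linear readout is trained. The sketch below is illustrative only; the reservoir size, scalings, and toy input series are assumptions, not values from this article.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 200
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))        # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))       # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state property)

u = np.sin(0.3 * np.arange(500))                 # toy time series (illustrative input)
x = np.zeros(n_res)
states = []
for u_t in u[:-1]:                               # drive the reservoir with the input
    x = np.tanh(W_in[:, 0] * u_t + W @ x)
    states.append(x.copy())

washout = 50                                     # discard the initial transient
X = np.array(states[washout:])
y = u[washout + 1:]                              # one-step-ahead target

# Only the linear readout is trained, here by closed-form ridge regression.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
rmse = np.sqrt(np.mean((X @ W_out - y) ** 2))    # training error of the readout
```

Because the reservoir itself is never trained, fitting reduces to one linear solve, which is what makes the framework fast and lightweight relative to backpropagation-trained recurrent networks.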


Conflict of interest statement

The authors declare no competing interests.

Figures

Fig. 1
Fig. 1. Selected research milestones of RC, encompassing system and algorithm designs and representing theory, experimental realizations, and applications.
For each category, a selection of representative publications is highlighted.
Fig. 2
Fig. 2. Example applications of RC.
Flow diagrams showing how RC is applied in four types of application: signal classification, nonlinear time series prediction, dynamical control, and PDE computation. a RC for spoken-digit recognition, where the target is a vector corresponding to the digits 0–9. b RC for time series prediction, with the Mackey-Glass equations as an example. In method 1, with off-line training, the training sequence starts at the first point (black point) while the target sequence starts at the second (orange point). In method 2, with on-line retraining, training and testing alternate. c RC acts as the prediction optimizer in the general model predictive control (MPC) framework. Top: the MPC diagram. Bottom: how RC works in the MPC system. d RC for PDE computation, with the Kuramoto-Sivashinsky (KS) equations as an example. The hidden layer consists of multiple parallel reservoirs, each handling part of the input data, and a nonlinear transformation is typically inserted before training the parameters of the readout layer.
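The off-line training setup of Fig. 2b (method 1) can be sketched as follows: the input sequence starts at the first point, the target sequence at the second, and after training the model forecasts closed-loop by feeding its own output back as the next input. The reservoir sizes, scalings, and the toy series standing in for the Mackey-Glass dynamics are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res = 300
W_in = rng.uniform(-1, 1, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the reservoir stable

u = np.sin(0.2 * np.arange(1000))   # toy series standing in for Mackey-Glass
T = 800                             # length of the training segment
x = np.zeros(n_res)
states = []
for t in range(T):                  # training input starts at the first point
    x = np.tanh(W_in[:, 0] * u[t] + W @ x)
    states.append(x.copy())
X = np.array(states[100:])          # washout: drop the initial transient
y = u[101:T + 1]                    # target starts at the second point
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

# Closed-loop forecast: the readout output becomes the next input.
preds = []
v = u[T]
for _ in range(50):
    x = np.tanh(W_in[:, 0] * v + W @ x)
    v = x @ W_out
    preds.append(v)
err = np.sqrt(np.mean((np.array(preds) - u[T + 1:T + 51]) ** 2))
```

Method 2 of the figure differs only in that training and testing segments alternate, so the readout is periodically re-fitted on fresh data instead of being trained once.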
Fig. 3
Fig. 3. Trends in RC performance in typical application scenarios.
Four representative scenarios: a signal classification tasks such as spoken-digit recognition, nonlinear channel equalization, and optical channel equalization; b time series prediction, such as predicting the dynamics of the Mackey-Glass equations, Lorenz systems, and the Santa Fe chaotic time series; c control tasks; and d PDE computation. Thick, up-pointing arrows in the panels denote error values that are not directly comparable with other works.
Fig. 4
Fig. 4. Application domains in which RC potentially can play important roles.
Each domain corresponds to three specific example application scenarios. The six domains are 6G, Next Generation (NG) Optical Networks, Internet of Things (IoT), Green Data Center, Intelligent Robots, and AI for Science and Digital Twins.

References

    1. Graves, A., Mohamed, A. R. & Hinton, G. Speech recognition with deep recurrent neural networks. In IEEE International Conference on Acoustics, Speech and Signal Processing, 6645–6649 (IEEE, 2013).
    2. LeCun, Y., Bengio, Y. & Hinton, G. E. Deep learning. Nature 521, 436–444 (2015).
    3. He, K., Zhang, X., Ren, S. & Sun, J. Deep residual learning for image recognition. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 770–778 (IEEE, 2016).
    4. Silver, D. et al. Mastering the game of Go with deep neural networks and tree search. Nature 529, 484–489 (2016).
    5. Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. Commun. ACM 60, 84–90 (2017).