Transfer-RLS method and transfer-FORCE learning for simple and fast training of reservoir computing models
- PMID: 34304003
- DOI: 10.1016/j.neunet.2021.06.031
Abstract
Reservoir computing is a machine learning framework derived from a special type of recurrent neural network. Following recent advances in physical reservoir computing, some reservoir computing devices are regarded as promising energy-efficient machine learning hardware for real-time information processing. To realize efficient online learning with low-power reservoir computing devices, it is beneficial to develop fast-converging learning methods with simpler operations. This study proposes a training method that lies between the recursive least squares (RLS) method and the least mean squares (LMS) method, the standard online learning methods for reservoir computing models. The RLS method converges fast but requires updates of a huge matrix called the gain matrix, whereas the LMS method does not use a gain matrix but converges very slowly. The proposed method, called the transfer-RLS method, does not require updates of the gain matrix in the main-training phase because the matrix is updated in advance (i.e., in a pre-training phase). As a result, the transfer-RLS method works with simpler operations than the original RLS method without sacrificing much convergence speed. We show numerically and analytically that the transfer-RLS method converges much faster than the LMS method. Furthermore, we show that a modified version of the transfer-RLS method (called transfer-FORCE learning) can be applied to first-order reduced and controlled error (FORCE) learning for a reservoir computing model with a closed loop, which is challenging to train.
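To make the contrast concrete, the following Python/NumPy sketch illustrates the three online readout update rules the abstract compares: RLS, LMS, and the main phase of transfer-RLS with a frozen gain matrix. It is a minimal sketch under stated assumptions: the variable names, the forgetting factor `lam`, the learning rate `eta`, and the exact form of the main-phase transfer-RLS update are inferred from the abstract, not taken from the paper's published equations.

```python
import numpy as np

# Minimal sketch (not the paper's code) of the three online readout updates
# contrasted in the abstract. r is the reservoir state (length N), d the
# scalar teacher signal, w the readout weights, P the N x N gain matrix.

def rls_step(w, P, r, d, lam=1.0):
    """Standard RLS: fast convergence, but every step updates the N x N
    gain matrix P, costing O(N^2) operations."""
    Pr = P @ r
    k = Pr / (lam + r @ Pr)            # gain vector
    e = d - w @ r                      # instantaneous output error
    w = w + e * k                      # weight update
    P = (P - np.outer(k, Pr)) / lam    # gain-matrix update (the costly part)
    return w, P

def lms_step(w, r, d, eta=1e-3):
    """LMS: no gain matrix, only O(N) work per step, but slow convergence."""
    e = d - w @ r
    return w + eta * e * r

def transfer_rls_step(w, P_fixed, r, d):
    """Main-phase transfer-RLS, as described in the abstract: the gain
    matrix was adapted during pre-training and is now frozen, so each step
    reduces to a matrix-vector product plus a weight update."""
    e = d - w @ r
    return w + e * (P_fixed @ r)

# Usage sketch with synthetic placeholder data (random states and targets).
N = 100
rng = np.random.default_rng(0)
w, P = np.zeros(N), np.eye(N)
for _ in range(500):                    # pre-training: ordinary RLS adapts P
    r, d = rng.standard_normal(N), rng.standard_normal()
    w, P = rls_step(w, P, r, d)
for _ in range(5000):                   # main training: P is never updated
    r, d = rng.standard_normal(N), rng.standard_normal()
    w = transfer_rls_step(w, P, r, d)
```

The design point is the cost asymmetry: only `rls_step` touches the N x N matrix, so freezing P after pre-training keeps the per-step cost close to LMS while, per the abstract's claims, retaining much of RLS's convergence speed.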
Keywords: FORCE learning; Online supervised learning; Recurrent neural networks; Recursive least squares method; Reservoir computing.
Copyright © 2021 The Authors. Published by Elsevier Ltd. All rights reserved.
Conflict of interest statement
Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Similar articles
- Recent advances in physical reservoir computing: A review. Neural Netw. 2019 Jul;115:100-123. doi: 10.1016/j.neunet.2019.03.005. Epub 2019 Mar 20. PMID: 30981085. Review.
- Efficient Approach for RLS Type Learning in TSK Neural Fuzzy Systems. IEEE Trans Cybern. 2017 Sep;47(9):2343-2352. doi: 10.1109/TCYB.2016.2638861. Epub 2016 Dec 29. PMID: 28055939.
- A Reservoir Computing Model of Reward-Modulated Motor Learning and Automaticity. Neural Comput. 2019 Jul;31(7):1430-1461. doi: 10.1162/neco_a_01198. Epub 2019 May 21. PMID: 31113300.
- A pruning method for the recursive least squared algorithm. Neural Netw. 2001 Mar;14(2):147-74. doi: 10.1016/s0893-6080(00)00093-9. PMID: 11316231.
- A Survey of Stochastic Computing Neural Networks for Machine Learning Applications. IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):2809-2824. doi: 10.1109/TNNLS.2020.3009047. Epub 2021 Jul 6. PMID: 32755867. Review.
Cited by
- Tipping Point Detection Using Reservoir Computing. Research (Wash D C). 2023 Jul 3;6:0174. doi: 10.34133/research.0174. eCollection 2023. PMID: 37404384. Free PMC article.
- Taming Prolonged Ionic Drift-Diffusion Dynamics for Brain-Inspired Computation. Adv Mater. 2025 Jan;37(3):e2407326. doi: 10.1002/adma.202407326. Epub 2024 Nov 27. PMID: 39600216. Free PMC article.