Continual Sequence Modeling With Predictive Coding
- PMID: 35686118
- PMCID: PMC9171436
- DOI: 10.3389/fnbot.2022.845955
Abstract
Recurrent neural networks (RNNs) have proven very successful at modeling sequential data such as language or motion. However, these successes rely on the backpropagation through time (BPTT) algorithm, batch training, and the assumption that all training data are available at the same time. In contrast, the field of developmental robotics aims at uncovering lifelong learning mechanisms that could allow embodied machines to learn and stabilize knowledge in continuously evolving environments. In this article, we investigate different RNN designs and learning methods, which we evaluate in a continual learning setting. The generative modeling task consists of learning to generate 20 continuous trajectories that are presented sequentially to the learning algorithms. Each method is evaluated by its average prediction error over the 20 trajectories after complete training. This study focuses on learning algorithms with low memory requirements that do not need to store past information to update their parameters. Our experiments identify two approaches especially well suited to this task: conceptors and predictive coding. We propose combining these two mechanisms into a new model, which we call PC-Conceptors, that outperforms the other methods presented in this study.
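For context on one of the two mechanisms the abstract highlights, below is a minimal sketch of how a conceptor is computed from the states of an echo-state reservoir driven by a single trajectory. This is illustrative only and is not the paper's PC-Conceptors model: the reservoir size, aperture value, and toy input signal are assumptions, and the standard conceptor formula C = R(R + α⁻²I)⁻¹ follows Jaeger's original formulation.

```python
import numpy as np

# Illustrative sketch, not the authors' implementation: compute a
# conceptor from reservoir states collected while driving an echo-state
# network with one toy trajectory. Sizes and the aperture are assumed.

rng = np.random.default_rng(0)

N = 100                      # reservoir size (assumed)
T = 500                      # time steps in one trajectory (assumed)
aperture = 10.0              # conceptor aperture alpha (assumed)

# Random echo-state reservoir driven by a 1-D trajectory u(t).
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights
W_in = rng.normal(0.0, 1.0, N)                  # input weights
u = np.sin(np.linspace(0, 8 * np.pi, T))        # toy continuous trajectory

x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Conceptor: C = R (R + alpha^-2 I)^-1, where R is the state correlation
# matrix. C softly projects reservoir dynamics onto the subspace excited
# by this trajectory, which is what lets several trajectories be stored
# with reduced interference in a continual-learning setting.
R = states.T @ states / T
C = R @ np.linalg.inv(R + aperture ** -2 * np.eye(N))
```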
Keywords: Reservoir Computing (RC); conceptors; continual learning; predictive coding; recurrent neural networks (RNN).
Copyright © 2022 Annabi, Pitti and Quoy.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.