Learning Simpler Language Models with the Differential State Framework
- PMID: 28957029
- DOI: 10.1162/neco_a_01017
Abstract
Learning useful information across long time lags is a critical and difficult problem for temporal neural models in tasks such as language modeling. Existing architectures that address the issue are often complex and costly to train. The differential state framework (DSF) is a simple and high-performing design that unifies previously introduced gated neural models. DSF models maintain longer-term memory by learning to interpolate between a fast-changing data-driven representation and a slowly changing, implicitly stable state. Within the DSF, a new architecture is presented, the delta-RNN. This model requires hardly any more parameters than a classical, simple recurrent network. In language modeling at the word and character levels, the delta-RNN outperforms popular complex architectures, such as the long short-term memory (LSTM) and the gated recurrent unit (GRU), and, when regularized, performs comparably to several state-of-the-art baselines. At the subword level, the delta-RNN's performance is comparable to that of complex gated architectures.
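As a concrete illustration of the interpolation mechanism the abstract describes, the sketch below implements a simplified gated state update in NumPy: a fast, data-driven proposal is mixed with the slowly changing previous state via a learned gate. The gate parameterization, the tanh nonlinearity, and all names are illustrative assumptions, not the paper's exact delta-RNN equations.

```python
# Minimal sketch of the DSF interpolation idea, under assumed (not the
# paper's exact) update equations. Requires only NumPy.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DeltaRNNCell:
    """Recurrent cell that interpolates between a fast data-driven
    proposal and the slowly changing previous hidden state."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        self.W = rng.uniform(-s, s, (hidden_size, input_size))   # input weights
        self.U = rng.uniform(-s, s, (hidden_size, hidden_size))  # recurrent weights
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size))  # gate weights (assumed form)
        self.br = np.zeros(hidden_size)                          # gate bias

    def step(self, x, h_prev):
        # Fast-changing, data-driven proposal for the new state.
        z = np.tanh(self.W @ x + self.U @ h_prev)
        # Data-dependent gate deciding how much of the old state to keep.
        r = sigmoid(self.Wr @ x + self.br)
        # Interpolate: r -> 1 preserves the slow state, r -> 0 overwrites it.
        return (1.0 - r) * z + r * h_prev

# Usage: run the cell over a short random input sequence.
cell = DeltaRNNCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x, h)
print(h.shape)  # (16,)
```

Note how the parameter count stays close to that of a simple recurrent network: beyond W and U, only the gate weights Wr and bias br are added, which is consistent with the abstract's claim of "hardly any more parameters."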
Similar articles
- Long short-term memory RNN for biomedical named entity recognition. BMC Bioinformatics. 2017 Oct 30;18(1):462. doi: 10.1186/s12859-017-1868-5. PMID: 29084508.
- Character gated recurrent neural networks for Arabic sentiment analysis. Sci Rep. 2022 Jun 13;12(1):9779. doi: 10.1038/s41598-022-13153-w. PMID: 35697814.
- Entity recognition from clinical texts via recurrent neural network. BMC Med Inform Decis Mak. 2017 Jul 5;17(Suppl 2):67. doi: 10.1186/s12911-017-0468-7. PMID: 28699566.
- Recurrent transform learning. Neural Netw. 2019 Oct;118:271-279. doi: 10.1016/j.neunet.2019.07.003. Epub 2019 Jul 15. PMID: 31326661.
- Recurrent Neural Networks (RNNs): Architectures, Training Tricks, and Introduction to Influential Research. In: Colliot O, editor. Machine Learning for Brain Disorders [Internet]. New York, NY: Humana; 2023 Jul 23. Chapter 4. PMID: 37988518. Review.
Cited by
- Forecasting Root-Zone Electrical Conductivity of Nutrient Solutions in Closed-Loop Soilless Cultures via a Recurrent Neural Network Using Environmental and Cultivation Information. Front Plant Sci. 2018 Jun 21;9:859. doi: 10.3389/fpls.2018.00859. PMID: 29977249.
- Five Breakthroughs: A First Approximation of Brain Evolution From Early Bilaterians to Humans. Front Neuroanat. 2021 Aug 17;15:693346. doi: 10.3389/fnana.2021.693346. PMID: 34489649.
- A Low-Delay Lightweight Recurrent Neural Network (LLRNN) for Rotating Machinery Fault Diagnosis. Sensors (Basel). 2019 Jul 14;19(14):3109. doi: 10.3390/s19143109. PMID: 31337108.
- Multi-Timescale Memory Dynamics Extend Task Repertoire in a Reinforcement Learning Network With Attention-Gated Memory. Front Comput Neurosci. 2018 Jul 12;12:50. doi: 10.3389/fncom.2018.00050. PMID: 30061819.