A critical review of RNN and LSTM variants in hydrological time series predictions
- PMID: 39324077
- PMCID: PMC11422155
- DOI: 10.1016/j.mex.2024.102946
Abstract
The rapid advancement of Artificial Intelligence (AI) and big data has gained significance in the water sector, particularly in hydrological time-series prediction. Recurrent Neural Networks (RNNs) and Long Short-Term Memory (LSTM) networks have become research focal points due to their effectiveness in modeling non-linear, time-variant hydrological systems. This review explores the different architectures of RNNs, LSTMs, and Gated Recurrent Units (GRUs) and their efficacy in predicting hydrological time-series data.
- RNNs are foundational but face limitations such as vanishing gradients, which impede their ability to model long-term dependencies. LSTMs and GRUs were developed to overcome these limitations: LSTMs use memory cells and gating mechanisms, while GRUs provide a more streamlined architecture with similar benefits.
- The integration of attention mechanisms and hybrid models that combine RNNs, LSTMs, and GRUs with other machine learning (ML) and deep learning (DL) techniques has improved prediction accuracy by capturing both temporal and spatial dependencies.
- Despite their effectiveness, practical implementations of these models in hydrological time-series prediction require extensive datasets and substantial computational resources. Future research should develop interpretable architectures, enhance data quality, incorporate domain knowledge, and utilize transfer learning to improve model generalization and scalability across diverse hydrological contexts.
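To make the gating idea concrete, the following is a minimal NumPy sketch of a single GRU step, the streamlined gated architecture mentioned above. The weight matrices are random placeholders for illustration only (not trained on hydrological data, and not from the reviewed studies); the update gate `z` decides how much of the previous hidden state to carry forward, which is what mitigates the vanishing-gradient problem of plain RNNs.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, W_z, U_z, W_r, U_r, W_h, U_h):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    z = sigmoid(W_z @ x_t + U_z @ h_prev)              # update gate
    r = sigmoid(W_r @ x_t + U_r @ h_prev)              # reset gate
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev))  # candidate state
    return (1 - z) * h_prev + z * h_tilde              # blended new state

# Toy run: input size 3 (e.g. rainfall, temperature, prior flow), hidden size 4.
# Weights are random -- purely illustrative, not a fitted model.
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
W_z, W_r, W_h = (rng.standard_normal((n_h, n_in)) for _ in range(3))
U_z, U_r, U_h = (rng.standard_normal((n_h, n_h)) for _ in range(3))

h = np.zeros(n_h)
for x in rng.standard_normal((5, n_in)):  # a short synthetic series
    h = gru_cell(x, h, W_z, U_z, W_r, U_r, W_h, U_h)
print(h.shape)  # (4,)
```

Because the new state is a convex combination of the previous state and a tanh-bounded candidate, every component of `h` stays in (-1, 1); an LSTM achieves a similar effect with a separate memory cell and three gates instead of two.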
Keywords: Artificial intelligence; Deep learning; Hydrological prediction; Long short-term memory; Recurrent neural networks; RNN and LSTM variants for time series prediction.
© 2024 The Author(s).
Conflict of interest statement
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.