Bi-directional long short term memory-gated recurrent unit model for Amharic next word prediction
- PMID: 35980997
- PMCID: PMC9387859
- DOI: 10.1371/journal.pone.0273156
Abstract
Next word prediction is useful for users, helping them write more accurately and quickly. It is especially important for the Amharic language, in which different characters are produced by pressing the same consonant together with different vowels, vowel combinations, and special keys. We therefore present a Bi-directional Long Short Term Memory-Gated Recurrent Unit (BLST-GRU) network model for predicting the next word in Amharic. We evaluate the proposed model on 63,300 Amharic sentences, where it achieves 78.6% accuracy. In addition, we compare the proposed model with state-of-the-art models such as LSTM, GRU, and BLSTM. The experimental results show that the proposed network model produces promising results.
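For illustration, the sketch below shows one way a stacked bidirectional LSTM + GRU next-word predictor of this kind could be assembled in Keras. It is a minimal sketch, not the authors' implementation: the vocabulary size, sequence length, embedding dimension, and layer widths are assumed values rather than those reported in the paper.

```python
# Minimal sketch of a bidirectional LSTM + GRU next-word predictor (Keras).
# All hyperparameters below are illustrative assumptions, not values from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # assumed Amharic vocabulary size
SEQ_LEN = 20         # assumed number of preceding words used as context
EMBED_DIM = 128      # assumed word-embedding dimension

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, EMBED_DIM, input_length=SEQ_LEN),
    layers.Bidirectional(layers.LSTM(128, return_sequences=True)),  # bidirectional LSTM over the context
    layers.GRU(128),                                                 # GRU layer stacked on top
    layers.Dense(VOCAB_SIZE, activation="softmax"),                  # probability distribution over the next word
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Toy usage: predict the most likely next-word id for a random integer-encoded context.
context = np.random.randint(0, VOCAB_SIZE, size=(1, SEQ_LEN))
next_word_id = int(np.argmax(model.predict(context, verbose=0), axis=-1)[0])
print(next_word_id)
```

In this sketch the bidirectional LSTM reads the context in both directions before the GRU summarizes it into a single state, mirroring the BLSTM-followed-by-GRU stacking the paper describes; the final softmax layer scores every vocabulary word as the candidate next word.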
Conflict of interest statement
The authors have declared that no competing interests exist.