Emotional Speech Recognition Method Based on Word Transcription
- PMID: 35271083
- PMCID: PMC8915129
- DOI: 10.3390/s22051937
Abstract
The emotional speech recognition method presented in this article was applied to recognize the emotions of students during online exams held in distance learning because of COVID-19. The purpose of the method is to recognize emotions in speech using a knowledge base of emotionally charged words stored as a code book. The method analyzes human speech for the presence of emotions. To assess its quality, an experiment was conducted on 420 audio recordings. The accuracy of the proposed method is 79.7% for the Kazakh language. The method can be used for different languages and consists of the following tasks: capturing a signal, detecting speech in it, recognizing words in a simplified transcription, determining word boundaries, comparing the simplified transcription with a code book, and constructing a hypothesis about the degree of speech emotionality. If emotions are present, the words are fully recognized and the emotions in the speech are identified. An advantage of this method is its suitability for widespread use, since it is not demanding on computational resources. The method can be applied wherever positive and negative emotions need to be recognized in a crowd, in public transport, schools, universities, etc. The experiment carried out demonstrated the effectiveness of the method. The results obtained will make it possible in the future to develop devices that begin to record and recognize a speech signal when, for example, negative emotions are detected in speech and, if necessary, transmit a message about potential threats or riots.
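The code-book comparison step described above can be illustrated with a minimal sketch. It assumes a hypothetical code book that maps simplified transcriptions of emotionally charged words to emotion labels; the entries, threshold, and labels below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): compare a simplified word
# transcription against a code book of emotionally charged words and form a
# hypothesis about the emotionality of the utterance.
from typing import Dict, List

# Hypothetical code book: simplified transcription -> emotion label
CODE_BOOK: Dict[str, str] = {
    "zhaksy": "positive",        # illustrative entries only
    "zhaman": "negative",
    "korkynyshty": "negative",
}

def emotionality_hypothesis(words: List[str], threshold: float = 0.2) -> str:
    """Return 'neutral' if too few code-book words occur in the utterance,
    otherwise the dominant emotion label among the matched words."""
    matches = [CODE_BOOK[w] for w in words if w in CODE_BOOK]
    if not words or len(matches) / len(words) < threshold:
        return "neutral"
    # Pick the most frequent emotion label among the matched words.
    return max(set(matches), key=matches.count)

# Usage example with simplified transcriptions of a short utterance.
print(emotionality_hypothesis(["bul", "emtihan", "zhaman", "korkynyshty"]))  # -> "negative"
```

In this sketch, a "neutral" hypothesis suppresses full word recognition, which mirrors the idea that complete recognition and emotion identification are triggered only when enough emotionally charged words are detected.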
Keywords: affective computing; artificial intelligence; crowd emotion recognition; distance learning; e-learning; emotion recognition; speech recognition.
Conflict of interest statement
The authors declare no conflict of interest.