Enhanced AlexNet with Gabor and Local Binary Pattern Features for Improved Facial Emotion Recognition
- PMID: 40573719
- PMCID: PMC12197208
- DOI: 10.3390/s25123832
Abstract
Facial emotion recognition (FER) is vital for improving human-machine interaction, serving as the foundation for AI systems that integrate cognitive and emotional intelligence and helping bridge the gap between mechanical processes and human emotions. Considering the low hardware specifications often encountered in real-world deployments, this study leverages recent advances in deep learning to propose an enhanced FER model. The model exploits facial texture information through Gabor and Local Binary Pattern (LBP) feature extraction; by integrating these features into a specially modified AlexNet architecture, our approach classifies facial emotions more accurately and adapts well to varied operational conditions. To validate the proposed model, we evaluated it on the FER2013 and RAF-DB benchmark datasets, where it achieved accuracies of 98.10% and 93.34%, with standard deviations of 1.63% and 3.62%, respectively. On FER2013, the model attained a precision of 98.2%, a recall of 97.9%, and an F1-score of 98.0%; on RAF-DB, it achieved a precision of 93.54%, a recall of 93.12%, and an F1-score of 93.34%. These results underscore the model's robustness and its capability to deliver high-precision emotion recognition, making it well suited for deployment in environments where hardware limitations are a critical concern.
Keywords: AlexNet; deep learning; emotion recognition; feature extraction.
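As a rough illustration of the two hand-crafted texture descriptors the abstract names, the sketch below builds a real-valued Gabor kernel and computes basic 8-neighbour LBP codes with NumPy. All parameter values (kernel size, sigma, wavelength, neighbour ordering) are illustrative assumptions, not the settings used in the paper, and the paper's actual pipeline feeds such features into a modified AlexNet rather than using them standalone.

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5):
    """Real-valued Gabor kernel; parameters are illustrative, not the paper's."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates by the filter orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    # Gaussian envelope modulated by a cosine carrier of wavelength lambd.
    return np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2)) \
        * np.cos(2 * np.pi * xr / lambd)

def lbp_codes(img):
    """8-neighbour LBP codes for the interior pixels of a 2-D grayscale array."""
    c = img[1:-1, 1:-1]  # center pixels (border excluded)
    # Clockwise neighbour offsets starting at the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        # Set the bit where the neighbour is >= the center intensity.
        codes |= (nb >= c).astype(np.uint8) << bit
    return codes
```

In a pipeline like the one described, a face crop would typically be convolved with a bank of such Gabor kernels at several orientations and scales, and a histogram of the LBP codes would summarize local texture; both responses are then supplied to the CNN alongside (or in place of) raw pixels.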
Conflict of interest statement
The authors declare no conflicts of interest.
Similar articles
- Facial Landmark-Driven Keypoint Feature Extraction for Robust Facial Expression Recognition. Sensors (Basel). 2025 Jun 16;25(12):3762. doi: 10.3390/s25123762. PMID: 40573649. Free PMC article.
- Facial Emotion Recognition of 16 Distinct Emotions From Smartphone Videos: Comparative Study of Machine Learning and Human Performance. J Med Internet Res. 2025 Jul 2;27:e68942. doi: 10.2196/68942. PMID: 40601921. Free PMC article.
- A deep learning approach to direct immunofluorescence pattern recognition in autoimmune bullous diseases. Br J Dermatol. 2024 Jul 16;191(2):261-266. doi: 10.1093/bjd/ljae142. PMID: 38581445.
- New Trends in Emotion Recognition Using Image Analysis by Neural Networks, A Systematic Review. Sensors (Basel). 2023 Aug 10;23(16):7092. doi: 10.3390/s23167092. PMID: 37631629. Free PMC article.
- Systematic Review of Emotion Detection with Computer Vision and Deep Learning. Sensors (Basel). 2024 May 28;24(11):3484. doi: 10.3390/s24113484. PMID: 38894274. Free PMC article.