Automatic Facial Expression Recognition in Standardized and Non-standardized Emotional Expressions
- PMID: 34025503
- PMCID: PMC8131548
- DOI: 10.3389/fpsyg.2021.627561
Abstract
Emotional facial expressions can inform researchers about an individual's emotional state. Recent technological advances open up new avenues for automatic Facial Expression Recognition (FER). Based on machine learning, such technology can greatly increase the amount of data that can be processed. FER is now easily accessible and has been validated for the classification of standardized prototypical facial expressions. However, its applicability to more naturalistic facial expressions remains uncertain. Hence, we test and compare the performance of three different FER systems (Azure Face API, Microsoft; Face++, Megvii Technology; FaceReader, Noldus Information Technology) with human emotion recognition (A) for standardized posed facial expressions (from prototypical inventories) and (B) for non-standardized acted facial expressions (extracted from emotional movie scenes). For the standardized images, all three systems classify basic emotions accurately (FaceReader is the most accurate), and they are mostly on par with human raters. For the non-standardized stimuli, performance drops markedly for all three systems, but Azure still performs similarly to humans. In addition, all systems and humans alike tend to misclassify some of the non-standardized emotional facial expressions as neutral. In sum, automated facial expression recognition can be an attractive alternative to human emotion recognition for both standardized and non-standardized emotional facial expressions. However, we also found limitations in accuracy for specific facial expressions; there is a clear need for thorough empirical evaluation to guide future developments in computer vision of emotional facial expressions.
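As an illustration of the comparison described above, the following minimal Python sketch shows how per-image emotion labels from a FER system (or from human raters) might be scored against the intended emotion of each stimulus. The evaluate_fer helper, the emotion label set, and the toy labels are illustrative assumptions, not the authors' actual analysis pipeline.

    # Hypothetical evaluation sketch (not the authors' pipeline): score the
    # emotion labels returned by one FER system, or by human raters, against
    # the intended emotion of each stimulus.
    from sklearn.metrics import accuracy_score, confusion_matrix

    # Assumed basic-emotion label set, including "neutral" to capture the
    # misclassification pattern reported in the abstract.
    EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

    def evaluate_fer(true_labels, predicted_labels):
        """Return overall accuracy and a confusion matrix
        (rows = intended emotion, columns = predicted emotion)."""
        acc = accuracy_score(true_labels, predicted_labels)
        cm = confusion_matrix(true_labels, predicted_labels, labels=EMOTIONS)
        return acc, cm

    # Toy example: a non-standardized "fear" clip labeled "neutral" by the system.
    truth = ["happiness", "fear", "anger", "sadness"]
    preds = ["happiness", "neutral", "anger", "sadness"]
    acc, cm = evaluate_fer(truth, preds)
    print(f"accuracy = {acc:.2f}")  # 0.75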
Keywords: automatic facial coding; facial expression recognition; human emotion recognition; naturalistic expressions; recognition of emotional facial expressions; software evaluation; specific emotions; standardized inventories.
Copyright © 2021 Küntzler, Höfling and Alpers.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.