Student behavior analysis to measure engagement levels in online learning environments
- PMID: 34007342
- PMCID: PMC8119613
- DOI: 10.1007/s11760-021-01869-7
Abstract
Since the COVID-19 pandemic, few dispute the importance of smart online learning systems in the educational process. Measuring student engagement is a crucial step towards such systems: a smart online learning system can automatically adapt to learners' emotions and provide feedback about their motivation. In the last few decades, online learning environments have generated tremendous interest among researchers in computer-based education. The challenge researchers face is how to measure student engagement from learners' emotions. Interest has grown in computer vision and camera-based solutions as a technology that overcomes the limits of both human observation and the expensive equipment traditionally used to measure student engagement. Several solutions have been proposed to measure student engagement, but few take a behavior-based approach. In response to these issues, we propose a new automatic multimodal approach that measures student engagement levels in real time. To offer robust and accurate engagement measures, we combine and analyze three modalities representing students' behavior: emotions from facial expressions, keyboard keystrokes, and mouse movements. The solution operates in real time, provides the exact level of engagement, and relies on the least expensive equipment possible. We validate the proposed multimodal approach through three main experiments, covering single-, dual-, and multimodal configurations, on novel engagement datasets. We also build new, realistic student engagement datasets to validate our contributions. The multimodal approach achieves the highest accuracy (95.23%) and the lowest mean squared error (MSE) of 0.04.
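The abstract describes a late-fusion multimodal pipeline: per-modality features (facial-emotion probabilities from a CNN, keystroke statistics, mouse-movement statistics) are combined and fed to a classifier that outputs an engagement level. The sketch below illustrates that idea only; it is not the authors' implementation. The feature dimensions, the four engagement levels, the synthetic data, and the MLP fusion classifier are all assumptions made for the example.

```python
# Minimal late-fusion sketch for multimodal engagement estimation.
# Illustrative only: feature dimensions, label scheme, and classifier
# are assumptions, not the paper's actual architecture.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, mean_squared_error

rng = np.random.default_rng(0)
n_samples = 600

# Synthetic stand-ins for the three modalities named in the abstract:
face = rng.random((n_samples, 7))    # hypothetical CNN emotion probabilities
keys = rng.random((n_samples, 4))    # hypothetical keystroke statistics
mouse = rng.random((n_samples, 5))   # hypothetical mouse-movement statistics

# Hypothetical engagement labels (0 = disengaged .. 3 = highly engaged).
y = rng.integers(0, 4, size=n_samples)

# Late fusion: concatenate per-modality features into one vector per sample.
X = np.hstack([face, keys, mouse])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

# The paper reports accuracy and MSE; both apply to ordinal engagement levels.
print("accuracy:", accuracy_score(y_te, pred))
print("MSE over engagement levels:", mean_squared_error(y_te, pred))
```

With real data, the random arrays would be replaced by features extracted per time window from the webcam stream and input-device logs; late fusion keeps each modality's extractor independent, so a modality can be dropped (e.g., no webcam) without retraining the others from scratch.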
Keywords: Academic facial emotions; Affective model; Convolutional neural network (CNN); Engagement level; Keyboard and mouse behaviors.
© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2021.
Conflict of interest statement
Competing interests: The authors declare that they have no conflict of interest.