STEM exam performance: Open- versus closed-book methods in the large language model era
- PMID: 39496553
- PMCID: PMC11663729
- DOI: 10.1111/tct.13839
Abstract
Background: The COVID-19 pandemic accelerated the shift to remote learning, heightening scrutiny of open-book examinations (OBEs) versus closed-book examinations (CBEs) within science, technology, engineering and mathematics (STEM) education. This study evaluates the efficacy of OBEs compared with CBEs on student performance and perceptions within STEM subjects, considering the emerging influence of sophisticated large language models (LLMs) such as GPT-3.
Methods: Adhering to PRISMA guidelines, this systematic review analysed peer-reviewed articles published from 2013 onwards, focusing on the impact of OBEs and CBEs on university STEM students. Standardised mean differences were pooled using a random-effects model, with heterogeneity evaluated by the I² statistic, Cochran's Q test and the tau² statistic.
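The random-effects pooling described above can be sketched with the standard DerSimonian–Laird estimator, which also yields Cochran's Q, tau² and I². This is an illustrative implementation only; the effect sizes and variances fed to it are hypothetical values invented for the example, not data from the review.

```python
def dersimonian_laird(effects, variances):
    """Pool standardised mean differences under a DerSimonian-Laird
    random-effects model; returns (pooled estimate, tau^2, Q, I^2)."""
    # Fixed-effect (inverse-variance) weights and pooled estimate
    w = [1.0 / v for v in variances]
    sw = sum(w)
    y_fe = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    Q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    # Between-study variance tau^2 (truncated at zero)
    C = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (Q - df) / C)
    # I^2: percentage of total variability due to heterogeneity
    I2 = max(0.0, (Q - df) / Q) * 100.0 if Q > 0 else 0.0
    # Random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2, Q, I2

# Hypothetical SMDs (OBE minus CBE) and their sampling variances
smd = [0.40, 0.15, 0.90, -0.10, 0.55]
var = [0.02, 0.03, 0.05, 0.04, 0.02]
pooled, tau2, Q, I2 = dersimonian_laird(smd, var)
print(f"pooled SMD={pooled:.2f}, tau^2={tau2:.3f}, Q={Q:.1f}, I^2={I2:.0f}%")
```

In practice a package such as `metafor` (R) or `statsmodels` (Python) would be used, but the arithmetic above is the core of the method.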
Results: Analysis of eight studies revealed mixed outcomes. Meta-analysis showed that OBEs generally yielded higher scores than CBEs, albeit with substantial heterogeneity (I² = 97%). Observational studies displayed more pronounced effects, with noted concerns over technical difficulties and instances of cheating.
Discussion: Results suggest that OBEs assess competencies more aligned with current educational paradigms than CBEs. However, the emergence of LLMs poses new challenges to OBE validity by simplifying the generation of comprehensive answers, impacting academic integrity and examination fairness.
Conclusions: While OBEs are better suited to contemporary educational needs, the influence of LLMs on their effectiveness necessitates further study. Institutions should prudently consider the competencies assessed by OBEs, particularly in light of evolving technological landscapes. Future research should explore the integrity of OBEs in the presence of LLMs to ensure fair and effective student evaluations.
© 2024 The Author(s). The Clinical Teacher published by Association for the Study of Medical Education and John Wiley & Sons Ltd.
Conflict of interest statement
The authors declare no potential conflicts of interest with respect to the publication, research, and/or authorship of this article.