Medical students' perception of changes in assessments implemented during the COVID-19 pandemic

Francesca Bladt et al. BMC Med Educ. 2022 Dec 7;22(1):844. doi: 10.1186/s12909-022-03787-9.

Abstract

Background: COVID-19 posed many challenges to medical education in the United Kingdom (UK). These included implementing assessments during 4 months of national lockdowns within a 2-year period, during which in-person education was prohibited. This study aimed to identify the medical school assessment formats that emerged under COVID-19 restrictions, investigate medical students' perspectives on them, and identify the factors influencing those perspectives.

Methods: The study consisted of two phases. The first was a questionnaire asking medical students about the assessment changes they experienced, their satisfaction with these changes and their preferences among the different assessment formats that emerged. The second was a series of semi-structured interviews with medical students across the UK, conducted to provide a deeper, contextualised understanding of the complex factors influencing their perspectives.

Results: In the questionnaire responses, open-book assessments received the highest satisfaction ratings and were the most preferred option. Furthermore, where assessments were cancelled, an increase in the weighting of future assessments was preferred over an increase in the weighting of past assessments. Students were also satisfied with formative or pass-fail assessments. Interview analyses indicate that although cancellation of summative assessments, or their replacement with formative assessments, reduced the heightened anxiety caused by additional COVID-19 stressors, students worried about possible future knowledge gaps resulting from reduced motivation for assessment-related study. Students' satisfaction was also affected by the timeliness of communication from universities regarding changes and by student involvement in the decision-making process. Perceived fairness and standardisation of test-taking conditions were ranked as the most important factors influencing student satisfaction, followed closely by familiarity with the format. In contrast, technical issues, lack of transparency about changes, perceived unfairness around invigilation, and uncertainty around changes in assessment format and weighting contributed to dissatisfaction.

Conclusions: Online open-book assessments were regarded as the ideal format across all participants, and students who experienced them were the most satisfied with their assessment change. They were perceived as the fairest format and the most authentic to real-life medical training. We seek to inform educators about student perceptions of successful assessment strategies under COVID-19 restrictions and to provide evidence for ongoing debate on assessment reform and innovation. While this work looks specifically at assessment changes during COVID-19, an understanding of the factors affecting student perceptions of assessment is applicable to examinations beyond COVID-19.

Keywords: Assessment; COVID-19; Examinations; Medical education; Medical students; Online examination; Open-book; Proctoring; Supervised.

Conflict of interest statement

There are no competing interests to declare.

Figures

Fig. 1
Frequency of assessment weighting changes and satisfaction with these changes. The frequency of assessment weighting changes (A) and the satisfaction of those experiencing each weighting change (B) during the COVID-19 pandemic were plotted for 119 medical students in bar graphs, ordered by satisfaction from most popular at the top to least popular at the bottom. A: The bar graph depicts the number of medical students who experienced each change in weighting. B: The bar graph shows the percentage satisfaction with each weighting change, with 5 (dark green) = extremely satisfied, 4 (light green) = satisfied, 3 (yellow) = neutral, 2 (orange) = unsatisfied and 1 (red) = extremely unsatisfied.
Fig. 2
Frequency of assessment format changes and satisfaction with these changes. The frequency of assessment format changes (A) and the satisfaction of those experiencing each format change (B) during the COVID-19 pandemic were plotted for 119 medical students in bar graphs, ordered by satisfaction from most popular at the top to least popular at the bottom. A: The bar graph depicts the number of medical students who experienced each assessment change. B: The bar graph shows the percentage satisfaction with each particular assessment change, with 5 (dark green) = extremely satisfied, 4 (light green) = satisfied, 3 (yellow) = neutral, 2 (orange) = unsatisfied and 1 (red) = extremely unsatisfied.
Fig. 3
Highest-ranked ideal assessment formats and assessment features affecting satisfaction. The bar graphs depict these in order from most popular first to least popular last. A: The bar graph depicts medical students' ranking of assessment formats from most preferred (1) to least preferred (7). B: The bar graph depicts medical students' ranking of assessment factors from most important in affecting their satisfaction (1) to least important (6).
Fig. 4
Proportion of students experiencing their preferred assessment format in the 2020 summer assessment period
