BMC Med Educ. 2021 Jun 13;21(1):342. doi: 10.1186/s12909-021-02759-9.

Comparison of electronic versus conventional assessment methods in ophthalmology residents; a learner assessment scholarship study

Hamidreza Hasani et al.
Abstract

Background: Assessment is a necessary part of postgraduate medical residency training. Assessment methods located at the "shows how" level of Miller's pyramid are believed to be more effective than conventional tools. In this study, we quantitatively compared electronic and conventional methods for assessing ophthalmology residents.

Methods: In this retrospective study, eight different conventional assessment methods, including residents' attendance, logbook, scholarship and research skills, journal club, outpatient department participation, Multiple Choice Question (MCQ) examination, Objective Structured Clinical Examination (OSCE), and professionalism/360-degree assessment (as one composite), were used to assess 24 ophthalmology residents of all grades. Electronic assessments, consisting of an online Patient Management Problem (e-PMP) and a modified electronic OSCE (me-OSCE) administered 3 weeks later, were also completed by each of the 24 residents. Quantitative analysis comparing the conventional and electronic assessment tools was then performed, statistically assessing the correlation between the two approaches.

Results: Twenty-four ophthalmology residents of different grades were included in this study. In the electronic assessment, the average e-PMP score (48.01 ± 12.40) was considerably lower than the average me-OSCE score (65.34 ± 17.11). The total average electronic score was 56.67 ± 11.28, while the total average conventional score was 80.74 ± 5.99. Female and male residents' average scores (electronic versus conventional) were 59.15 ± 12.32 versus 83.01 ± 4.95 and 55.19 ± 10.77 versus 79.38 ± 6.29, respectively. The correlation between the modified electronic OSCE and all conventional methods was not statistically significant (P-value > 0.05). The correlation between the e-PMP and six conventional methods, consisting of the professionalism/360-degree assessment tool, logbook, research skills, Multiple Choice Questions, outpatient department participation, and active journal club participation, was statistically significant (P-value < 0.05). The overall correlation between conventional and electronic methods was significant (P-value = 0.017).
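The correlations reported above are pairwise associations between residents' scores on two assessment methods, judged against a 0.05 significance threshold. The abstract does not state which correlation statistic was used; the sketch below assumes a Pearson correlation and uses illustrative placeholder scores, not study data.

```python
# Minimal sketch (not the authors' code) of how a correlation between total
# conventional and total electronic scores might be computed and tested.
# The score lists are hypothetical placeholders, one entry per resident.
from scipy.stats import pearsonr

conventional = [82.5, 79.1, 85.0, 76.3, 81.8, 78.9]
electronic = [60.2, 52.4, 68.9, 47.5, 59.3, 51.0]

r, p_value = pearsonr(conventional, electronic)
print(f"Pearson r = {r:.3f}, P-value = {p_value:.3f}")

# A correlation is treated as statistically significant when P-value < 0.05,
# matching the threshold used in the abstract.
```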

Conclusion: We conclude that the e-PMP can be used alongside all conventional tools and that, overall, e-assessment methods could replace the conventional methods currently in use. The combination of e-PMP and me-OSCE could serve as a replacement for currently used gold-standard assessment methods, including the 360-degree assessment.

Keywords: Assessment; Conventional; Electronic; Ophthalmology residents; Scholarship study.

Conflict of interest statement

The authors declare that they have no conflicts of interest.

Figures

Fig. 1
Modified Miller's pyramid adapted from Miller GE [10] and Cruess RL [11]. Legend: Modified (or amended) Miller's pyramid presented by Cruess RL et al. in 2016, after the original pyramid presented by Miller in 1990. A "to be" level is added at the apex of the amended Miller's pyramid compared to the original one.

Fig. 2
Mean scores achieved on the assessment tools by residents; *Professionalism/360-degree. Legend: Mean scores achieved by all residents on seven conventional, two electronic, and the total electronic and conventional assessment methods. As demonstrated, female residents scored higher on every assessment tool.

Fig. 3
Scatter plots indicating the nature of the data and its distribution. Legend: Scatter plots indicating the nature of the data, its distribution, and the outliers. a The X-axis is the total conventional score and the Y-axis is the total electronic score. b The X-axis is the me-OSCE score and the Y-axis is the standard-OSCE score. c The X-axis is the me-OSCE score and the Y-axis is the e-PMP score. d The X-axis is the standard-OSCE score and the Y-axis is the e-PMP score.

Fig. 4
Scatter plots indicating the nature of the data and its distribution with outliers removed. Legend: Scatter plots indicating the nature of the data and its distribution after outliers are removed. a The X-axis is the total conventional score and the Y-axis is the total electronic score. b The X-axis is the me-OSCE score and the Y-axis is the standard-OSCE score. c The X-axis is the me-OSCE score and the Y-axis is the e-PMP score. d The X-axis is the standard-OSCE score and the Y-axis is the e-PMP score.

References

    1. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235. doi: 10.1001/jama.287.2.226. - DOI - PubMed
    2. Arora S, Ashrafian H, Davis R, Athanasiou T, Darzi A, Sevdalis N. Emotional intelligence in medicine: a systematic review through the context of the ACGME competencies. Med Educ. 2010;44(8):749–764. doi: 10.1111/j.1365-2923.2010.03709.x. - DOI - PubMed
    3. Maeshiro R, Johnson I, Koo D, et al. Medical education for a healthier population: reflections on the Flexner report from a public health perspective. Acad Med. 2010;85(2):211–219. doi: 10.1097/ACM.0b013e3181c885d8. - DOI - PubMed
    4. Park YS, Hodges BD, Tekian A. Evaluating the paradigm shift from time-based toward competency-based medical education: implications for curriculum and assessment. In: Assessing Competence in Professional Performance across Disciplines and Professions. Springer; 2016. p. 411–425.
    5. Harden RM. AMEE guide no. 14: outcome-based education: part 1 - an introduction to outcome-based education. Medical Teacher. 1999;21(1):7–14. doi: 10.1080/01421599979969. - DOI - PubMed