Increased accessibility of computer-based testing for residency application to a hospital in Brazil with item characteristics comparable to paper-based testing: a psychometric study
- PMID: 39523575
- PMCID: PMC11637595
- DOI: 10.3352/jeehp.2024.21.32
Abstract
Purpose: With the coronavirus disease 2019 (COVID-19) pandemic, online high-stakes examinations became a viable alternative to in-person testing. This study evaluated the feasibility of computer-based testing (CBT) for medical residency applications in Brazil and its impact on item quality and applicants’ access compared with paper-based testing.
Methods: In 2020, an online CBT was administered at the Ribeirao Preto Clinical Hospital in Brazil. In total, 120 multiple-choice question items were constructed. Two years later, the exam was administered as paper-based testing, with a similar item construction process. Difficulty and discrimination indexes and point-biserial coefficients were computed under classical test theory; difficulty, discrimination, and guessing parameters were estimated under item response theory; and Cronbach’s α coefficient was calculated for each exam. Internet connection stability for applicants was monitored.
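As an illustration of the classical test theory statistics named above, the sketch below computes the difficulty index, discrimination index, point-biserial (corrected item-total) coefficient, and Cronbach’s α from a binary scored response matrix. It is a minimal example under the assumption of a NumPy array of 0/1 responses, not the authors’ actual analysis pipeline; the item response theory (3-parameter logistic) calibration they also report would require a dedicated IRT package.

```python
# Minimal sketch (not the study's pipeline): classical test theory item
# statistics for a binary response matrix X of shape (n_examinees, n_items),
# where X[i, j] = 1 if examinee i answered item j correctly, else 0.
import numpy as np

def item_statistics(X, top_frac=0.27):
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    total = X.sum(axis=1)

    # Difficulty index: proportion of examinees answering each item correctly.
    difficulty = X.mean(axis=0)

    # Discrimination index: item difficulty in the upper-scoring group minus
    # the lower-scoring group (conventionally the top/bottom 27% by total score).
    order = np.argsort(total)
    g = max(1, int(round(top_frac * n)))
    low, high = X[order[:g]], X[order[-g:]]
    discrimination = high.mean(axis=0) - low.mean(axis=0)

    # Point-biserial: correlation between each item score and the total score
    # with that item removed (corrected item-total correlation).
    point_biserial = np.array([
        np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(k)
    ])

    # Cronbach's alpha: internal-consistency reliability of the whole exam.
    item_var = X.var(axis=0, ddof=1)
    total_var = total.var(ddof=1)
    alpha = (k / (k - 1)) * (1 - item_var.sum() / total_var)

    return difficulty, discrimination, point_biserial, alpha
```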
Results: In 2020, 4,846 individuals (57.1% female, mean age of 26.64±3.37 years) applied to the residency program, versus 2,196 individuals (55.2% female, mean age of 26.47±3.20 years) in 2022. The CBT thus attracted 2,650 more applicants (a 120.7% increase), although the 2 cohorts differed significantly in demographic characteristics. In particular, there was a significant increase in applicants from more distant, lower-income Brazilian regions, such as the North (5.6% vs. 2.7%) and Northeast (16.9% vs. 9.0%). No significant differences were found between the 2 exams in difficulty and discrimination indexes, point-biserial coefficients, or Cronbach’s α coefficients.
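The regional shift reported above can be examined with a chi-square test of independence on the region-by-year contingency table. The sketch below is illustrative only: the counts are back-calculated from the reported percentages and totals (North and Northeast vs. all other regions), not taken from the study’s raw data, and the grouping into three categories is an assumption made for the example.

```python
# Hedged illustration: chi-square test comparing the regional distribution of
# applicants between the 2020 CBT (n = 4,846) and 2022 paper exam (n = 2,196).
# Counts are approximate, reconstructed from the reported percentages.
import numpy as np
from scipy.stats import chi2_contingency

counts_2020 = np.round(np.array([0.056, 0.169]) * 4846)  # North, Northeast
counts_2022 = np.round(np.array([0.027, 0.090]) * 2196)
other_2020 = 4846 - counts_2020.sum()
other_2022 = 2196 - counts_2022.sum()

table = np.array([
    np.append(counts_2020, other_2020),
    np.append(counts_2022, other_2022),
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```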
Conclusion: Online CBT with multiple-choice questions was a viable format for a residency application exam, improving accessibility without compromising exam integrity and quality.
Keywords: Brazil; COVID-19; Computers; Educational measurement; Feasibility studies; Internship and residency.
Conflict of interest statement
PF and PRAG are employees of eduCat, which developed the web-based platform. Otherwise, no potential conflict of interest relevant to this article was reported.