Comparison of formula and number-right scoring in undergraduate medical training: a Rasch model analysis

Dario Cecilio-Fernandes et al. BMC Med Educ. 2017 Nov 9;17(1):192. doi: 10.1186/s12909-017-1051-8.

Comparative Study

Abstract

Background: Progress testing is an assessment tool used to periodically assess all students at the end-of-curriculum level. Because students cannot know everything, it is important that they recognize their lack of knowledge. For that reason, the formula-scoring method has traditionally been used. However, where partial knowledge needs to be taken into account, the number-right scoring method is used. Research comparing the two methods has yielded conflicting results. To our knowledge, all of these studies analyzed the data using Classical Test Theory or Generalizability Theory. In contrast, we explore the use of the Rasch model to compare the two methods.
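
As a concrete illustration of the two scoring rules contrasted above, the sketch below computes both for a hypothetical student. The five-option item format and the score counts are assumptions chosen for illustration, not figures taken from the study.

```python
def number_right_score(correct: int) -> float:
    """Number-right scoring: one point per correct answer, no penalty for errors."""
    return float(correct)

def formula_score(correct: int, wrong: int, options: int = 5) -> float:
    """Classical formula scoring: R - W/(k-1). Random guessing then has an
    expected gain of zero, so omitting an unknown item is never worse than guessing."""
    return correct - wrong / (options - 1)

# Hypothetical student on a 200-item test: 120 correct, 50 wrong, 30 omitted.
print(number_right_score(120))            # 120.0
print(formula_score(120, 50, options=5))  # 107.5
```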

Methods: A 2 × 2 crossover design was used, in which 298 students from four medical schools participated. A sample of 200 previously used questions from the progress tests was selected. The data were analyzed using the Rasch model, which provides fit parameters, reliability coefficients, and a response option analysis.
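
The abstract does not state which software was used for the Rasch analysis; as a minimal sketch of the underlying model, the dichotomous Rasch model gives the probability of a correct response from the difference between person ability and item difficulty, both on the logit scale. The example values below are hypothetical.

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """Dichotomous Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b)),
    where theta is person ability and b is item difficulty (logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical values: a person one logit above the item's difficulty.
print(round(rasch_probability(theta=0.5, b=-0.5), 3))  # 0.731
```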

Results: The fit parameters fell within the optimal interval of 0.50 to 1.50, with means around 1.00. The person and item reliability coefficients were higher in the number-right condition than in the formula-scoring condition. The response option analysis showed that the majority of dysfunctional items emerged in the formula-scoring condition.
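
For readers unfamiliar with Rasch fit statistics, the sketch below shows how infit and outfit mean-squares are conventionally computed from observed responses and model-predicted probabilities; the response and probability values are hypothetical, not data from the study.

```python
from typing import Sequence, Tuple

def fit_mean_squares(responses: Sequence[int], probs: Sequence[float]) -> Tuple[float, float]:
    """Outfit MNSQ: unweighted mean of squared standardized residuals.
    Infit MNSQ: squared residuals weighted by the model variance p*(1-p).
    Values near 1.00 indicate good fit; 0.50 to 1.50 is the range treated as optimal."""
    squared_z = [(x - p) ** 2 / (p * (1 - p)) for x, p in zip(responses, probs)]
    variances = [p * (1 - p) for p in probs]
    outfit = sum(squared_z) / len(squared_z)
    infit = sum((x - p) ** 2 for x, p in zip(responses, probs)) / sum(variances)
    return infit, outfit

# Hypothetical responses (1 = correct) and Rasch-predicted probabilities.
print(fit_mean_squares([1, 0, 1, 1], [0.7, 0.4, 0.9, 0.6]))
```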

Conclusions: The findings of this study support the use of number-right scoring over formula scoring. Rasch model analyses showed that tests scored with the number-right method have better psychometric properties than tests scored with the formula method. However, choosing the appropriate scoring method should depend not only on psychometric properties but also on self-directed test-taking strategies and metacognitive skills.

Keywords: Assessment; Construct-irrelevant variance; Formula scoring; Multiple choice questions; Number-right scoring; Rasch model; Reliability; Validity.

Ethics approval and consent to participate

The data were collected for another study at a time when there was no formal ethical approval process for such studies, so ethical approval was not sought. An ethical approval committee now exists, but reanalysis of historical data is exempt from review. Our work was carried out in accordance with the Declaration of Helsinki and the privacy policy of the University of Groningen. Before the analysis, all data were anonymized and handled confidentially.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Figures

Fig. 1: Map of question difficulty and student ability for Test 1. The left-hand side shows questions under the formula-scoring method and the right-hand side shows questions under the number-right scoring method.

Fig. 2: Map of question difficulty and student ability for Test 2. The left-hand side shows questions under the formula-scoring method and the right-hand side shows questions under the number-right scoring method.
