Development of a Summative Examination with Subject Matter Expert Validation

Ashley N Castleberry et al. Am J Pharm Educ. 2016 Mar 25;80(2):29. doi: 10.5688/ajpe80229.

Abstract

Objective. To describe the development, implementation, and impact of a summative examination on student learning and programmatic curricular outcomes.

Methods. The summative examination was developed using a systematic approach. Item reliability was evaluated using standard psychometric analyses. Content validity was assessed using necessity scoring as determined by subject matter experts.

Results. Almost 700 items written by 37 faculty members were evaluated. Passing standards increased annually (45% in 2009 to 67% in 2014) as a result of targeting item difficulty and necessity scores. The percentage of items exhibiting discrimination above 0.1 increased to 100% over the four years. The proportion of items with necessity scores above 2.75 out of 4 increased from 65% to 100% over six years of examination administration.

Conclusion. This examination successfully assessed student and curricular outcomes. The faculty member engagement observed in this process supports a culture of assessment. This type of examination could be beneficial to other programs.

Keywords: assessment; examination development; subject matter experts; summative examination.
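
As a rough sketch of the kind of item analysis described in the Methods (the article does not publish its analysis code, and the choice of a point-biserial index as the discrimination statistic here is an assumption), the following Python snippet computes item difficulty and discrimination for one dichotomously scored item and checks it against the 0.1 discrimination threshold mentioned in the Results. All names and sample data are hypothetical.

import statistics

def item_difficulty(item_scores):
    # Proportion of examinees answering the item correctly (scores coded 0/1)
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    # Correlation between a 0/1 item score and the total examination score;
    # a common discrimination statistic (assumed here, not stated in the article)
    n = len(item_scores)
    p = sum(item_scores) / n          # proportion correct
    q = 1 - p
    sd_total = statistics.pstdev(total_scores)
    if sd_total == 0 or p in (0, 1):
        return 0.0
    mean_correct = statistics.mean([t for s, t in zip(item_scores, total_scores) if s == 1])
    mean_all = statistics.mean(total_scores)
    return (mean_correct - mean_all) / sd_total * (p / q) ** 0.5

# Hypothetical data: one item answered by five examinees, with their total exam scores
item = [1, 0, 1, 1, 0]
totals = [88, 52, 75, 91, 60]
print("difficulty:", item_difficulty(item))                               # 0.6
print("meets discrimination threshold:", point_biserial(item, totals) >= 0.1)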

Figures

Figure 1. Development and Administration of the Summative Examination.

Figure 2. Passing Standard Calculations.

Figure 3. Sample Student Report Card.

Figure 4. Necessity Scores by Year. Subject matter experts’ second response to the following question for each item: To what extent is the knowledge, skill, or ability measured by this item necessary for minimally acceptable performance as a rising third-year student? (1=item is not necessary, 2=item is useful, 3=item is important, 4=item is essential or critical).
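
For the necessity ratings summarized in Figure 4, a minimal sketch of the aggregation, assuming (as the abstract implies but does not spell out) that each item's necessity score is the mean of the subject matter experts' 1-4 ratings and that 2.75 serves as the benchmark. The number of experts and the ratings below are hypothetical.

from statistics import mean

NECESSITY_BENCHMARK = 2.75  # cutoff reported in the abstract

def necessity_score(ratings):
    # Mean SME rating on the 1-4 scale defined in the Figure 4 caption
    return mean(ratings)

# Hypothetical second-round ratings from six subject matter experts for two items
sme_ratings = {
    "item_101": [4, 3, 3, 4, 3, 4],
    "item_102": [2, 3, 2, 2, 3, 2],
}
for item_id, ratings in sme_ratings.items():
    score = necessity_score(ratings)
    flag = "above benchmark" if score > NECESSITY_BENCHMARK else "below benchmark"
    print(item_id, round(score, 2), flag)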

References

    1. Anderson HM, Guadalupe A, Bird E, Moore DL. A review of educational assessment. Am J Pharm Educ. 2005;69(1):Article 12.
    1. Formative Assessment. The Glossary of Education Reform, Great Schools Partnership. http://edglossary.org/formative-assessment/. Accessed October 20, 2014.
    1. Alston GL, Love BL. Development of a reliable, valid annual skills mastery assessment examination. Am J Pharm Educ. 2010;74(5):Article 80. - PMC - PubMed
    1. Plaza CM. Progress examinations in pharmacy education. Am J Pharm Educ. 2007;71(4):Article 66. - PMC - PubMed
    1. Szilagyi JE. Curricular progress assessments: the MileMarker. Am J Pharm Educ. 2008;72(5):Article 101. - PMC - PubMed
