Adv Health Sci Educ Theory Pract. 2017 Dec;22(5):1169-1182. doi: 10.1007/s10459-017-9756-3. Epub 2017 Jan 24.

A practical approach to programmatic assessment design


A A Timmerman et al. Adv Health Sci Educ Theory Pract. 2017 Dec.

Abstract

Assessment of complex tasks integrating several competencies calls for a programmatic design approach. As single instruments do not provide the information required to reach a robust judgment of integral performance, 73 guidelines for programmatic assessment design were developed. When simultaneously applying these interrelated guidelines, it is challenging to keep a clear overview of all assessment activities. The goal of this study was to provide practical support for applying a programmatic approach to assessment design, not bound to any specific educational paradigm. The guidelines were first applied in a postgraduate medical training setting, and a process analysis was conducted. This resulted in the identification of four steps for programmatic assessment design: evaluation, contextualisation, prioritisation and justification. First, the (re)design process starts with sufficiently detailing the assessment environment and formulating the principal purpose. Key stakeholders with sufficient (assessment) expertise need to be involved in the analysis of strengths and weaknesses and the identification of developmental needs. Central governance is essential to balance efforts and stakes with the principal purpose and to decide on the prioritisation of design decisions and the selection of relevant guidelines. Finally, justification of assessment design decisions, quality assurance and external accountability close the loop, ensuring sound underpinning and continuous improvement of the assessment programme.

Keywords: Assessment quality; Instructional design; Medical education; Programmatic assessment; Quality assurance.


Conflict of interest statement

The authors report no conflicts of interest; no funding was received for the present study. The authors alone are responsible for the content and writing of this article.

Figures

Fig. 1 Methodological approach to case study, process analysis and deconstruction

