A practical approach to programmatic assessment design
- PMID: 28120259
- PMCID: PMC5663798
- DOI: 10.1007/s10459-017-9756-3
Abstract
Assessment of complex tasks that integrate several competencies calls for a programmatic design approach. Because single instruments do not provide the information required to reach a robust judgment of integral performance, 73 guidelines for programmatic assessment design were previously developed. When these interrelated guidelines are applied simultaneously, it is challenging to keep a clear overview of all assessment activities. The goal of this study was to provide practical support for applying a programmatic approach to assessment design that is not bound to any specific educational paradigm. The guidelines were first applied in a postgraduate medical training setting, and a process analysis was conducted. This resulted in the identification of four steps for programmatic assessment design: evaluation, contextualisation, prioritisation and justification. The (re)design process starts with describing the assessment environment in sufficient detail and formulating the principal purpose. Key stakeholders with sufficient (assessment) expertise need to be involved in analysing strengths and weaknesses and identifying developmental needs. Central governance is essential to balance efforts and stakes against the principal purpose and to decide on the prioritisation of design decisions and the selection of relevant guidelines. Finally, justification of assessment design decisions, quality assurance and external accountability close the loop, ensuring sound underpinning and continuous improvement of the assessment programme.
Keywords: Assessment quality; Instructional design; Medical education; Programmatic assessment; Quality assurance.
Conflict of interest statement
The authors report no conflicts of interest; no funding was received for the present study. The authors alone are responsible for the content and writing of this article.