Reliability of simulation-based assessment for practicing physicians: performance is context-specific
- PMID: 33845837
- PMCID: PMC8042680
- DOI: 10.1186/s12909-021-02617-8
Abstract
Introduction: Even physicians who routinely work in complex, dynamic practices may be unprepared to optimally manage challenging critical events. High-fidelity simulation can realistically mimic critical, clinically relevant events; however, the reliability and validity of simulation-based assessment scores for practicing physicians have not been established.
Methods: Standardised complex simulation scenarios were developed and administered to board-certified, practicing anesthesiologists who volunteered to participate in an assessment study during formative maintenance of certification activities. A subset of the study population agreed to participate as the primary responder in a second scenario for this study. The physicians were assessed independently by trained raters on both teamwork/behavioural and technical performance measures. Generalisability and Decision studies were completed for the two scenarios with two raters.
Results: The behavioural score was no more reliable than the technical score. With two raters, more than 20 scenarios would be required to achieve a reliability estimate of 0.7. Increasing the number of raters for a given scenario would have little effect on reliability.
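The Decision-study projection behind these results can be illustrated with the standard generalisability coefficient for a person x scenario x rater design. The variance components below are hypothetical placeholders chosen only to show the pattern the authors report (reliability dominated by the person-by-scenario interaction, so adding scenarios helps far more than adding raters); they are not values from the paper.

```python
# Sketch of a Decision (D) study projection under generalisability theory.
# NOTE: the variance components are ILLUSTRATIVE assumptions, not study data.

def projected_reliability(var_p, var_ps, var_pr, var_residual,
                          n_scenarios, n_raters):
    """Generalisability coefficient (E-rho^2) for a crossed
    person x scenario x rater design: true-score variance over
    true-score variance plus averaged error variance."""
    error = (var_ps / n_scenarios
             + var_pr / n_raters
             + var_residual / (n_scenarios * n_raters))
    return var_p / (var_p + error)

# Hypothetical components: most variance sits in the person-by-scenario
# interaction, consistent with performance being context-specific.
components = dict(var_p=0.10, var_ps=0.60, var_pr=0.02, var_residual=0.28)

for n_s in (1, 2, 10, 20, 30):
    rho = projected_reliability(n_scenarios=n_s, n_raters=2, **components)
    print(f"{n_s:2d} scenarios, 2 raters: E-rho^2 = {rho:.2f}")
```

With these illustrative numbers, reliability stays below 0.7 even at 20 scenarios and only crosses it near 30, while doubling the raters at a fixed number of scenarios moves the coefficient only slightly, mirroring the qualitative conclusions in the Results.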
Conclusions: The performance of practicing physicians on simulated critical events may be highly context-specific. Realistic simulation-based assessment for practicing physicians is resource-intensive and may be best suited for individualised formative feedback. Moreover, aggregate data from a population of participants may have even greater impact if used to identify skill or knowledge gaps to be addressed by training programs and to inform continuing education improvements across the profession.
Keywords: Assessment; Competency; Continuing medical education; Feedback; Generalisability; Practicing physicians; Program evaluation; Simulation.
Conflict of interest statement
The authors declare that they have no competing interests.