Randomized Controlled Trial

J Gen Intern Med. 2009 Jan;24(1):74-9. doi: 10.1007/s11606-008-0842-3. Epub 2008 Nov 11.

Effect of rater training on reliability and accuracy of mini-CEX scores: a randomized, controlled trial

David A Cook et al.

Abstract

Background: Mini-CEX scores assess resident competence. Rater training might improve mini-CEX score interrater reliability, but evidence is lacking.

Objective: To evaluate a rater training workshop by assessing the interrater reliability and accuracy of mini-CEX scores.

Design: Randomized trial (immediate versus delayed workshop) and single-group pre/post study (randomized groups combined).

Setting: Academic medical center.

Participants: Fifty-two internal medicine clinic preceptors (31 randomized and 21 additional workshop attendees).

Intervention: The workshop included rater error training, performance dimension training, behavioral observation training, and frame-of-reference training, delivered through lecture, video, and facilitated discussion. The delayed group received no intervention until after the posttest.

Measurements: Mini-CEX ratings at baseline (just before the workshop for the workshop group) and four weeks later, using videotaped resident-patient encounters; mini-CEX ratings of live resident-patient encounters during the year preceding and the year following the workshop; and rater confidence in using the mini-CEX.

Results: Among 31 randomized participants, interrater reliabilities in the delayed group (baseline intraclass correlation coefficient [ICC] 0.43, follow-up 0.53) and workshop group (baseline 0.40, follow-up 0.43) were not significantly different (p = 0.19). Mean ratings were similar at baseline (delayed 4.9 [95% confidence interval 4.6-5.2], workshop 4.8 [4.5-5.1]) and follow-up (delayed 5.4 [5.0-5.7], workshop 5.3 [5.0-5.6]; p = 0.88 for interaction). For the entire cohort, rater confidence (1 = not confident, 6 = very confident) improved from mean (SD) 3.8 (1.4) to 4.4 (1.0), p = 0.018. Interrater reliability for ratings of live encounters (entire cohort) was higher after the workshop (ICC 0.34) than before (ICC 0.18) but the standard error of measurement was similar for both periods.
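For readers unfamiliar with the reliability statistics quoted above, an intraclass correlation coefficient (ICC) can be computed from a targets-by-raters table of scores via a one-way random-effects ANOVA. The sketch below is a minimal illustration of that general formula, not a reproduction of the authors' actual variance-components analysis, which may have used a different ICC model:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1).

    ratings: array of shape (n_targets, k_raters), where each row holds
    the scores that k raters gave one target (e.g. one encounter).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    # Between-targets mean square: variance attributable to real
    # differences among the targets being rated.
    msb = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    # Within-targets mean square: disagreement among raters about
    # the same target (error variance).
    msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

An ICC near 1 means raters agree almost perfectly; values like the 0.18-0.53 reported above indicate that much of the score variance reflects rater disagreement rather than true differences among residents.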

Conclusions: Rater training did not improve interrater reliability or accuracy of mini-CEX scores.

Clinical trials registration: clinicaltrials.gov identifier NCT00667940


Figures

Figure 1. Study design and flow of study participants. Each pretest and posttest consisted of 16 videotaped encounters (nine scripted and seven unscripted). Analyses for interrater reliability and halo effect used all 16 encounters. Analyses for accuracy (discrimination, percent agreement, and chance-corrected agreement) used only the nine scripted encounters. Randomized participants completed the baseline survey with the pretest.
