Developing an Assessment Framework for Essential Internal Medicine Subspecialty Topics

Natasha Chida et al. J Grad Med Educ. 2018 Jun;10(3):331-335. doi: 10.4300/JGME-D-17-00377.1.

Abstract

Background: Assessing residents by direct observation is the preferred assessment method for infrequently encountered subspecialty topics, but this is logistically challenging.

Objective: We developed an assessment framework for internal medicine (IM) residents in subspecialty topics, using tuberculosis diagnosis for proof of concept.

Methods: We used a 4-step process at 8 academic medical centers that entailed (1) creating a 10-item knowledge assessment tool; (2) pilot testing on a sample of 129 IM residents and infectious disease fellow volunteers to evaluate validity evidence; (3) implementing the final tool among 886 resident volunteers; and (4) assessing outcomes via retrospective chart review. Outcomes included tool score, item performance, and rates of obtaining recommended diagnostics.
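The outcomes described here (tool score, item performance) reduce to tabulating a 0/1 response matrix. A minimal sketch of that tabulation, assuming a hypothetical NumPy array in which rows are respondents and columns are the 10 items; the values are simulated for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical 0/1 response matrix: rows = respondents, columns = the 10 items.
# Simulated values for illustration only, not the study's data.
rng = np.random.default_rng(0)
responses = rng.binomial(1, 0.44, size=(541, 10))

scores = responses.sum(axis=1)        # total score per respondent (0-10)
item_rates = responses.mean(axis=0)   # correct-response rate per item

print(f"mean score {scores.mean():.1f} (SD {scores.std(ddof=1):.1f})")
print(f"items answered correctly by <= 57% of respondents: {(item_rates <= 0.57).sum()}")
```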

Results: Following tool development, 10 infectious disease experts provided content validity. Pilot testing showed higher mean scores for fellows compared with residents (7 [SD = 1.8] versus 3.8 [SD = 1.7], respectively, P < .001) and a satisfactory Kuder-Richardson Formula 20 (0.72). Implementation of the tool revealed a 14-minute (SD = 2.0) mean completion time, 61% (541 of 886) response rate, 4.4 (SD = 1.6) mean score, and ≤ 57% correct response rate for 9 of 10 items. On chart review (n = 343), the rate of obtaining each recommended test was ≤ 43% (113 of 261), except for chest x-rays (96%, 328 of 343).
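The Kuder-Richardson Formula 20 reported for the pilot is a standard internal-consistency coefficient for dichotomous items. A sketch of how it is typically computed (the function name and data layout are assumptions, not the authors' code):

```python
import numpy as np

def kr20(item_matrix):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item responses."""
    x = np.asarray(item_matrix, dtype=float)
    k = x.shape[1]                          # number of items (10 in this tool)
    p = x.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p                             # proportion incorrect per item
    total_var = x.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)
```

The fellow-versus-resident mean comparison would typically be an independent-samples t-test (e.g., scipy.stats.ttest_ind), though the abstract does not state which test produced the reported P value.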

Conclusions: Our assessment framework revealed knowledge and practice gaps in tuberculosis diagnosis in IM residents. Adopting this approach may help ensure assessment is not limited to frequently encountered topics.


Conflict of interest statement

Conflict of interest: The authors declare they have no competing interests.

