J Gen Intern Med. 2022 Feb;37(3):507-512. doi: 10.1007/s11606-021-06805-6. Epub 2021 May 4.

Development of a Clinical Reasoning Documentation Assessment Tool for Resident and Fellow Admission Notes: a Shared Mental Model for Feedback

Verity Schaye et al. J Gen Intern Med. 2022 Feb.

Abstract

Background: Residents and fellows receive little feedback on their clinical reasoning documentation. Barriers include the lack of a shared mental model and variability in the reliability and validity of existing assessment tools. Of the existing tools, the IDEA assessment tool offers a robust assessment of clinical reasoning documentation focusing on four elements (interpretive summary, differential diagnosis, explanation of reasoning for the lead diagnosis, and explanation of reasoning for alternative diagnoses) but lacks descriptive anchors, threatening its reliability.

Objective: Our goal was to develop a valid and reliable assessment tool for clinical reasoning documentation, building on the IDEA assessment tool.

Design, participants, and main measures: The Revised-IDEA assessment tool was developed by four clinician educators through iterative review of admission notes written by medicine residents and fellows and subsequently piloted with additional faculty to ensure response process validity. A random sample of 252 notes from July 2014 to June 2017 written by 30 trainees across several chief complaints was rated. Three raters rated 20% of the notes to demonstrate internal structure validity. A quality cut-off score was determined using Hofstee standard setting.

Key results: The Revised-IDEA assessment tool includes the same four domains as the IDEA assessment tool with more detailed descriptive prompts, new Likert scale anchors, and a score range of 0-10. Intraclass correlation for the notes rated by all three raters was high at 0.84 (95% CI 0.74-0.90). Scores ≥6 were determined to demonstrate high-quality clinical reasoning documentation. Only 53% of notes (134/252) were high-quality.
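The interrater reliability above is reported as an intraclass correlation of 0.84 for the triple-rated notes. As a rough illustration only (the scores below are hypothetical, and the paper's exact ICC variant is not stated in the abstract), a two-way random-effects, absolute-agreement, single-rater ICC(2,1) can be computed from a notes-by-raters score matrix:

```python
def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    x is a list of rows (one per note), each with one score per rater."""
    n, k = len(x), len(x[0])
    grand = sum(sum(row) for row in x) / (n * k)
    row_means = [sum(row) / k for row in x]
    col_means = [sum(x[i][j] for i in range(n)) / n for j in range(k)]

    # Two-way ANOVA sums of squares.
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((v - grand) ** 2 for row in x for v in row)
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)             # between-notes mean square
    msc = ss_cols / (k - 1)             # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))  # residual mean square

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical Revised-IDEA scores (0-10): 4 notes, 3 raters.
scores = [[4, 5, 4], [7, 7, 8], [2, 3, 2], [9, 8, 9]]
print(round(icc_2_1(scores), 2))
```

ICC(2,1) treats both notes and raters as random samples, which matches a setting where any trained faculty rater might score any note.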

Conclusions: The Revised-IDEA assessment tool is reliable and easy to use; its descriptive anchors facilitate a shared mental model for feedback on clinical reasoning documentation in resident and fellow admission notes.

Keywords: assessment; clinical reasoning; documentation; feedback.

Conflict of interest statement

The authors declare that they do not have a conflict of interest.

Figures

Figure 1. The Revised-IDEA assessment tool for clinical reasoning documentation.

Figure 2. Revised-IDEA cut-off score ≥ 6 determined by the Hofstee standard-setting method. Dashed lines indicate the average minimally and maximally acceptable failure rates determined by the panel of 4 physicians. Dotted lines indicate the average minimally and maximally acceptable cut-off scores determined by the panel. A, where the minimally acceptable cut-off score intersects the maximally acceptable failure rate. B, where the maximally acceptable cut-off score intersects the minimally acceptable failure rate. C, the Revised-IDEA cut-off score, where the line joining A and B intersects the distribution curve.
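The Hofstee procedure described in the Figure 2 legend can be sketched numerically: average the panel's minimally and maximally acceptable cut-offs and failure rates, draw the line from A = (minimum cut-off, maximum failure rate) to B = (maximum cut-off, minimum failure rate), and locate point C where that line crosses the observed cumulative failure-rate curve. A minimal sketch, assuming hypothetical panel values and note scores (not the study's data):

```python
def hofstee_cutoff(scores, min_cut, max_cut, min_fail, max_fail):
    """Return the score where the A-B line crosses the observed
    cumulative failure-rate curve (point C in a Hofstee plot).
    Failure rates are percentages; scores are note scores (0-10 here)."""
    n = len(scores)

    # Observed failure rate at candidate cut-off c: % of notes scoring below c.
    def fail_rate(c):
        return 100.0 * sum(s < c for s in scores) / n

    # A-B line: from (min_cut, max_fail) down to (max_cut, min_fail).
    def line(c):
        t = (c - min_cut) / (max_cut - min_cut)
        return max_fail + t * (min_fail - max_fail)

    # Scan a fine grid of cut-offs; C is where the two curves are closest.
    best_c, best_gap = min_cut, float("inf")
    steps = 1000
    for i in range(steps + 1):
        c = min_cut + (max_cut - min_cut) * i / steps
        gap = abs(fail_rate(c) - line(c))
        if gap < best_gap:
            best_c, best_gap = c, gap
    return best_c

# Hypothetical inputs: uniform note scores 0-10, panel cut-offs 5-7,
# panel-acceptable failure rates 20%-60% (illustrative only).
notes = [i % 11 for i in range(110)]
cutoff = hofstee_cutoff(notes, 5.0, 7.0, 20.0, 60.0)
```

Because the observed failure-rate curve is a step function of the cut-off, a grid search over the panel's cut-off range is a simple way to find the crossing point without any curve fitting.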

