J Gen Intern Med. 2022 Jul;37(9):2224-2229. doi: 10.1007/s11606-022-07513-5. Epub 2022 Jun 16.

REACT: Rapid Evaluation Assessment of Clinical Reasoning Tool

Brian D Peterson et al. J Gen Intern Med. 2022 Jul.

Abstract

Introduction: Clinical reasoning encompasses the process of data collection, synthesis, and interpretation to generate a working diagnosis and make management decisions. Situated cognition theory suggests that knowledge is relative to contextual factors, and clinical reasoning in urgent situations is framed by the pressure of consequential, time-sensitive diagnostic and management decisions. These unique aspects of urgent clinical care may limit the effectiveness of traditional tools to assess, teach, and remediate clinical reasoning.

Methods: Using two validated frameworks, a multidisciplinary group of clinicians trained to remediate clinical reasoning and experienced in urgent clinical care encounters designed the novel Rapid Evaluation Assessment of Clinical Reasoning Tool (REACT). REACT is a behaviorally anchored assessment tool that scores five domains and is used to provide formative feedback to learners evaluating patients during urgent clinical situations. A pilot study was performed to assess fourth-year medical students during simulated urgent clinical scenarios. Learners were scored using REACT by a separate, multidisciplinary group of clinician educators with no additional training in the clinical reasoning process. REACT scores were analyzed for internal consistency across raters and observations.

Results: Overall internal consistency across the 41 patient simulations, as measured by Cronbach's alpha, was 0.86. Inter-rater reliability of the overall score, assessed with a weighted kappa statistic, was moderate at 0.56.
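
The two reported statistics are standard reliability measures. Below is a minimal Python sketch of how Cronbach's alpha across the five REACT domains and a weighted kappa between two raters' overall scores could be computed; the simulated score matrices and the quadratic weighting scheme are assumptions for illustration only, not the authors' data or analysis code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score


def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (observations x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items (here, REACT domains)
    item_var = scores.var(axis=0, ddof=1)        # variance of each item across observations
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed (total) score
    return (k / (k - 1)) * (1.0 - item_var.sum() / total_var)


# Hypothetical data: 41 simulated encounters scored on 5 domains (1-5 scale).
# Random values are used here, so the printed numbers will not match the paper's.
rng = np.random.default_rng(0)
domain_scores = rng.integers(1, 6, size=(41, 5))
print(f"Cronbach's alpha: {cronbach_alpha(domain_scores):.2f}")

# Hypothetical overall scores from two raters on the same 41 encounters.
# Quadratic weighting is an assumption; the abstract does not state the scheme.
rater_a = rng.integers(1, 6, size=41)
rater_b = np.clip(rater_a + rng.integers(-1, 2, size=41), 1, 5)
print(f"Weighted kappa: {cohen_kappa_score(rater_a, rater_b, weights='quadratic'):.2f}")
```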

Discussion: To our knowledge, REACT is the first tool designed specifically for formative assessment of a learner's clinical reasoning performance during simulated urgent clinical situations. With evidence of reliability and content validity, this tool guides feedback to learners during high-risk urgent clinical scenarios, with the goal of reducing diagnostic and management errors to limit patient harm.

Conflict of interest statement

The authors have no conflicts of interest to disclose.

Figures

Figure 1. Rapid Evaluation Assessment of Clinical Reasoning Tool (REACT).
