J Grad Med Educ. 2015 Dec;7(4):567-73.
doi: 10.4300/JGME-D-14-00613.1.

Development and Validation of an Assessment Tool for Competency in Critical Care Ultrasound

Paru Patrawalla et al. J Grad Med Educ. 2015 Dec.

Abstract

Background: Point-of-care ultrasound is an emerging technology in critical care medicine. Despite requirements for critical care medicine fellowship programs to demonstrate knowledge and competency in point-of-care ultrasound, tools to guide competency-based training are lacking.

Objective: We describe the development and validity arguments of a competency assessment tool for critical care ultrasound.

Methods: A modified Delphi method was used to develop behaviorally anchored checklists for 2 ultrasound applications: "Perform deep venous thrombosis study (DVT)" and "Qualify left ventricular function using parasternal long axis (PSLA) and parasternal short axis (PSSA) views (Echo)." One live rater and 1 video rater evaluated the performance of 28 fellows; a second video rater evaluated a subset of 10 fellows. Validity evidence for content, response process, and internal consistency was assessed.

Results: An expert panel finalized the checklists after 2 rounds of a modified Delphi method. The DVT checklist consisted of 13 items, including 1 global rating scale (GRS). The Echo checklist consisted of 14 items, including 1 GRS for each of the 2 views. Interrater reliability between the live and video rater, evaluated with Cohen's kappa, was 1.00 for the DVT GRS, 0.44 for the PSLA GRS, and 0.58 for the PSSA GRS. Cronbach α was 0.85 for DVT and 0.92 for Echo.
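The reliability statistics reported above can be reproduced from raw checklist ratings. The fellows' actual rating data are not given in the abstract, so the following is only an illustrative sketch, in standard-library Python, of how Cohen's kappa (interrater agreement on a categorical GRS) and Cronbach's α (internal consistency across checklist items) are computed; the example inputs are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters' categorical ratings of the same subjects."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    # Observed proportion of agreement
    po = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement expected from each rater's marginal frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    pe = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    return (po - pe) / (1 - pe)

def cronbach_alpha(items):
    """Cronbach's alpha; `items` is a list of per-item score lists,
    one inner list per checklist item, one entry per examinee."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Hypothetical pass/fail GRS ratings for 5 fellows by two raters
live  = [1, 0, 1, 1, 0]
video = [1, 0, 0, 1, 0]
print(round(cohens_kappa(live, video), 2))

# Hypothetical 3 checklist items scored for 4 fellows
scores = [[1, 2, 3, 2], [1, 2, 3, 3], [2, 2, 3, 2]]
print(round(cronbach_alpha(scores), 2))
```

Identical rating vectors yield κ = 1.00 (as seen for the DVT GRS), while agreement no better than chance yields κ ≈ 0.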

Conclusions: The findings offer preliminary evidence for the validity of competency assessment tools for 2 applications of critical care ultrasound and data on live versus video raters.

Conflict of interest statement

Conflict of interest: The authors declare they have no competing interests.
