Use of an Objective Structured Clinical Examination (OSCE) to measure improvement in clinical competence during the surgical internship

D A Sloan et al. Surgery. 1993 Aug;114(2):343-50; discussion 350-1.

PMID: 8342135

Abstract

Background: Traditional ward ratings and multiple-choice tests do not reliably assess clinical competence. This study determined the reliability of the Objective Structured Clinical Examination (OSCE) and its sensitivity in detecting the performance gains and deficits in surgical interns.

Methods: A comprehensive 35-station OSCE was administered to 23 incoming interns and seven outgoing interns. The OSCE comprised 17 two-part clinical problems, relying primarily on actual or simulated patients. The reliability of the examination was assessed by coefficient alpha. Significant differences in performance between the two intern groups, between parts A and B, and among the 17 problems were determined by a three-way ANOVA: OSCE performance was also correlated with National Board of Medical Examiners Part II scores.
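For reference, coefficient alpha (Cronbach's alpha) is the standard internal-consistency index computed from item variances and the total-score variance; a general statement of the formula (with OSCE station scores playing the role of items, not drawn from the study's own data) is

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{i}}{\sigma^{2}_{\mathrm{total}}}\right)

where k is the number of items, \sigma^{2}_{i} is the variance of scores on item i across examinees, and \sigma^{2}_{\mathrm{total}} is the variance of examinees' total scores.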

Results: The reliabilities of part A, part B, and parts A and B combined were 0.72, 0.70, and 0.82, respectively. Overall, the outgoing interns performed significantly better than the incoming interns: mean OSCE score 58% ± 1% versus 47% ± 1% (p = 0.0001). The 17 clinical problems differed significantly in difficulty; major performance deficits were seen in both groups of trainees. The correlation of OSCE scores with National Board of Medical Examiners Part II scores was not significant (r = 0.11, p = 0.633).

Conclusions: We conclude that the OSCE is an innovative, reliable tool for evaluating resident competence. Although outgoing interns performed better than did incoming interns, the OSCE scores clearly indicated major performance deficits in all interns.
