J Grad Med Educ. 2017 Dec;9(6):697-705.
doi: 10.4300/JGME-D-17-00322.1.

Crowdsourcing in Surgical Skills Acquisition: A Developing Technology in Surgical Education

Jessica C Dai et al. J Grad Med Educ. 2017 Dec.

Abstract

Background: The application of crowdsourcing to surgical education is a recent phenomenon that has emerged amid increasing demands on surgical residency training. The efficacy, range, and scope of this technology for surgical education remain incompletely defined.

Objective: A systematic review was performed using the PubMed database of English-language literature on crowdsourced evaluation of surgical technical tasks up to April 2017.

Methods: Articles were screened against eligibility criteria for inclusion and assessed for quality using the Medical Education Research Study Quality Instrument (MERSQI). Study information, performance task, subjects, evaluative standards, crowdworker compensation, time to response, and correlation between crowd and expert or standard evaluations were abstracted and analyzed.

Results: Of 63 unique publications initially identified, 13 with MERSQI scores ranging from 10 to 13 (mean = 11.85) were included in the review. Overall, crowd and expert evaluations demonstrated good to excellent correlation across a wide range of tasks (Pearson's coefficient 0.59-0.95, Cronbach's alpha 0.32-0.92); the 1 exception was a study involving medical students. There was a wide range of reported interrater variability among experts. Nonexpert evaluation was consistently quicker than expert evaluation (4.8 to 150.9 times faster) and was more cost effective.

Conclusions: Crowdsourced feedback appears to be comparable to expert feedback and is cost effective and efficient. Further work is needed to increase consistency in expert evaluations, to explore sources of discrepant assessments between surgeons and crowds, and to identify optimal populations and novel applications for this technology.


Conflict of interest statement

Conflict of interest: Dr. Lendvay is co-founder and chief medical officer of C-SATS Inc, a commercially available platform for surgical skills improvement based on the technology described within this review.

Figures

Figure 1. Flow Diagram for Study Inclusion and Exclusion in Systematic Review
Figure 2. Summary of Crowd and Expert Correlations of Procedural Performance Across Included Studies

