Crowd-sourced assessment of technical skills: an opportunity for improvement in the assessment of laparoscopic surgical skills
- PMID: 26709011
- DOI: 10.1016/j.amjsurg.2015.09.005
Abstract
Background: Objective, unbiased assessment of surgical skills remains a challenge in surgical education. We sought to evaluate the feasibility and reliability of Crowd-Sourced Assessment of Technical Skills.
Methods: Seven volunteer general surgery interns were given time for training and then tested on laparoscopic peg transfer, precision cutting, and intracorporeal knot-tying. Six faculty experts (FEs) and 203 Amazon.com Mechanical Turk crowd workers (CWs) evaluated 21 deidentified video clips using the validated Global Objective Assessment of Laparoscopic Skills (GOALS) rating instrument.
Results: We received 662 eligible ratings from 203 CWs within 19 hours and 15 minutes, and 126 ratings from 6 FEs over 10 days. FE video ratings showed borderline internal consistency (Krippendorff's alpha = .55). FE ratings were highly correlated with CW ratings (Pearson's correlation coefficient = .78, P < .001).
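As a rough illustration of the two statistics reported above (not the authors' analysis code), the sketch below shows how inter-rater agreement and expert-crowd correlation of this kind could be computed in Python. The rating values, the per-clip crowd means, and the use of the open-source krippendorff package are assumptions for demonstration only.

```python
# Illustrative sketch only: computing Krippendorff's alpha and a Pearson
# correlation of the kind reported in the Results. All numbers are
# hypothetical placeholders, not the study data.
import numpy as np
import krippendorff                      # pip install krippendorff
from scipy.stats import pearsonr

# Rows = raters (faculty experts), columns = video clips;
# entries are ordinal GOALS-style scores, np.nan = clip not rated.
fe_ratings = np.array([
    [12, 15, 9, 20, np.nan, 14],
    [11, 16, 10, 19, 13, np.nan],
    [13, 14, 9, 21, 12, 15],
], dtype=float)

# Krippendorff's alpha for ordinal data (internal consistency of FE ratings).
alpha = krippendorff.alpha(reliability_data=fe_ratings,
                           level_of_measurement="ordinal")

# Per-clip mean scores: faculty experts vs. crowd workers (hypothetical).
fe_means = np.nanmean(fe_ratings, axis=0)
cw_means = np.array([11.5, 15.2, 9.8, 20.4, 12.9, 14.3])

r, p = pearsonr(fe_means, cw_means)
print(f"Krippendorff's alpha = {alpha:.2f}, Pearson r = {r:.2f} (p = {p:.3g})")
```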
Conclusion: We propose the use of Crowd-Sourced Assessment of Technical Skills as a reliable, basic tool to standardize the evaluation of technical skills in general surgery.
Keywords: Crowd sourced data; Psychomotor skills; Surgical skills assessment; Surgical skills education.
Copyright © 2016 Elsevier Inc. All rights reserved.
