Crowd-sourced assessment of technical skill: A valid method for discriminating basic robotic surgery skills

Lee W. White, Timothy M. Kowalewski, Rodney Lee Dockter, Bryan Comstock, Blake Hannaford, Thomas S. Lendvay

Research output: Contribution to journal › Article › peer-review


Abstract

Background: A surgeon's skill in the operating room has been shown to correlate with patients' clinical outcomes. Prompt, accurate assessment of surgical skill remains a challenge, in part because expert faculty reviewers are often unavailable. By harnessing large, readily available crowds through the Internet, rapid, accurate, and low-cost assessments may be achieved. We hypothesized that assessments provided by crowd workers correlate highly with expert surgeons' assessments.

Materials and Methods: A group of 49 surgeons from two hospitals performed two dry-laboratory robotic surgical skill assessment tasks. Performance of these tasks was video recorded and posted online for evaluation through Amazon Mechanical Turk. The surgical tasks in each video were graded by varying crowd workers (n=30) and experts (n=3) using a modified Global Evaluative Assessment of Robotic Skills (GEARS) grading tool, and the mean scores were compared using Cronbach's alpha statistic.

Results: Crowd GEARS evaluations were obtained for each video and task and compared with the GEARS ratings from the expert surgeons. The crowd-based performance scores agreed with the expert assessments, with Cronbach's alpha values of 0.84 and 0.92 for the two tasks, respectively.

Conclusion: Crowd workers' assessments of basic robotic surgical dry-laboratory tasks showed a high degree of agreement with the scores provided by expert surgeons. Crowd responses also cost less and were much faster to acquire. This study provides evidence that crowds may offer an adjunctive method for rapidly delivering skills feedback to training and practicing surgeons.
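The agreement statistic reported in the abstract, Cronbach's alpha, measures internal consistency across rating columns. Below is a minimal illustrative sketch of that computation in Python; the score matrix, its column layout (mean crowd GEARS score versus mean expert GEARS score per video), and all numeric values are synthetic assumptions for demonstration only, not data or code from the study.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (videos x rating-columns) score matrix.

    alpha = k/(k-1) * (1 - sum(per-column variances) / variance(row sums))
    """
    k = scores.shape[1]                          # number of rating columns
    item_vars = scores.var(axis=0, ddof=1)       # variance of each column
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of per-video totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: mean crowd GEARS score and mean expert GEARS score
# for five videos (placeholder numbers, not study data).
scores = np.array([
    [18.2, 19.0],
    [14.5, 15.0],
    [21.0, 20.5],
    [12.3, 13.1],
    [16.8, 17.2],
])
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")
```

A value near 1 indicates that the rating columns rank and score the videos consistently, which is how alpha values of 0.84 and 0.92 support the agreement claim in the abstract.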

Original language: English (US)
Pages (from-to): 1295-1301
Number of pages: 7
Journal: Journal of Endourology
Volume: 29
Issue number: 11
DOIs
State: Published - Nov 2015
