Crowdsourced assessment of surgical skills: A systematic review

Rikke G Olsen*, Malthe F Genét, Lars Konge, Flemming Bjerrum

*Corresponding author of this work

Publication: Contribution to journal › Review › Research › Peer-reviewed

Abstract

INTRODUCTION: Crowdsourced assessment utilizes a large group of untrained individuals from the general population to solve tasks in the medical field. The aim of this review was to examine the correlation between crowd workers and expert surgeons in crowdsourced assessments of surgical skills.

MATERIAL AND METHODS: A systematic literature search was performed on April 14th, 2021, covering records from inception to the present. Two reviewers screened all articles against the eligibility criteria for inclusion and assessed quality using the Medical Education Research Study Quality Instrument (MERSQI) and the Newcastle-Ottawa Scale-Education (NOS-E) (Holst et al., 2015). General information was extracted from each article.

RESULTS: A total of 250 potential studies were identified, and 32 articles were included. Correlations between crowd workers and experts were generally moderate to very strong (Cronbach's alpha 0.72-0.95, Pearson's r 0.7-0.95, Spearman's rho 0.7-0.89, linear regression 0.45-0.89). Six studies found either a questionable or no significant correlation between crowd workers and experts.
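
The abstract summarizes crowd-expert agreement using standard correlation statistics. As an illustrative sketch only (not drawn from the review itself), the following Python snippet shows how Pearson's r and Spearman's rho could be computed for paired crowd and expert scores; the score data and variable names are hypothetical.

    # Hypothetical crowd and expert ratings for ten surgical performances,
    # invented purely to illustrate the agreement statistics named above.
    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    crowd_scores = np.array([3.1, 4.2, 2.8, 4.9, 3.5, 4.0, 2.5, 4.7, 3.9, 3.3])
    expert_scores = np.array([3.0, 4.5, 2.6, 4.8, 3.2, 4.1, 2.9, 4.6, 4.0, 3.1])

    r, r_p = pearsonr(crowd_scores, expert_scores)        # linear correlation
    rho, rho_p = spearmanr(crowd_scores, expert_scores)   # rank correlation

    print(f"Pearson's r = {r:.2f} (p = {r_p:.3f})")
    print(f"Spearman's rho = {rho:.2f} (p = {rho_p:.3f})")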

CONCLUSION: Crowdsourced assessment can provide accurate, rapid, cost-effective, and objective feedback across different specialties and types of surgery in dry-lab, simulation, and live settings.

Original language: English
Pages (from-to): 1229-1237
Number of pages: 9
Journal: American Journal of Surgery
Volume: 224
Issue number: 5
Early online date: 18 Jul 2022
DOI
Status: Published - Nov 2022

Bibliographical note

Copyright © 2022 Elsevier Inc. All rights reserved.
