TY - JOUR
T1 - Nonspecialist Raters Can Provide Reliable Assessments of Procedural Skills
AU - Mahmood, Oria
AU - Dagnæs, Julia
AU - Bube, Sarah
AU - Rohrsted, Malene
AU - Konge, Lars
PY - 2018/3/1
Y1 - 2018/3/1
AB - Background: Competency-based learning has become a crucial component of medical education. Despite its advantages, challenges remain. The common perception is that specialist assessment is needed to evaluate procedural skills, which is difficult owing to the limited availability of faculty time. The aim of this study was to explore the validity of assessments of video-recorded procedures performed by nonspecialist raters. Methods: This study was a blinded observational trial. Twenty-three novices (senior medical students) and 9 experienced doctors were video-recorded while each performed 2 flexible cystoscopies on patients. The recordings were anonymized, placed in random order, and then rated by 2 experienced cystoscopists (specialist raters) and 2 medical students (nonspecialist raters). Flexible cystoscopy was chosen as it is a simple procedural skill that is crucial to master in a urology residency program. Results: The internal consistency of assessments was high, Cronbach's α = 0.93 and 0.95 for nonspecialist and specialist raters, respectively (p < 0.001 for both). The interrater reliability was significant (p < 0.001), with a Pearson's correlation of 0.77 for the nonspecialists and 0.75 for the specialists. The test-retest reliability showed the largest difference between the 2 groups: 0.59 and 0.38 for the nonspecialist raters and the specialist raters, respectively (p < 0.001). Conclusion: Our study suggests that nonspecialist raters can provide reliable and valid assessments of video-recorded cystoscopies. This could make mastery learning and competency-based education more feasible.
KW - competency-based learning
KW - Medical Knowledge
KW - nonspecialist raters
KW - Practice-Based Learning and Improvement
KW - procedural skills
KW - rater competencies
KW - rater training
KW - reliable assessments
UR - http://www.scopus.com/inward/record.url?scp=85023742630&partnerID=8YFLogxK
U2 - 10.1016/j.jsurg.2017.07.003
DO - 10.1016/j.jsurg.2017.07.003
M3 - Article
C2 - 28716383
AN - SCOPUS:85023742630
VL - 75
SP - 370
EP - 376
JO - Journal of Surgical Education
JF - Journal of Surgical Education
SN - 1931-7204
IS - 2
ER -