INTRODUCTION: Asking participants to rate their own performance during unsupervised laparoscopy training may be reliable and cost-effective. The objective of this study was to explore the reliability of self-rated examinations in which participants rate their own performance and decide for themselves when they have passed tasks in basic laparoscopic skills.
METHODS: This prospective observational study was conducted at the Copenhagen Academy for Medical Education and Simulation, where simulation-based laparoscopic skills training is offered. Participants in a basic laparoscopic skills course were asked to rate their own performance and to decide when they had passed the Training and Assessment of Basic Laparoscopic Techniques test. To explore reliability, all examinations were video recorded and rated by a blinded rater after the course ended.
RESULTS: Thirty-two surgical trainees participated in the course, and 28 completed the study. Self-rated scores showed high reliability when compared with blinded ratings, with an intraclass correlation coefficient of 0.89 (P < 0.001); mean self-rated scores did not differ significantly from blinded ratings (451 vs. 455, P = 0.28), and participants neither underestimated nor overestimated their performance.
CONCLUSIONS: Ratings from self-rated examinations in a basic laparoscopic skills course are reliable, and participants neither underestimate nor overestimate their performance. Self-rated examinations may also offer a cost-effective approach to the assessment of surgical trainees.