Foell 2013


Multidisciplinary Validation Study of the da Vinci Skills Simulator: Educational Tool and Assessment Device

Kirsten Foell, Alexander Furse, R. John D’A. Honey, Kenneth T. Pace, Jason Y. Lee

Abstract

Despite the increased dexterity and precision of robotic surgery, it is, like any new surgical technology, associated with a learning curve that can impact patient outcomes. The use of surgical simulators outside of the operating room, in a low-stakes environment, has been shown to shorten such learning curves. We present a multidisciplinary validation study of a robotic surgery simulator, the da Vinci® Skills Simulator (dVSS). Trainees and attending faculty from the University of Toronto, Departments of Surgery and Obstetrics and Gynecology (ObGyn), were recruited to participate in this validation study.

All participants completed seven different exercises on the dVSS (Camera Targeting 1, Peg Board 1, Peg Board 2, Ring Walk 2, Match Board 1, Thread the Rings, Suture Sponge 1) and, using the da Vinci S Robot (dVR), completed two standardized skill tasks (Ring Transfer, Needle Passing). Participants were categorized as novice robotic surgeons (NRS) or experienced robotic surgeons (ERS) based on the number of robotic cases performed. Statistical analysis was conducted using an independent t-test and the non-parametric Spearman's correlation. A total of 53 participants were included in the study: 27 urology, 13 ObGyn, and 13 thoracic surgery (Table 1).
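For readers unfamiliar with these tests, the minimal sketch below (not taken from the study; all values and variable names are hypothetical) illustrates how an independent t-test comparing NRS and ERS scores, and a Spearman correlation between simulator scores and robot task performance, can be computed with SciPy's stats module.

```python
# Illustrative sketch only; the data below are hypothetical, not study results.
from scipy import stats

# Hypothetical overall scores (%) on one dVSS exercise
nrs_scores = [62, 70, 58, 65, 71, 60, 68]   # novice robotic surgeons
ers_scores = [88, 92, 85, 90, 87]           # experienced robotic surgeons

# Independent t-test comparing the two groups
t_stat, p_ttest = stats.ttest_ind(nrs_scores, ers_scores)
print(f"Independent t-test: t = {t_stat:.2f}, p = {p_ttest:.4f}")

# Hypothetical paired measurements: dVSS exercise score vs. dVR task time (s)
dvss_scores = [62, 70, 88, 92, 58, 85, 71]
dvr_times   = [410, 380, 250, 230, 450, 260, 360]

# Non-parametric Spearman rank correlation between the two measures
rho, p_spearman = stats.spearmanr(dvss_scores, dvr_times)
print(f"Spearman correlation: rho = {rho:.2f}, p = {p_spearman:.4f}")
```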

Most participants (89 %) either had no prior console experience or had performed <10 robotic cases, while one (2 %) had performed 10–20 cases and five (9 %) had performed ≥20 robotic surgeries. The dVSS demonstrated excellent face and content validity, with 97 % and 86 % of participants agreeing that it was useful for residency training and post-graduate training, respectively. The dVSS also demonstrated construct validity, with NRS performing significantly worse than ERS on most exercises with respect to overall score, time to completion, economy of motion, and errors (Table 2).

Excellent concurrent validity was also demonstrated, as dVSS scores for most exercises correlated with performance of the two standardized skill tasks performed on the dVR (Table 3).

This multidisciplinary validation study of the dVSS provides excellent face, content, construct, and concurrent validity evidence, supporting its integration into a comprehensive robotic surgery training program, both as an educational tool and potentially as an assessment device.
