Validation Study of a Virtual Reality Robotic Simulator—Role as an Assessment Tool?
Jason Y. Lee, Phillip Mucksavage, David C. Kerbl, Victor B. Huynh, Mohamed Etafy, Elspeth M. McDougall
Purpose
Virtual reality simulators are often used for surgical skill training because they facilitate deliberate practice in a controlled, low-stakes environment. However, before a simulator can be considered for assessment purposes, rigorous construct and criterion validity must be demonstrated. We performed face, content, construct and concurrent validity testing of the dV-Trainer™ robotic surgical simulator.
Materials and Methods
Urology residents, fellows and attending surgeons were enrolled in this institutional review board approved study. After a brief introduction to the dV-Trainer, each subject completed 3 repetitions of each of 4 virtual reality tasks: pegboard ring transfer, matchboard object transfer, needle threading of rings, and the ring and rail task. One week later subjects completed 4 similar tasks on the da Vinci® robot. Subjects were assessed on total task time and total errors, scored by the built-in algorithm on the dV-Trainer and manually on the da Vinci robot.
Results
Seven experienced and 13 novice robotic surgeons were included in the study. Experienced surgeons were defined as those with greater than 50 hours of clinical robotic console time. Of the novice robotic surgeons 77% rated the dV-Trainer as a realistic training platform, and 71% of experienced robotic surgeons rated it as useful for resident training. Experienced robotic surgeons outperformed novices on many dV-Trainer and da Vinci robot exercises, particularly in the number of errors. On pooled data analysis dV-Trainer total task time and total errors correlated with da Vinci robot total task time and total errors (p = 0.026 and 0.011, respectively).
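The concurrent validity claim rests on a subject-level correlation between simulator metrics and robot metrics. As a minimal sketch of that kind of pooled analysis (the abstract does not report its raw scores or name the exact statistical test, so the per-subject values and the choice of Pearson's r below are illustrative assumptions, not study data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-subject total task times in seconds (placeholder data).
dv_trainer_time = [310, 280, 450, 390, 520, 300, 410]
da_vinci_time = [295, 260, 470, 405, 500, 310, 430]

print(f"r = {pearson_r(dv_trainer_time, da_vinci_time):.2f}")
```

Deriving significance levels like the reported p values of 0.026 and 0.011 would additionally require testing r against its sampling distribution, for example with `scipy.stats.pearsonr`.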
Conclusions
This study confirms the face, content, construct and concurrent validity of the dV-Trainer, supporting its potential role as an assessment tool.