There are many aspects of a training simulator to consider when making the initial investment in simulation training. For robotic surgery, we believe the top factors to consider are:
- Validation studies conducted on and using the simulator
- Fidelity of the controllers
- Accessibility of the simulator
- Data, data, data!
Since Mimic launched its first version of the dV-Trainer in 2007, there has been a growing number of new robotic surgery simulators entering the market. The real impetus for simulation training was made clear in 2010 when Intuitive Surgical decided to launch their own Skills Simulator, a backpack-like addition for the da Vinci® Si platform.
Intuitive Surgical chose to license 27 exercises that Mimic had already developed, or was in the process of developing, specifically for ISI. This was made possible by the new design of the system, which allowed the console to operate independently of the patient-side cart and core. Since 2010, both the ROSS Simulator from Simulated Surgical Systems and the Robotix Mentor from Simbionix (now 3D Systems) have entered the playing field.
The installed base of da Vinci® surgical systems is now over 3,500 systems around the world and close to 2,000 simulators have been installed and used to support this installed base. The majority of training simulators are da Vinci® Skill simulators (with Mimic’s licensed software) and close to 12% of robotic surgery simulators are Mimic’s dV-Trainers.
Our estimate is that over 70% of institutions performing robotic surgery have access to a simulator of some form or another and that close to 90% of robotic surgeons will at some point have tried a simulator. In fact, since 2007 we believe that between the dV-Trainer and the da Vinci® Skills Simulator over 6.25 million exercise sessions have been completed.
So has all of this simulation training activity been valuable, you may ask? One way to assess simulation training is through validation studies. There are currently five different ways of determining validity. Starting with the basics (Face, Content, and Construct) and moving to more valuable forms of validation such as Concurrent and Predictive, the definitions are:
- Face validity: Does the simulator have a realistic look and feel compared to the actual surgical system?
- Content validity: Is the simulator useful as a training tool for the surgical system?
- Construct validity: Can the simulator distinguish between novice and expert users?
- Concurrent validity: How does the simulator compare to a similar or related exercise (dry lab, tissue lab, etc.) carried out on the real robotic surgical system?
- Predictive validity: Can the simulator be used to predict actual performance in the O.R.?
Face and Content validity are of relatively low value because they are subjective; the most highly valued validation studies demonstrate Construct and Predictive validity. The table below shows the number of papers that have been published on the various types of validation. As you can see, over 30 papers have been published on Mimic software, either on the dV-Trainer or on the da Vinci® Skills Simulator platform.
Recently, simulation was a large part of the discussion at the FDA town hall meeting in Washington. Roger Smith from Florida Hospital presented a comparison of the different simulators (the table above is adapted from his presentation). The data made clear that most of the research on these simulators has focused on the controllers and how closely they emulate the real robotic surgeon's console. Obviously, the da Vinci® Skills Simulator, which uses the real console, is the real thing. For the other simulators, however, this is where concurrent validity becomes extremely important, as you are essentially replicating, on the simulator, the same activity a surgeon would perform on the real robotic surgical system.
A direct head-to-head study between Mimic's dV-Trainer and the da Vinci® Skills Simulator was conducted by Prof. Jacques Hubert and his team at the STAN Institute in Nancy, France. During the study, participants completed the same exercises on both systems, and the researchers found on average only a 3% difference in overall score between the two (89.9% vs. 86.8%). This varied by the type of exercise but remained consistent with some internal benchmarking carried out by Mimic. No studies of the same extent have been done on the ROSS and Robotix Mentor systems.
Another component to take into consideration when choosing a robotic surgery simulator is the accessibility of the system. While the great thing about the da Vinci® Skills Simulator is that it uses the real console, this can also be a significant drawback. Very few hospitals can afford to have a dedicated console outside the OR used purely for training and simulation. If an institution is lucky enough to have a dual-console system, the simulator will run on the second console, but that console is still kept in the OR. The value of the second console is in allowing programs with residents to keep training new surgeons without interrupting the flow and efficiency of the OR. Data shows that simulation systems in the OR are used less than systems outside the OR. This is due to the simple fact that as robotic programs become more successful and utilization increases, there is just not enough time for training.
All things considered, any learning experience is only as good as the objectives and goals that are being set for the student and how well they are being tracked. The MScore system allows tailored pass marks, proficiency levels and curricula to be set for the students based on their learning objectives. A multitude of metrics and data can be reviewed to allow a student to learn from their mistakes and improve their psychomotor skills.
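To make the idea of tailored pass marks concrete, here is a minimal sketch of proficiency checking against instructor-set thresholds. This is purely illustrative: the function, metric names, and thresholds are hypothetical and do not reflect the actual MScore API or its scoring formulas.

```python
# Hypothetical sketch of proficiency-based scoring. The names below are
# invented for illustration and are NOT taken from the MScore system.

def meets_proficiency(metrics: dict, pass_marks: dict) -> bool:
    """Return True if every tracked metric meets its tailored pass mark.

    `metrics` maps metric names to a trainee's scores (0-100);
    `pass_marks` maps the same names to instructor-set thresholds.
    A missing metric counts as a score of 0.
    """
    return all(metrics.get(name, 0) >= mark
               for name, mark in pass_marks.items())

# Example: an instructor sets a stricter mark for economy of motion
# than for time-to-completion (all values hypothetical).
pass_marks = {"overall_score": 80, "economy_of_motion": 85, "time": 70}
trainee = {"overall_score": 89, "economy_of_motion": 83, "time": 95}

meets_proficiency(trainee, pass_marks)  # False: 83 < 85 on one metric
```

The point of the sketch is the design choice it mirrors: pass/fail is decided per metric against configurable thresholds, so instructors can tighten or relax individual skills to match their learning objectives rather than relying on a single aggregate score.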
So when looking for a simulator, make sure to find one that is validated, has high-fidelity controllers, can be accessed 24/7 outside the OR, and has a flexible management and scoring system that can be tailored to meet your learning objectives. In the Tanaka study referred to in Roger Smith's presentation at the FDA meeting, an observation was made that while the majority of study participants preferred the usability of the da Vinci® Skills Simulator, 70% felt the dV-Trainer was the best value for the money when taking all things into consideration.