Using simulators to enhance resident training is a strategy that many orthopaedic programs are investigating. But do skills learned on a simulator transfer to actual surgery? A recent study published in The Journal of Bone & Joint Surgery (Nov. 5) seems to show that they do.
The study compared actual arthroscopic surgical skills between two groups of orthopaedic residents—one that underwent training on an arthroscopic knee simulator and a second that had conventional surgical instruction. According to author W. Dilworth Cannon, MD, and colleagues, the residents trained with the simulator performed significantly better in key measurement categories than those in the control group.
The blinded study involved 48 PGY-3 residents from seven academic institutions; each participating institution had at least 6 PGY-3 residents. Participating residents at each site were randomized to either the simulator or standard group.
Residents in both groups had similar prior experience with arthroscopic surgery and continued their institution-specific education and training during the study. Both groups watched the same 15-minute video depicting a diagnostic knee arthroscopy procedure on a live patient and received both a handbook describing the procedure and a list detailing the procedural tasks.
The simulator used was the ArthroSim (Touch of Life Technologies [ToLTech], Aurora, Colo.) high-fidelity virtual reality arthroscopic knee simulator (Fig. 1). Residents trained on the simulator were required to meet the following criteria prior to performing an actual surgery:
- achieve a 100 percent score on each visualization and procedure task before proceeding to the next task
- complete a set of tasks within 150 percent of the average time taken by experienced surgeons to complete the same set of tasks
- achieve a proficiency score equal to at least 83 percent of the average score of community-based surgeons performing the same set of tasks
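The three benchmarks above amount to a simple gating check. The sketch below is purely illustrative: the function and variable names are assumptions for clarity, not part of the ArthroSim software or the study protocol.

```python
def meets_benchmarks(task_scores, trainee_time, expert_avg_time,
                     trainee_proficiency, community_avg_proficiency):
    """Illustrative check of the study's three proficiency criteria."""
    # 1. Every visualization/probing task must be scored 100 percent.
    all_tasks_perfect = all(score == 100 for score in task_scores)
    # 2. Total time must fall within 150 percent of the experienced-surgeon average.
    time_ok = trainee_time <= 1.5 * expert_avg_time
    # 3. Proficiency must reach at least 83 percent of the community-surgeon average.
    proficiency_ok = trainee_proficiency >= 0.83 * community_avg_proficiency
    return all_tasks_perfect and time_ok and proficiency_ok

# Example: perfect task scores, 20 minutes against a 15-minute expert average,
# and a proficiency score of 85 against a community average of 100.
print(meets_benchmarks([100, 100, 100], 20, 15, 85, 100))  # True
```

All three conditions had to hold before a simulator-trained resident could move on to a live patient.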
Within 2 weeks of meeting these goals, simulator-trained residents performed a diagnostic knee arthroscopy procedure on a live patient under the supervision of an attending surgeon. Residents in the control group could perform the procedure on live patients once a member of the simulator group completed the training phase.
The simulator resided at each institution for an average of 5.5 months. Simulator-trained residents took an average of 11 hours to reach or exceed the final benchmark of 83 percent proficiency.
In the OR
Attending surgeons were blinded as to the resident group. Residents were given 25 minutes to complete the surgery; if they exceeded that time, the attending surgeon took over and a score of zero was given.
All procedures were video-recorded for evaluation and scoring.
Two checklists were used, one for visualization and one for probing tasks. Scoring was weighted 1-to-2 in favor of probing, “because we felt that probing, which involves triangulation skills, is a more difficult procedure to learn than just visualization of anatomic structures,” Dr. Cannon said. In addition, performance was subject to evaluation by seven “global impression items,” covering areas such as degree of iteration of movement, how deftly and gently the scope or probe was used, and skill at positioning the tip of the arthroscope.
In each procedure, the attending surgeon initiated the standard two anterior portals and inserted the arthroscope into the suprapatellar pouch. Then the resident performed the visualization portion and probed the essential anatomic structures. The number of interventions needed from the attending surgeon was documented; if three or more interventions were required, the resident received a score of zero.
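The 1-to-2 weighting and the two zero-score conditions described above can be combined into a single composite score. The formula below is a hedged sketch, not the study's actual scoring instrument; only the weighting ratio, the 25-minute limit, and the three-intervention cutoff come from the article.

```python
def procedure_score(visualization, probing, minutes, interventions):
    """Illustrative composite score based on the rules reported in the article.
    The exact combination formula is an assumption, not the study's rubric."""
    # Exceeding 25 minutes, or requiring three or more attending
    # interventions, resulted in a score of zero.
    if minutes > 25 or interventions >= 3:
        return 0.0
    # Probing was weighted twice as heavily as visualization (1-to-2).
    return (1 * visualization + 2 * probing) / 3

# Example: visualization 90, probing 60, finished in 16 minutes
# with one intervention.
print(procedure_score(90, 60, 16, 1))  # 70.0
```

The weighting means a resident's probing performance, which demands triangulation skills, dominates the final score, which is consistent with the group difference the authors report being driven by the probing section.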
Overall, the simulator-trained group performed significantly better (P = 0.031) than the control group. The authors report that this difference was predominantly the product of better performance in the probing section (P = 0.016) rather than the visualization section (P = 0.34).
Surgical time was not significantly different between the two groups (16.2 minutes for simulator-trained residents and 15.5 minutes for conventionally trained residents). An average of one intervention by the attending surgeon per case was recorded for both groups. Both groups also received the same score on the number of tasks performed out of order.
“This result provides evidence that the control group indeed studied the video of the diagnostic arthroscopy and read the write-up of the procedure, and it dispels criticism that the experimental group, by training for an average of 11 hours, learned the order of the tasks of the procedure better than the control group did,” the authors write.
Although some arguments in favor of simulator training cite video game experience, particularly as a marker of hand-eye coordination, these researchers found no such evidence. “Residents in the study were not serious video game players, and there was no statistical correlation with their surgical performance,” they report. Additionally, 87 percent of residents rated their hand-eye coordination skills as average or better than average.
Transfer of training—validated
“We have demonstrated transfer validity (transfer of training) in that third-year orthopaedic residents from seven academic institutions trained to proficiency on a high-fidelity virtual-reality arthroscopic knee simulator demonstrated a greater skill level in the operating room (OR) than non–simulator-trained residents did when assessed with a proprietary procedural checklist,” Dr. Cannon told AAOS Now.
“This paper could be the first to demonstrate ‘transfer of training’ from a simulator-based anatomic knee model to the real-world operating room. Based on the conclusions of this study,” he continued, “we think that simulation training on a realistic virtual-reality knee model to enhance knee arthroscopy skills should be part of residency training programs.”
Training with simulation, the authors write, “should result in residents making fewer surgical errors and should shorten the time needed to perform surgical procedures at academic institutions, where resident training slows the efficiency of the operating room. Proficiency-based training is likely to become a standard in the near future.”
In addition to this study’s findings on transfer validity, the rating system devised and used to evaluate performance may now be adopted by orthopaedic training programs as an objective method of evaluating resident performance.
Co-authors of the study are William E. Garrett Jr., MD, PhD; Robert E. Hunter, MD; Howard J. Sweeney, MD; Donald G. Eckhoff, MD, MS; Gregg T. Nicandri, MD; Mark R. Hutchinson, MD; Donald Johnson, MD, FRCS; Leslie J. Bisson, MD; Asheesh Bedi, MD; James A. Hill, MD; Jason L. Koh, MD; and Karl D. Reinig, PhD.
The study was funded by the AAOS, the Arthroscopy Association of North America (AANA), and the American Board of Orthopaedic Surgery (ABOS). Neither AANA nor ABOS played a role in the study itself.
Disclosure information: Dr. Cannon—ToLTech, Wolters Kluwer Health–Lippincott Williams & Wilkins, Sports Medicine and Arthroscopy Review; Dr. Hunter—Smith & Nephew, Biomet, Breg, American Orthopaedic Society for Sports Medicine (AOSSM), Arthroscopy Association of North America (AANA), Arthroscopy; Dr. Sweeney—Life Spine Inc., AANA, AAOS; Dr. Nicandri—Arthrex; Dr. Hutchinson—American Journal of Sports Medicine, British Journal of Sports Medicine, The Physician and Sports Medicine, AAOS, American Board of Orthopaedic Surgery, American College of Sports Medicine; Dr. Johnson—Arthrex, Wolters Kluwer Health–Lippincott Williams & Wilkins; Dr. Bisson—KFx Medical, Arthrex, Cayenne Medical, Operative Techniques in Sports Medicine, Journal of Sports Medicine; Dr. Bedi—A3 Surgical, Smith & Nephew, Journal of Shoulder and Elbow Surgery, AOSSM; Dr. Hill—J. Robert Gladden Society; Dr. Koh—Aesculap/B. Braun, Aperion, Arthrex, AOSSM, AANA, Illinois Association of Orthopaedic Surgeons, Patellofemoral Foundation; Dr. Reinig—ToLTech; Dr. Eckhoff—no conflicts; Dr. Garrett—no information.
Terry Stanton is a senior science writer for AAOS Now. He can be reached at email@example.com
- This randomized, blinded study involved 48 PGY-3 residents at seven institutions who trained for knee arthroscopy either on a virtual-reality simulator or conventionally.
- Overall, the simulator-trained group performed significantly better on a live patient than the control group, with the predominant difference occurring in the probing section of the evaluation.
- The simulator group did not perform significantly better in the visualization portion, nor did it complete the procedure in significantly less time.
- The authors conclude that the transfer of training demonstrated in this study points to the value of simulator training and a rationale for its adoption in orthopaedic resident education.