News Release

Training on virtual 'patient' improves carotid angiography skills

Simulator provides objective measurement of cardiologist performance

Peer-Reviewed Publication

American College of Cardiology

(BETHESDA, MD) – Cardiologists can learn to perform risky catheter procedures such as carotid angiography on a virtual patient simulator, rather than on real patients, according to a new study in the May 2, 2006, issue of the Journal of the American College of Cardiology.

"Virtual reality simulation technology has advanced to the point where we can actually use a virtual environment and have the trainee learn in a very 'patient-safe' way in a virtual patient environment and make mistakes on a virtual patient versus doing it on a real patient," said Christopher U. Cates, M.D., F.A.C.C., F.S.C.A.I. from the Emory University School of Medicine in Atlanta, Georgia.

Twenty interventional cardiologists participating in the Emory NeuroAnatomy Carotid Training program underwent an instructional course on carotid angiography and then performed five serial simulated carotid angiograms on the Vascular Interventional System Trainer (VIST) virtual reality simulator.

During the final run, compared with the first, the cardiologists committed fewer catheter-handling errors, completed the virtual procedure in less time, and exposed the virtual patient to less X-ray imaging and smaller injections of contrast agent.

Dr. Cates noted that the study looked at only one specific model of simulator, the Procedicus Vascular Interventional System Trainer (VIST) made by Mentice AB in Gothenburg, Sweden. The company was not involved in the design or funding of this study. Dr. Cates said the performance of other simulators is not necessarily the same and would have to be studied individually before being used to measure the performance of doctors.

Trainees working on the simulator handle catheter controls identical to those on the catheters used in actual procedures, but the other end of each catheter interacts with sensors that feed movement data into a powerful computer. The trainees feel the "catheter" move and watch the progress of the "tip" on a monitor image much like the fluoroscopic X-ray image they would watch during an actual procedure. Cardiologists who have used the simulator say that what they see and feel is very realistic.

Carotid angiography, and the related procedure of carotid stenting, is technically challenging. It involves threading thin catheters through blood vessels into a carotid artery in the neck, and it is performed by a relatively small number of cardiologists, in part because of its inherent risks.

"In carotid angiography, where we are introducing catheters into the blood vessels that feed the brain, if a little piece of material breaks off or you do it incorrectly and knock a piece of blood clot or atherosclerotic plaque off the artery while you are putting the catheter in, it goes downstream and goes to the brain and causes a stroke. And that's a devastating event," Dr. Cates said.

Previously, practitioners learning new catheter procedures practiced on animals, cadavers or mechanical models and then were supervised as they worked on their first live patients. The researchers are currently doing studies to see if the patients of practitioners trained on this simulator have better clinical outcomes. But the researchers say one advantage of simulator training is already apparent. The progress of trainees (their "learning curve") is tracked objectively, so evaluators don't have to rely on the subjective reports of an instructor.

This study is the first to actually measure the "learning curve" of doctors learning a new procedure, documenting the increasing proficiency and declining error rates of individual trainees.

"And so we now have some objectivity in how doctors are doing in their training versus the subjectivity of a mentor looking over the shoulder of a trainee doing the procedure and saying, 'I think he's doing a pretty good job,'" Dr. Cates said. "We can actually measure the doctor's performance doing the fine tasks in a simulator on a virtual patient and measure his task achievement against a benchmark, say of somebody who is expert in that technique, and show that he can reach that level of proficiency before he actually works on a patient for the first time. And that is a historic breakthrough in medicine."

Dr. Cates predicted that simulator training will become as routine in medicine as it already is in the airline industry and other fields.

"What we are seeing is a paradigm shift in the way we train physicians in procedural-based medicine, from looking over the shoulder of a doctor working on a real patient to where we are going to be able to measure the trainee's learning curve in a virtual environment and a 'patient-safe' environment, and make sure the doctor has reached a level of competence before he then works on his first patient," Dr. Cates said.

William A. Gray, M.D., from Columbia University in New York, New York, who was not connected with this study, said that although simulation is increasingly used in medical training, there is still a need for basic validation studies to demonstrate the repeatability and reliability of results.

"This article is a nice building block in the larger construct of validating simulation. The investigators have successfully demonstrated that physicians can not only be trained on a simulator; but also that the effects of training can be measured on a simulator through various metrics, including contrast volume, fluoroscope time, procedure time and so on. That is an important first step in the process of validation of simulation," Dr. Gray said.

Dr. Gray added that studies are needed to determine whether simulators can distinguish between expert and novice practitioners. He also said it would be useful to know whether simulators can identify the specific strengths and weaknesses of individual physicians, in order to tailor further training and education.

###

Disclosure Box

Sources quoted in this news release do not report any potential conflicts of interest regarding this topic.
