Engineers from Rensselaer Polytechnic Institute, in cooperation with Harvard Medical School, Albany Medical Center, and the Massachusetts Institute of Technology, are developing a novel surgical simulator for training surgeons in minimally invasive surgery. The haptics (i.e., the sense of touch) in the simulator are based on a computational algorithm called the Point-Associated Finite Field:
“The sense of touch plays a fundamental role in the performance of a surgeon,” De said. “This is not a video game. People’s lives are at stake, so when training surgeons, you better be doing it well.” [Suvranu De is an assistant professor of mechanical, aerospace, and nuclear engineering and director of the Advanced Computational Research Lab at Rensselaer –ed.]
In a paper published in the June/July issue of the journal Presence, the researchers describe their new computational technique, and beginning in the summer of 2006 the work will be supported by a $1.4 million, four-year grant from the National Institutes of Health (NIH)…
Surgical simulators — even more than flight simulators — are based on intense computation. To program the realism of touch feedback from a surgical probe navigating through soft tissue, the researchers must develop efficient computer models that perform 30 times faster than real-time graphics, solving complex sets of partial differential equations about a thousand times a second, De said.
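The arithmetic behind that demand can be made concrete. The sketch below is illustrative only (the rates are the commonly cited targets mentioned in the article, not figures from the Rensselaer code): graphics typically refreshes around 30 times a second, while stable force feedback needs roughly 1,000 solver updates a second, leaving about one millisecond per physics solve.

```python
# Illustrative timing budget for a haptic simulator (not the actual
# Rensselaer implementation). Rates are the approximate targets the
# article describes: ~30 Hz graphics vs. ~1000 Hz haptics.
GRAPHICS_HZ = 30    # typical real-time graphics refresh rate
HAPTICS_HZ = 1000   # force feedback: solve the tissue PDEs ~1000x/s

graphics_budget_ms = 1000 / GRAPHICS_HZ   # ~33.3 ms per video frame
haptics_budget_ms = 1000 / HAPTICS_HZ     # 1 ms per force update

speedup_needed = HAPTICS_HZ / GRAPHICS_HZ  # roughly the "30 times faster"
print(f"{graphics_budget_ms:.1f} ms/frame vs {haptics_budget_ms:.1f} ms/update")
print(f"haptics must run ~{speedup_needed:.0f}x faster than graphics")
```

In other words, every force sent to the surgeon's hand must come from a full solve of the tissue model completed within about one millisecond.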
The major challenge to current technologies is the simulation of soft biological tissues, according to De. Such tissues are heterogeneous and viscoelastic, meaning they exhibit characteristics of both solids and liquids — similar to chewing gum or silly putty. And surgical procedures such as cutting and cauterizing are almost impossible to simulate with traditional techniques.
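A standard textbook way to capture that solid-plus-liquid behavior (not necessarily the formulation used in PAFF) is the Kelvin-Voigt model: a spring in parallel with a dashpot, so stress depends on both strain and strain rate. The constants below are arbitrary illustrative values, not measured tissue properties.

```python
# Minimal Kelvin-Voigt viscoelastic element: a spring (elastic, solid-like)
# in parallel with a dashpot (viscous, fluid-like).
# E and eta are arbitrary illustrative constants, not real tissue data.
def kelvin_voigt_stress(strain, strain_rate, E=5.0e3, eta=50.0):
    """sigma = E*eps + eta*(deps/dt), with E in Pa and eta in Pa*s."""
    return E * strain + eta * strain_rate

# The same 10% stretch produces different stress depending on how fast
# it is applied - the hallmark of a viscoelastic material.
print(kelvin_voigt_stress(0.1, 0.0))   # stretched and held: 500.0 Pa
print(kelvin_voigt_stress(0.1, 0.2))   # stretched quickly: 510.0 Pa
```

The rate-dependent term is what makes chewing gum feel stiffer when pulled fast and softer when pulled slowly, and it is this kind of behavior that simple elastic models miss.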
To overcome these barriers, De’s group has developed a new computational tool called the Point-Associated Finite Field (PAFF) approach, which models human tissue as a collection of particles with distinct, overlapping zones of influence that produce coordinated, elastic movements. Each particle is modeled as a single point in space, and its relationship to nearby points is governed by the equations of physics. The localized points migrate along with the tip of the virtual instrument, much like a roving swarm of bees.
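The particle-and-influence-zone idea described above can be sketched loosely as follows. This is a toy illustration, not the published PAFF equations: the Gaussian weight, the cutoff, and all of the names here are assumptions made for the example. The key point it shows is that only a small, roving subset of points near the tool tip needs updating at each step.

```python
import math

# Toy sketch of the particle idea: tissue points carry overlapping
# "zones of influence", and only the local swarm near the virtual tool
# tip is updated each step. Kernel, names, and values are illustrative
# assumptions, not the actual PAFF formulation.
def influence_weight(particle, tool_tip, radius):
    """Smooth weight that decays with distance from the tool tip."""
    d2 = sum((p - t) ** 2 for p, t in zip(particle, tool_tip))
    return math.exp(-d2 / (2 * radius ** 2))

def active_swarm(particles, tool_tip, radius, cutoff=0.05):
    """Select the roving subset of particles with significant weight."""
    return [p for p in particles
            if influence_weight(p, tool_tip, radius) > cutoff]

particles = [(x * 0.1, 0.0, 0.0) for x in range(50)]  # a line of tissue points
tip = (1.0, 0.0, 0.0)                                 # virtual instrument tip
swarm = active_swarm(particles, tip, radius=0.3)
print(len(swarm), "of", len(particles), "particles updated this step")
```

Because the expensive physics is confined to this small moving neighborhood, the cost per time step stays bounded no matter how large the organ model is, which is what makes the millisecond update budget plausible.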
This method enables the program to rapidly perform hundreds of thousands of calculations for real-time touch feedback, making it superior to other approaches, according to the researchers. “Our approach is physics-based,” De said. “The technologies that are currently available for surgical simulation are mostly graphical renderings of organs, and surgeons are not very happy with them.” And the same physics-based technology can be used to model blood flow and the generation of smoke during cauterization, which is often used to burn tissue and stop hemorrhaging.
The researchers are currently using video images of actual surgical procedures to enhance the visual realism of their computer-generated scenarios, and they are performing experiments on human cadavers to evaluate the mechanical properties of human organs. These experiments are taking place at Albany Medical Center in collaboration with Tejinder Paul Singh and Leon Martino, and also at Connecticut-based U.S. Surgical, a manufacturer of wound closure products and advanced surgical devices.