Just a Touch

The new Center for Scientific Visualization will help Assistant Professor Caroline Cao develop more effective surgical simulation tools.


Caroline Cao, assistant professor of mechanical engineering, is investigating methods of simulating minimally invasive surgery that combine visualization with haptics, the sense of touch, to incorporate force feedback into surgical skills training. Her focus is haptics and visualization for laparoscopic surgery, a minimally invasive procedure performed with a video camera and specialized surgical instruments. Currently, says Cao, most surgical simulators do not offer realistic force feedback.


Multimedia: Senior Kyle Maxwell demonstrates how the Center for Scientific Visualization supports the virtual reality surgical simulation he is working on with Assistant Professor Caroline Cao.


"In teleoperation, force feedback or haptic feedback is very important," says Cao, who is the director of the Human Factors program in the School of Engineering. "Otherwise you don't feel what it is that you're dealing with. You end up either colliding into your targets or you don't know how to control the forces in order to manipulate your targets."

In a real-life scenario, that could result in damaged tissue or other surgical complications. With an overall increased focus on patient safety, there is greater pressure to reduce surgical errors. This is where realistic simulations could help save lives and money down the road.

"If you can train the surgeons to perform using limited haptic feedback, as in current minimally invasive surgery, then the skills can transfer to real life," she says. "Better yet, we want to be able to create a realistic scenario with realistic tissue interaction forces. So we need to know what kind of forces are being applied in laparoscopic surgery."

Cao's simulation uses a haptic device in conjunction with the resources at the Center for Scientific Visualization. "The VisWall will help us study how surgeons use the force information we provide, and how best to design the feedback so that it's actually useful for training purposes," says Cao.

Her work is focused on creating a virtual reality simulation of the surgical environment.

Senior computer science major Kyle Maxwell has been working with Cao to develop deformable 3D graphical objects for use in the simulation. In developing his deformable models, Maxwell has been working with Professor of Computer Science Sarah Frisken, whose area of expertise is computer graphics and modeling.


"It's pretty complicated to model how a soft body interacts with something, so I'm looking at different approaches for modeling deformations, and then trying to create a realistic scene that will accurately model laparoscopic surgery," explains Maxwell.

In the simulation developed by Cao and Maxwell, the user receives the appropriate feedback when using a haptic device to touch a hard surface, such as a tumor or bone, or a soft, deformable surface, such as tissue. The reaction is determined by the parameters of the model, which is based on experimental data. In the 3D environment, the virtual surgical tool can move left and right as well as back and forth, and it can rotate.

"What's happening is the tool is telling the computer where it is, and then the computer says, you're hitting the floor, and then in the tool, there's a motor that freezes up and resists you so you can't go through the floor," explains Maxwell.

The response of the simulation can be customized depending on which tool the surgeon is using, whether it is a simple probe or an electrocautery device.

Computational processing speed can affect the quality of the simulation, which uses data gathered on tissue response to simulate a realistic reaction. Perfect accuracy is difficult to achieve given the many variables involved, but projects such as this provide a starting point and an opportunity for experimentation and refinement. And the more data, the better.

"The difficult thing is getting this at a high enough resolution to be accurately deformable, so that's where kind of the algorithms and computer science comes in, in terms of getting it to run in real time," says Maxwell. "The equation is based on the material properties of the object, so you can actually adjust the stress and the strain response to get things to feel differently."

The work, which is funded by a Research Experiences for Undergraduates (REU) supplement to Cao's CAREER Award from the National Science Foundation, is aided by the VisWall.

"With the VisWall you can obviously get higher resolution, but you also get a sense of presence with the immersion, because it covers a bigger visual field."

— Caroline Cao

"With the VisWall you can obviously get higher resolution, but you also get a sense of presence with the immersion, because it covers a bigger visual field," says Cao.

Cao also sees the VisWall, in conjunction with haptics, being extended to other surgical applications, such as tracking the path of a colonoscope, giving a doctor the chance to travel virtually through the colon while performing a colonoscopy.

"It's almost like 'Space Odyssey,'" says Cao. "You're walking through the tunnel and you can feel and test and see whether you're dealing with a tumor or not. That would be cool."

Cao's research is just one of several projects bringing people together around the new VisWall.

"All the people who are thinking about using the VisWall are a community that wasn't really brought together before," says Maxwell. "So you've got all these people who are interested in visualization and you can probably have a lot more collaboration and interesting projects."


Profile written by Georgiana Cohen, Web Communications. Video by Tufts Educational Media Center.

Photos by Joanie Tobin, University Photography.

This story originally ran on Feb. 18, 2008.