VOLUME 31, NUMBER 30 THURSDAY, May 4, 2000
Reporter: Top Stories

Research goes beyond doctor's touch
UB develops virtual-reality glove allowing doctors to store what they feel in patient exam


By ELLEN GOLDBAUM
News Services Editor

A doctor's hands are two of the most important diagnostic tools he or she has, allowing the physician to detect subtle signs of disease or injury just by touching a patient.

Exercising that expertise has always required the presence of two individuals in the same physical space at the same time: doctor and patient.

Until now.

UB researchers are developing a system that will allow physicians to use a new form of virtual reality, called physically based VR, to store information about what they are feeling during an exam and to go back and review it later after the patient has gone, or to share it with consulting physicians in a remote location.

They will report on the progress of their work in July at the World Congress on Medical Physics and Biomedical Engineering in Chicago.

With this "Virtual Human Model for Medical Applications," physicians will wear a customized virtual-reality glove during the patient examination that collects data on what the physician is feeling through sensors located in the glove's fingertips. James Mayrose, doctoral candidate in the Department of Mechanical and Aerospace Engineering, a senior designer of the glove and co-investigator on the project, is carrying out studies of it with human subjects at the Erie County Medical Center.

Thenkurussi Kesavadas, assistant professor of mechanical and aerospace engineering, director of UB's VR Lab and lead investigator, explained that right now, there is no way that a physician at a second site can share in the experience without personally examining the patient. In very serious cases, particularly when a patient has been diagnosed at a small, rural hospital, for example, the patient may have to be airlifted to a more comprehensive medical facility where he or she can be examined in person.

The VR system under development at UB could make some of those costly, not to mention traumatic, airlifts unnecessary.

"Using our customized data-collection glove and the detailed understanding we are developing about the physics behind a doctor's touch during an exam, we expect within two to three years to have a device in use that will allow a physician to use medical palpation virtually and in real time," said Kesavadas.

The UB work represents a departure from the usual route taken by researchers studying VR for use in medical situations, he added.

"Just about everyone who is looking at virtual medicine right now is interested in surgical applications," he said.

But those applications are many years away from being realized.

For his part, Kesavadas sees no reason to wait to reap the benefits of VR for diagnostics.

As is the case with many other virtual-reality applications for medicine, UB's Virtual Human Model will be relevant for training physicians as well.

The UB group currently is modeling on the computer the soft tissue and organs of the human abdomen, using atomic-unit modeling that breaks human tissue into pieces that each measure no more than 8 mm.
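The idea of dividing tissue into small "atomic units" can be illustrated with a rough sketch. This is not the UB group's code; the function name and the example dimensions are invented for illustration, showing only how a block of tissue would be covered by cubic units no larger than 8 mm on a side.

```python
import math

def atomic_units(dim_mm, max_unit_mm=8.0):
    """Number of units along one axis and the resulting unit size."""
    n = math.ceil(dim_mm / max_unit_mm)   # units needed to cover the axis
    return n, dim_mm / n                  # actual unit size, <= 8 mm

# Hypothetical example: a 100 mm x 60 mm x 40 mm block of abdominal tissue.
total = 1
for d in (100.0, 60.0, 40.0):
    n, size = atomic_units(d)
    total *= n
# total is now the number of atomic units covering the block
```

Each unit would then carry its own equations describing how it responds to applied force, which is what makes the model "physically based" rather than a purely visual one.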

The system takes as its raw material the Visible Human Data Set developed by the National Institutes of Health, which makes available to researchers complete digitized data sets of the human body.

Using a very powerful graphics computer, the researchers "supersample" smaller and smaller sections of the data set for a given body part or organ, which enables them to get more and more detailed pictures of each one, developing increasingly complex equations about how each tiny section will respond to applied forces. They then create layers of these sections, gradually building the collection of samples up into the complete organ.
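The "layering" step described above can be sketched in miniature. The structures and function names below are invented stand-ins: 2D cross-sections (here, small grids of numbers playing the role of scan slices) are stacked into a 3D volume, and a sub-region can then be cropped out for closer study, loosely analogous to supersampling a smaller section of the data set.

```python
def stack_sections(sections):
    """Stack equally sized 2D grids into a 3D volume[z][y][x]."""
    assert all(len(s) == len(sections[0]) for s in sections)
    return sections

def crop(volume, z0, z1, y0, y1, x0, x1):
    """Zoom in on a sub-region to examine it at full detail."""
    return [[row[x0:x1] for row in plane[y0:y1]]
            for plane in volume[z0:z1]]

# Toy data: a 4x4x4 volume whose values encode their own coordinates.
sections = [[[z * 100 + y * 10 + x for x in range(4)]
             for y in range(4)] for z in range(4)]
vol = stack_sections(sections)
sub = crop(vol, 1, 3, 1, 3, 1, 3)   # the central 2x2x2 region
```

In the real system the "values" would be image and density data from the Visible Human Data Set, and each cropped region would be refined into finer atomic units with its own force-response equations.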

"Our big contribution is that we are writing algorithms to model how soft tissue deforms as a real mass, rather than just as a surface, which is what many groups are currently doing. No one else is doing this in real time," said Kevin Chugh, a doctoral student in mechanical and aerospace engineering, who is a co-author on the research.

"We will be able to touch the model with a haptic thimble (the physically based VR counterpart of a computer mouse) on the screen, apply 'force' using a haptics feedback system, and show how it deforms and then bounces back when the force is withdrawn."

The work is based on a solid understanding of the physics behind what happens when pressure is applied to different parts of the human body.

"While the physician is doing a palpation on a patient, the computer, through the VR glove, is picking up all the information about what anatomic force characteristics the doctor's finger is feeling," said Kesavadas.
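The record-for-later-review idea can be sketched as a simple data-logging loop. Everything here is hypothetical: the sensor names, the `read_sensor` stub, and the sampling rate are invented for illustration, standing in for the glove's real fingertip sensors.

```python
FINGERS = ("thumb", "index", "middle", "ring", "little")

def read_sensor(finger, t):
    # Stand-in for a real fingertip pressure sensor; returns newtons.
    # The index finger presses progressively harder in this toy example.
    return 1.0 + 0.1 * t if finger == "index" else 0.5

def record_exam(duration_s, rate_hz=100):
    """Log timestamped force readings from every fingertip."""
    log = []
    for i in range(int(duration_s * rate_hz)):
        t = i / rate_hz
        log.append({"t": t, **{f: read_sensor(f, t) for f in FINGERS}})
    return log

exam = record_exam(duration_s=2.0)   # 200 samples over a 2-second palpation
```

A log like this is what would let a physician replay the exam after the patient has gone, or transmit it to a consulting physician at a remote site.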

According to Kesavadas, only a handful of groups in the United States are doing atomic-unit modeling for an interactive VR environment.

"The advantage of our system is that the physician can store data that describe and quantify the sensation he is feeling in his fingers while he is examining a patient," said Kesavadas. "He can tell through touch if there are any diseased organs, if they are enlarged, or hard or soft, and if there are tumors present."

The system will have application with emergency services in the field, as well as military and battlefield applications.

The project is being funded through the Center for Transportation Injury Research of CUBRC, the Calspan-UB Research Center.



