Doctors at a hospital in Washington DC have performed two in-vivo brain biopsies using a new hand gesture recognition system developed by Israeli researchers that allows physicians to manipulate digital images during medical procedures with just the flick of a wrist.

The new system (Gestix) was developed by researchers from Ben Gurion University of the Negev (BGU), and is designed to enable doctors in the operating room to change digital images with a hand movement, rather than by touching a screen, keyboard or mouse – all of which compromise sterility and could spread infection.

Infection rates at hospitals are today unacceptably high, according to Juan P. Wachs, the lead researcher in the project, and a recent Ph.D. recipient from the Department of Industrial Engineering and Management at BGU.

Reducing hospital infection

According to the Centers for Disease Control and Prevention (CDC), one in 20 patients develops an infection while being treated in a US hospital – some two million people a year – and about 100,000 of them die of the infection. That is more than the number of deaths from auto accidents and homicides combined, and it adds billions of dollars to the health care bill.

The CDC also announced that the prevalence of Methicillin-resistant Staphylococcus aureus (MRSA) acquired at US hospitals is much higher than previously suspected. In a 2005 study, the CDC found that almost 95,000 Americans developed this superbug infection, and nearly 19,000 of them died as a result. MRSA spreads from patient to patient through contact with doctors, nurses, contaminated gloves and medical equipment.

“A sterile human-machine interface is of supreme importance because it is the means by which the surgeon controls medical information, avoiding contamination of the patient, the operating room (OR) and the other surgeons,” said Wachs.

“This could replace the touch screens now used in many hospital operating rooms, which must be sealed to prevent the accumulation or spread of contaminants and require smooth surfaces that must be thoroughly cleaned after each procedure – but sometimes aren’t,” he added.

Learning the surgeon’s hand gestures

Gestix works in two stages. In the first, a calibration stage, the machine learns to recognize the surgeon’s hand gestures. In the second, the surgeon learns and performs eight navigation gestures, rapidly moving the hand away from a ‘neutral area’ and back again.

“Gestix users even have the option of zooming in and out by moving the hand clockwise or counterclockwise,” said Professor Helman Stern, a principal investigator on the project, and a professor in BGU’s Department of Industrial Engineering and Management.

To avoid sending unintended signals, users may enter a ‘sleep’ mode by dropping the hand. Gestures are captured by a Canon VC-C4 camera positioned above a large flat-screen monitor, and processed on an Intel Pentium PC with a Matrox Standard II video-capture device.
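The interaction scheme described above – a neutral area at the center of the screen, directional gestures that leave it and return, and a sleep mode triggered by dropping the hand – can be sketched as a small state machine. The class, state names, and thresholds below are illustrative assumptions, not the actual Gestix implementation, which also recognizes rotation gestures for zooming across successive video frames.

```python
# Illustrative sketch of a Gestix-style gesture state machine.
# Hand positions are normalized screen coordinates in [0, 1];
# the thresholds and command names are assumptions for illustration.

NEUTRAL_ZONE = 0.15   # half-width of the 'neutral area' around screen center
SLEEP_Y = 0.9         # a hand dropped below this height enters sleep mode

class GestureNavigator:
    def __init__(self):
        self.state = "neutral"   # "neutral" or "sleep"

    def update(self, x, y):
        """Classify one tracked hand position into a navigation command.

        Returns a command string ("next", "prev", ...) or None.
        """
        # Dropping the hand suspends recognition ('sleep' mode),
        # so resting or incidental movements send no signals.
        if y > SLEEP_Y:
            self.state = "sleep"
            return None
        if self.state == "sleep":
            # The hand must return to the neutral area to wake the system.
            if abs(x - 0.5) < NEUTRAL_ZONE and abs(y - 0.5) < NEUTRAL_ZONE:
                self.state = "neutral"
            return None
        dx, dy = x - 0.5, y - 0.5
        if abs(dx) < NEUTRAL_ZONE and abs(dy) < NEUTRAL_ZONE:
            return None   # still inside the neutral area: no command
        # A move out of the neutral area maps to a navigation command
        # along the dominant axis (the real system distinguishes eight).
        if abs(dx) > abs(dy):
            return "next" if dx > 0 else "prev"
        return "scroll_down" if dy > 0 else "scroll_up"
```

The sleep/wake cycle is the key sterility feature: because only deliberate excursions from the neutral area produce commands, the surgeon can lower the hand at any time without issuing spurious image changes.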

Wachs and his research team took two years to develop Gestix. In the first year, Wachs built the system while he was an informatics fellow at the IMI in Washington DC. In the second, he continued the work at BGU with Prof. Stern and Prof. Yael Edan, the project’s principal investigators.

The first trials of the hand gesture recognition system, reported this month in the Journal of the American Medical Informatics Association, took place at the Washington Hospital Center, where the system was successfully used in in-vivo neurosurgical brain biopsies.