A new Israeli-developed tool enables the disabled to send emails by thought alone, and could revolutionize the world of mind-controlled computing.
Photo by Dani Machlis, Ben-Gurion University
Ben-Gurion University of the Negev students Uri Usami, Ariel Rosen and Ofir Tam developed a way for people to control computer actions with their thoughts.

The able-bodied can fire off an email in seconds, while eating a sandwich and calming a baby. But those less deft with their digits, such as the disabled, may not be able to email at all, let alone manage those side tasks.

Hoping to give more dignity and communications possibilities to the disabled, a trio of students from an Israeli university developed a program that connects brain waves virtually to a computer interface. They call it MinDesktop, and their prototype application could revolutionize mind-controlled computing the same way Windows changed the accessibility of personal computing.

Taking an off-the-shelf technology, the three Ben-Gurion University of the Negev undergraduates have developed a new graphical user interface (GUI) to help the physically challenged use their thoughts to send emails, surf the Web, turn on media players and communicate with their computer and the outside world.

Following successful trials with 17 able-bodied test subjects, Uri Usami, Ofir Tam and Ariel Rosen hope their product will be applied one day to help the disabled with actions beyond the computer screen. Project supervisors see the usefulness of such an interface for other purposes as well, such as in noisy environments or situations where two hands are just not enough.

“One application is helping disabled people with diseases like ALS and other muscular problems, or for somebody who is using his hands for some other operation and cannot use a keyboard or mouse,” says Dr. Rami Puzis, one of the supervisors. “Of course you could use voice also, but this application could be especially useful in noisy environments. With pilots, for instance.”

Astronauts, he agrees, could also benefit.

Programmed with thoughts you love

MinDesktop, developed under the supervision of Puzis, Prof. Yuval Lovitz and Dr. Lior Rokach, still has a long way to go before it becomes a commercial reality, says Puzis. He explains how the idea came about: “Dr. Lior Rokach became aware of the hardware Emotiv, a headset that can record and analyze [brainwave] EEGs – and he wanted to start a student engineering project that would do something with it.”

When offered the idea, the students had their own plans in mind, and veered away from what their supervisors had originally intended, says Puzis.

In a series of experiments, able-bodied subjects learned a new action in eight seconds and then typed a 12-character email in about seven minutes. However, Puzis believes that trained users could send a sentence as simple and significant as “I love you, Mom” in as little as four minutes, using only (you guessed it) their minds.
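Taking the quoted trial figures at face value, a quick back-of-the-envelope calculation shows what those times mean per character:

```python
# Rates implied by the trial numbers quoted above (illustrative arithmetic only).
chars, minutes = 12, 7
sec_per_char = minutes * 60 / chars
print(f"Trial subjects: {sec_per_char:.0f} seconds per character")

# "I love you, Mom" is 15 characters; at the hoped-for four minutes:
trained = 4 * 60 / len("I love you, Mom")
print(f"Trained users: {trained:.0f} seconds per character")
```

That works out to roughly 35 seconds per character for first-time subjects, and about 16 seconds per character at the rate Puzis hopes trained users could reach.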

The breakthrough lies in how the students organized the system’s commands into a hierarchy that is simple for a person to learn and use.

“The innovation is the user interface — the human-computer interface. We are using three actions that the software and the headset can give us – two is not enough – and when there are many actions you define [in the system], it becomes noisy and harder to control,” says Puzis.

“Instead of a mouse and keyboard, we have a headset and an easy interface to a pointing device that can be controlled.”
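The idea of driving a whole desktop with only three distinguishable mental actions can be sketched as a menu tree the user walks through. The sketch below is purely illustrative: MinDesktop’s actual design and command names are not public, and the menu items here are hypothetical.

```python
# Illustrative sketch only: a hierarchical command menu driven by just
# three mental actions, as the article describes. NEXT cycles through
# sibling items, SELECT descends into a menu or executes a leaf command,
# and BACK climbs one level up the tree.

NEXT, SELECT, BACK = "next", "select", "back"

# A hypothetical command tree: inner nodes are menus, leaves (None) are commands.
MENU = {
    "Desktop": {
        "Email": {"Compose": None, "Inbox": None},
        "Web": {"Open browser": None},
        "Media": {"Play": None, "Pause": None},
    }
}

def navigate(tree, actions):
    """Walk the menu tree using a stream of three mental actions."""
    path = [("Desktop", tree["Desktop"])]
    index = 0
    for action in actions:
        name, node = path[-1]
        items = list(node.items())
        if action == NEXT:
            index = (index + 1) % len(items)
        elif action == SELECT:
            child_name, child = items[index]
            if child is None:            # leaf: execute the command
                return child_name
            path.append((child_name, child))
            index = 0
        elif action == BACK and len(path) > 1:
            path.pop()
            index = 0
    return None

# Example: cycle twice to "Media", enter it, then run its first command.
print(navigate(MENU, [NEXT, NEXT, SELECT, SELECT]))  # -> Play
```

With a hierarchy like this, a small, reliably detectable vocabulary of thoughts is enough to reach any command, which is exactly why three clean actions beat a larger, noisier set.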

Using a headset developed by Emotiv, equipped with 14 points that record EEG brainwaves, the students programmed the headset’s existing software to learn the thoughts associated with simple types of actions such as push and pull – actions they then mapped to specially developed commands for the mobility-challenged.

A new window of accessibility

The invention doesn’t allow a free-thought exchange between user and PC; instead, it uses a kind of machine learning of commands to associate a thought with a certain onscreen action.

For instance, you could train the computer to learn a pushing action, such as pushing a button on a screen or media player, by associating that action with a familiar thought — let’s say, the bark of your dog.

After the computer learns the brainwave pattern associated with thinking about your dog’s bark, every subsequent time you think of that bark, the Emotiv headset picks up the thought and translates it into the associated pushing action.

Such a thought-controlled computer has been a challenge for hundreds of labs around the world, but until now the technology hasn’t been feasible for the masses because the equipment was clunky and had to be operated in a lab setting.

Prof. Mark Last, who heads the software engineering program at the university, says that he didn’t see any limitations to using the system — even “interference” such as thick, curly hair does not pose a problem.

He explains how the students approached the programming: “When you have some history of the brainwaves – and you have a set of thoughts that you ask your subject to think about, then you can look for relationships between certain thoughts and those actions so you can recognize correctly what the subject was thinking about.”
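The training loop Last describes – record labeled brainwave samples while the subject rehearses each thought, then match new samples to the learned patterns – can be illustrated with a toy classifier. This is only a sketch (nearest-centroid on made-up two-number “features”), not the team’s actual algorithm; a real EEG pipeline would use multi-channel signal features and a stronger model.

```python
# Toy illustration of supervised thought recognition: average the
# feature vectors recorded for each rehearsed thought into a centroid,
# then classify a new sample by its nearest centroid.
from math import dist

def train(samples):
    """samples: {thought_label: [feature_vector, ...]} -> centroids."""
    return {
        label: tuple(sum(col) / len(col) for col in zip(*vectors))
        for label, vectors in samples.items()
    }

def classify(centroids, vector):
    """Return the thought label whose centroid lies closest to vector."""
    return min(centroids, key=lambda label: dist(centroids[label], vector))

# Hypothetical 2-D "brainwave features" collected during training:
training = {
    "push": [(0.9, 0.1), (1.0, 0.2), (0.8, 0.0)],
    "pull": [(0.1, 0.9), (0.2, 1.0), (0.0, 0.8)],
}
centroids = train(training)
print(classify(centroids, (0.85, 0.15)))  # -> push
```

Once a new sample is labeled “push” or “pull”, the interface simply fires the onscreen command that was bound to that thought during training.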

From there, the computer can make predictions – and help the disabled listen to the Rolling Stones, or send an email to a loved one.