Around midnight on November 5, 1999, Erik Ramsey was a passenger in a friend's Camaro when it collided with another vehicle, flipped, and landed on an embankment. His injuries were devastating: a collapsed lung, a lacerated spleen, a ruptured diaphragm, ripped tendons in his hand, and a femur broken in two places. Worst of all, a blood clot caused a brain-stem stroke that cut the connection between his mind and his body, a condition known to neurologists as locked-in syndrome.
He can still see, smell, and hear, and his body can still register the itch of a rash or the pleasure of a warm breeze. But he cannot speak or make any voluntary movements other than with his eyes.
Help, though, is on the way, as reported in a fascinating article by Chris Berdik published in the Spring 2009 issue of Bostonia magazine. Since the accident, Ramsey's ability to communicate has depended entirely on those around him asking the right questions and interpreting the sometimes imperceptible shifting of his eyes. He can think of words, but he cannot move his mouth to speak them. Two Boston University scientists, Frank Guenther, a College of Arts & Sciences professor of cognitive and neural systems, and Jonathan Brumberg, a postdoctoral research associate in Guenther's lab, are working to create decoder software that they hope can translate thoughts into speech.
The speech prosthesis project puts Guenther and Brumberg at the forefront of brain-computer interface (BCI) research, a new field of science that offers hope for those with paralysis, amputated limbs, neurodegenerative diseases, or sensory impairments. The plan is to implant electrodes under the scalp, into the part of the motor cortex responsible for the movements involved in speech, to capture the activity of about 40 neurons. The data are then transmitted wirelessly to a computer to be decoded. As Ramsey imagines speaking, those neurons fire; the software looks for patterns in the firing, plots the results on a computer screen, and generates the corresponding vowel sound. (Consonants are far more complicated than vowels.)
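To give a rough sense of how such a decoder might work, here is a minimal illustrative sketch, not the actual BU software: it assumes a simple linear mapping from a handful of neural firing rates to a point in two-dimensional formant space (F1, F2), then reports the nearest vowel target. The vowel targets, decoder weights, and firing rates below are all made up for illustration.

```python
import math

# Hypothetical vowel targets: approximate (F1, F2) formant frequencies in Hz.
VOWEL_TARGETS = {
    "AA": (730, 1090),   # as in "father"
    "IY": (270, 2290),   # as in "see"
    "UW": (300, 870),    # as in "boot"
}

def decode_formants(firing_rates, weights, baseline):
    """Linearly map a vector of neural firing rates to a decoded (F1, F2) point."""
    f1 = baseline[0] + sum(w * r for w, r in zip(weights[0], firing_rates))
    f2 = baseline[1] + sum(w * r for w, r in zip(weights[1], firing_rates))
    return f1, f2

def nearest_vowel(f1, f2):
    """Return the vowel whose formant target is closest to the decoded point."""
    return min(VOWEL_TARGETS,
               key=lambda v: math.hypot(VOWEL_TARGETS[v][0] - f1,
                                        VOWEL_TARGETS[v][1] - f2))

# Toy example: four neurons instead of forty, with invented decoder weights.
rates = [20.0, 5.0, 12.0, 8.0]                    # spikes per second
weights = [[10, -5, 8, 2], [-20, 40, 30, 15]]     # one weight row per formant
baseline = (500, 1500)                            # resting formant estimate, Hz

f1, f2 = decode_formants(rates, weights, baseline)
print(nearest_vowel(f1, f2))                      # prints "AA"
```

In a real system the weights would be fit to recorded neural data and the decoded formants would drive a speech synthesizer continuously, giving the user audio feedback they can learn to steer, rather than snapping to a discrete vowel label as this toy version does.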
Functional magnetic resonance imaging (fMRI) technology allows Guenther to compare the model's predictions to actual brain scans of people speaking words or syllables under particular constraints, such as restricted jaw movement or distorted auditory feedback.
Separately from Guenther and Brumberg's work, sophisticated scanning technology is also being used to help people in vegetative states who retain some level of awareness but lack the ability to communicate it. Ramsey was the first person to have an electrode implanted in a brain region known to be involved in speech.
Electrodes were first implanted in humans by Phil Kennedy, a pioneer in brain-computer interface research; his implants allowed paralyzed individuals to move a computer cursor with their thoughts and to work with basic text and drawing applications.
The Bostonia article reports that speech prosthesis is just one of several BCI projects making headlines. In January 2008, scientists at Duke University were able to make a robot walk on a treadmill in Japan by transmitting neural activity from monkeys in North Carolina. A few months later, researchers at the University of Pittsburgh and Carnegie Mellon University trained monkeys to adopt brain-controlled robotic arms as their own, using them to feed themselves grapes and marshmallows.
At present, nearly all mainstream BCI research is directed at helping the disabled. Guenther and Brumberg hope the speech prosthesis will one day be used by people afflicted with the neurodegenerative disease amyotrophic lateral sclerosis (ALS) or those who have lost their voice following throat cancer surgery. They have also been approached by people interested in creating BCI applications to enhance the capabilities of healthy users, mainly to boost memory.