How To Talk When You Can't Speak
Communicating with unconscious minds.
This week, Neurology published an unsettling study of two brain-damaged men who are "minimally conscious"—able to breathe on their own but otherwise generally unresponsive. When neuroscientists played audiotapes of loved ones while scanning the patients' brains, the activity was strikingly normal. The visual cortex of one of the men even lit up in a way that suggested he was visualizing the stories that his relatives told. One of the researchers told the New York Times that the team has repeated the experiment on seven more patients and found the same results.
If the study holds water, we may need to rethink how we treat the estimated 300,000 Americans who are regarded as unreachable. The good news is that there are ways to communicate with some patients who seem completely unconscious. Spying into the brains of the unresponsive—as well as the "locked in," patients who are fully conscious but paralyzed by diseases such as ALS—can give them a vehicle for talking to the outside world. This conceit is at the heart of brain-computer interfacing, a booming field in which scientists are crafting tools that translate mental activity into keystrokes, mouse movements, and even robotic control.
Now for the bad news: Brain-damaged, "minimally conscious" patients like the ones in the Neurology study may be so impaired that they're unable to communicate with the outside world. Neurologists can usually figure out for sure if the mind of a locked-in patient is functioning well; the challenge is in setting it free. Doctors have a harder time figuring out the mental state of brain-damaged patients, especially if they can't open their eyes. Since most of the brain interfaces that are in development require the subject's eyes to be open wide, a patient whose eyes are shut—at least for now—is pretty much rendered mute.
One promising technique for unlocking the thoughts of paralyzed patients is to hook them up to electroencephalograms. EEGs read the electrical impulses caused by brain activity, including the "P300 wave," something like an involuntary "aha" response. When you're looking at a set of items and see something you suddenly recognize, your brain automatically kicks out an electrical spike 300 milliseconds later. You don't have to think about it; it just happens.
Psychologists Lawrence Farwell and Emanuel Donchin have turned this response into a rudimentary typing machine. The patient gets hooked up to an EEG, then looks at a computer screen that shows a six-by-six grid of the letters of the alphabet. When he focuses on a certain letter, the computer begins highlighting each column. As the column containing the chosen letter comes up, the subject's brain spits out a P300 "aha" response. When the computer repeats the same thing with the rows and gets another "aha," it gets the X and Y coordinates for the correct letter. Using this technique, people with ALS can "type" about four letters per minute. Best of all, because the "aha" response happens automatically, they don't have to learn any new skills.
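The row-and-column trick above can be simulated in a few lines of Python. This is a minimal sketch, not Farwell and Donchin's actual software: the real system classifies noisy EEG traces, while here a hypothetical `detect_p300` stand-in simply reports whether the flashed set contains the target letter, and the grid contents are an assumption (the genuine 6-by-6 matrix mixes letters with digits and commands).

```python
# A 6x6 grid holds the alphabet plus digits (hypothetical layout).
GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "567890",
]

def detect_p300(flashed_cells, target):
    """Stand-in for the EEG classifier: in reality this would look for
    a voltage spike ~300 ms after the flash; here it just checks whether
    the flashed column or row contains the attended letter."""
    return target in flashed_cells

def spell_one_letter(target):
    # Flash each column in turn and note which one elicits the "aha".
    col = next(c for c in range(6)
               if detect_p300([row[c] for row in GRID], target))
    # Repeat with the rows to get the second coordinate.
    row = next(r for r in range(6)
               if detect_p300(GRID[r], target))
    # The crossing of the flagged column and row is the chosen letter.
    return GRID[row][col]

message = "".join(spell_one_letter(ch) for ch in "HELLO")
```

The key design point survives the simplification: the subject never issues a command, so no training is needed—the involuntary P300 response does the selecting, and the computer only has to notice when it fires.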
Other scientists have developed techniques to train subjects to signal the outside world by actively changing their brain activity. The German researcher Niels Birbaumer, who was profiled in The New Yorker in January 2003, has trained two locked-in patients in meditation techniques that let them control "slow voltage changes" in their cortexes. This meditation allowed Birbaumer's students to manipulate a pointer on a computer screen to pick out letters by a process of elimination. After weeks of training, one patient with ALS wrote a 78-word message in 16 hours, about two characters per minute. Birbaumer speeds this up a bit by using predictive technology that auto-completes a word after a few letters—precisely the same thing that helps you type more quickly on your phone.
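The auto-completion idea is simple enough to sketch. The following is a toy version, with a hypothetical word list standing in for whatever dictionary Birbaumer's system actually uses: after a few laboriously spelled letters, the system proposes the most frequently used word with that prefix, sparing the patient many selections.

```python
# Hypothetical vocabulary: word -> how often the patient has used it.
VOCABULARY = {
    "the": 90,
    "that": 55,
    "thanks": 30,
    "thank": 12,
    "hello": 8,
}

def suggest(prefix):
    """Return the highest-frequency word starting with the prefix,
    or None if nothing in the vocabulary matches."""
    matches = [w for w in VOCABULARY if w.startswith(prefix)]
    if not matches:
        return None
    return max(matches, key=lambda w: VOCABULARY[w])
```

At two characters per minute, spelling "th" and accepting a suggestion instead of finishing the word by hand can cut a selection from minutes to seconds—which is why even this crude trick matters so much at brain-interface speeds.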
None of this is quite touch-typing, but in a networked, gadget-strewn world it's not necessary to spell out entire words. Even a simple yes/no command can turn a light switch on or off. At the Georgia State University BrainLab, Melody Moore is working on "neural prosthetics" that could control everyday devices like lights and heating or help a patient pilot a wheelchair. And for those who want to explore the world outside the home, Moore is also developing a brain-controlled Web browser.
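One way a single yes/no channel can drive many devices is a scanning menu, sketched below. This is an illustration, not a description of Moore's actual prosthetics: the command list and the stream of decisions are hypothetical, and in a real system each yes/no would come from an upstream EEG classifier.

```python
# The system highlights one command at a time; a "yes" (say, a P300
# spike or a voltage shift) selects it, a "no" advances to the next.
COMMANDS = ["lights on", "lights off", "heat up", "heat down", "call nurse"]

def scan_select(decisions):
    """Consume one yes/no per highlighted command, cycling through the
    menu; return the first command confirmed with a 'yes', or None if
    the decision stream runs out."""
    for i, yes in enumerate(decisions):
        if yes:
            return COMMANDS[i % len(COMMANDS)]
    return None

# Two "no" responses, then a "yes", selects the third command.
chosen = scan_select([False, False, True])
```

The same elimination scheme generalizes from light switches to a wheelchair or a Web browser: anything whose options can be put in a list can be driven by one reliable bit at a time.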
At this point, brain-computer interfaces have only been tested on a small number of patients. They also face huge barriers functioning outside a lab: Try to use an EEG machine in your living room, and it will probably catch interference from your television and stereo. What's more, these brain-interface tools are mentally tiring, and for a patient who hasn't actively used his brain in a long time, they might just be too taxing: Many doctors theorize that the brain's responses can atrophy like those of an unused muscle.
If these devices ever do become ready for prime time, they'll introduce some fascinating moral and legal shifts. The ongoing battle over the life and death of Terri Schiavo, diagnosed as being in a persistent vegetative state, would play out quite differently if Schiavo herself could signal her desires, or even if we knew just a bit more about her mental state. The impact on politics could be even greater. How would President Bush react to a petition in favor of stem-cell research that's written and signed by 10,000 locked-in Americans?
Special thanks to Brendan Allison of the Georgia State University BrainLab, Emanuel Donchin of the University of South Florida, Jonathan Wolpaw of the Wadsworth Center, and Joy Hirsch of Columbia University.
Clive Thompson is a contributing writer for the New York Times Magazine and a columnist for Wired.
Illustration by Robert Neubecker.