Future Tense

Paralysis Patients Move and Even Feel Robotic Hands Through Brain-Computer Interface

There is amazing work going on right now with computerized prosthetic limbs controlled through inputs from an amputee’s muscles or even nerves, and humanoid robots are getting steadily more capable in real-world conditions. But at the Defense Advanced Research Projects Agency’s aptly named “Wait, What?” conference last week, program manager Justin Sanchez presented some staggering next-gen neurotechnology research.

Sanchez’s project uses a brain-computer interface to let paralysis patients nimbly manipulate a robotic arm and hand with their minds: moving it in 3-D space, shaking hands, fist bumping, and so on. DARPA sponsored surgeries on two patients, Nathan and Jan, to implant microelectrode arrays in the parts of the brain involved with sensation and movement. Sanchez explains in the presentation that the research team was investigating “the brain’s role in the generation of movement and sensation as demonstrated by some of the very first people to ever be fitted with a direct brain interface.”
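
For the motor side, the basic mechanics are worth spelling out: the implanted arrays record activity from populations of neurons, and a decoder translates those firing rates into movement commands for the arm. The snippet below is a minimal sketch of one classic approach, a linear decoder mapping firing rates to a 3-D velocity; the channel count, weights, and update rate are all illustrative assumptions, not details from the DARPA project.

    import numpy as np

    N_CHANNELS = 96  # assumed electrode count; real arrays vary
    DT = 0.02        # assumed 50 Hz control loop

    # Illustrative decoder weights (3 x N_CHANNELS). In real systems these
    # are fit during a calibration session in which the patient imagines
    # reaching toward known targets.
    rng = np.random.default_rng(0)
    W = rng.normal(0, 0.01, (3, N_CHANNELS))
    baseline = np.zeros(N_CHANNELS)  # assumed per-channel mean firing rate

    def decode_velocity(firing_rates):
        """Map per-channel firing rates (Hz) to an x/y/z velocity
        command for the robotic arm."""
        return W @ (firing_rates - baseline)

    # Integrate velocity commands into a hand position over one
    # simulated second, using random rates as a stand-in for recordings.
    position = np.zeros(3)
    for _ in range(50):
        rates = rng.poisson(10, N_CHANNELS).astype(float)
        position += decode_velocity(rates) * DT

In practice the learning runs both ways: the weights are refit as data comes in, and the patient’s brain adapts to the decoder as much as the decoder adapts to the brain.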

And that’s exactly where this demonstration gets really amazing. Not only is there video of Jan controlling the limb with her mind; there is a whole other component in which we see Nathan receiving sensory information from the robotic hand. (Update, Sept. 15: Jan only has implants for movement control, while Nathan has implants for both movement and pressure sensation.) Even when blindfolded, he can feel and accurately report which finger someone is touching—even two fingers at once. “We took the next step, and we asked the question: Can we run the experiment in reverse and do for sensation what we did for the motor system?” Sanchez says. Sensors on the robotic hand’s fingertips measure forces and convert them into electrical signals that go to Nathan’s brain and allow him to feel.
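
The feedback path essentially runs that pipeline in reverse: a force reading from a fingertip sensor is mapped to stimulation pulses delivered through electrodes in the sensory part of the brain, with, hypothetically, a separate electrode assigned to each finger so that the location of a touch is preserved. Here is a hedged sketch of such a mapping; the channel assignments, force range, and amplitude bounds are invented for illustration and are not the project’s actual parameters.

    # Illustrative force-to-stimulation mapping for sensory feedback.
    # All numeric ranges here are assumptions for the sketch, not
    # parameters from the DARPA work.

    FINGER_ELECTRODES = {"thumb": 3, "index": 7, "middle": 12,
                         "ring": 18, "pinky": 22}  # hypothetical channel map

    MAX_FORCE_N = 10.0                    # assumed sensor range, newtons
    MIN_AMP_UA, MAX_AMP_UA = 10.0, 80.0   # assumed safe amplitude bounds, microamps

    def stimulation_command(finger, force_newtons):
        """Convert a fingertip force reading into a (channel, amplitude)
        stimulation command: stronger touch, stronger pulse, clamped to
        a safe range, with no pulse below a small contact threshold."""
        if force_newtons < 0.1:
            return None  # no contact, no stimulation
        frac = min(force_newtons / MAX_FORCE_N, 1.0)
        amplitude = MIN_AMP_UA + frac * (MAX_AMP_UA - MIN_AMP_UA)
        return (FINGER_ELECTRODES[finger], amplitude)

    # Two fingers touched at once, as in the blindfolded test:
    for finger, force in [("index", 2.5), ("ring", 4.0)]:
        print(finger, stimulation_command(finger, force))

Because each finger drives its own channel, reporting which finger was touched, even two at once, reduces to noticing which sites in the brain are being stimulated.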

This research is not without precedent or context. For example, my colleague Will Oremus reported in Slate in 2013 on a rhesus macaque named Oscar who controlled the movement of a digital ball on a computer screen through a brain-computer interface. As Oremus wrote at the time, “The computer isn’t reading his mind, exactly—Oscar’s own brain is doing a lot of the lifting, adapting itself by trial and error to the delicate task of accurately communicating its intentions to the machine.”

This is true of Jan and Nathan as well, but watching the progress of this technology, you might actually start feeling giddy. There are the usual caveats that progress is slow and soldiers aren’t going to be controlling drones with their thoughts any time soon, but come on. Nathan can feel the robotic hand. That’s pretty legit.