Future Tense

Can We Really Upload Johnny Depp’s Brain?

A look at the science of Transcendence.

Johnny Depp in Transcendence.

Photo courtesy Peter Mountain/Alcon Entertainment, LLC.

When Wally Pfister’s Transcendence is released on April 17, millions of moviegoers will be asking themselves, “Could we really upload Johnny Depp into a computer one day?” In the spirit of “Could Bruce Willis Save the World?” and “Gravity Fact Check,” we’d like to take Hollywood perhaps a bit too seriously and examine the scientific plausibility of what’s called “whole brain emulation.”

Whole brain emulation as depicted in Transcendence appears possible in principle. As a character from the movie’s first trailer says, the mind is “a pattern of electrical signals,” nothing more. That might be controversial among the general public, but it is the near-universal consensus among cognitive scientists. And in principle, that pattern of electrical signals could be run (“emulated”) on a computer rather than in that lump of meat inside a human skull.

We’ve already successfully emulated (much) simpler things. If you’re old enough, you might have played games like Space Invaders on the Atari 2600 gaming console, as one of us (Luke) did. The Atari 2600 had a processor called the MOS 6502 (strictly, a cut-down variant called the MOS 6507) that’s long since obsolete. But we can emulate it exactly inside a modern-day computer. (Researchers do this kind of thing for preservation purposes: Physical chips decay and warp over time, but an emulation is just information, and can be copied and preserved indefinitely.)

Here’s what we mean by “emulate”: When young Luke pushed the “shoot” button on the Atari joystick, an electrical signal traveled from the joystick to the MOS 6502 processor, which in turn controlled—according to strict rules—how other electrical signals moved from circuit to circuit inside the processor. All this activity eventually sent a certain pattern of electrical signals to his TV, where Luke could see that his spaceship had just fired a laser blast at the invading aliens.

By scanning the processor’s map of circuits in high resolution, and by knowing the rules of how those circuits work, we can reproduce the same functionality in a modern computer, without the physical MOS 6502 processor. In the emulated processor, there’s no electrical current, just numbers; no physical circuitry, just rules for how the numbers change. But the result is that pressing a button connected to the MOS 6502 emulation produces the exact same pictures on the screen as pushing the “shoot” button on the old Atari joystick did. The physical MOS 6502 is no longer there, but the information pattern is the same, so identical inputs (button presses) produce identical outputs (images on the screen).
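To make that concrete, here is a toy sketch in Python (our own illustration, not real MOS 6502 code; the miniature instruction set is made up): a “processor” that is nothing but a few numbers plus rules for how each instruction changes them. Feed it the same inputs and you get the same outputs, with no physical chip anywhere.

```python
# A toy illustration of the idea behind emulation. This is NOT the real MOS 6502
# instruction set; it is a made-up miniature one, meant only to show that an
# emulated "processor" is just numbers (registers, memory) plus update rules.

def run(program, joystick_input=0):
    """Execute a tiny made-up instruction set: a list of (opcode, argument) pairs."""
    acc = joystick_input   # an "accumulator" register, here just a Python integer
    memory = [0] * 4       # a pretend four-cell memory bank
    for opcode, arg in program:
        if opcode == "ADD":      # add a constant to the accumulator
            acc += arg
        elif opcode == "STORE":  # copy the accumulator into a memory cell
            memory[arg] = acc
        elif opcode == "LOAD":   # copy a memory cell back into the accumulator
            acc = memory[arg]
    return acc, memory

# No electrical current, no circuitry: identical inputs give identical outputs.
program = [("ADD", 5), ("STORE", 0), ("ADD", 2)]
print(run(program, joystick_input=1))  # always prints (8, [6, 0, 0, 0])
```

A real emulator does the same thing with the 6502’s actual registers, its full instruction set, and the rest of the console’s hardware, which is why the old games behave exactly as they originally did.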

If you want to try this for yourself, head over to NESbox, where you can play thousands of old games like Super Mario World right there in your browser. These games weren’t rewritten to work in your browser: Instead, the original hardware (like 1991’s Super Nintendo) is emulated exactly in your browser, and thus the same inputs (the game file plus your button presses) result in the same outputs (moving images and sound).

In theory, it should be possible to do a similar thing with a human brain. We could simulate the brain’s physical, chemical, and electrical structure in such detail that we could run the brain on a computer, producing similar “outputs” (instructions to virtual limbs, mouth, and other organs) as a real brain would. But in practice, we’re lacking three major things. First, we can’t yet scan a human brain in nearly enough detail to map all its “circuits.” Second, we don’t understand the rules governing how neurons and other brain cells work nearly as well as we understand how computer circuits work. And third, we don’t have enough computing power to run the emulation, since the human brain is vastly more complicated than the MOS 6502 processor.
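To give a flavor of what those “rules” might look like, here is a deliberately crude Python sketch using the textbook leaky integrate-and-fire neuron model, with a random matrix standing in for a scanned map of connections. Real neurons are enormously more complicated, and nobody yet knows how much biological detail an emulation would have to capture, so treat this as a cartoon of the approach rather than a recipe.

```python
# A drastically simplified sketch of "neurons as numbers plus rules," using the
# classic leaky integrate-and-fire model. Real neurons are far more complicated,
# and a genuine brain emulation would need a far more detailed map and rule set.
import numpy as np

rng = np.random.default_rng(0)
n = 100                                  # number of model neurons
weights = rng.normal(0, 0.5, (n, n))     # the "scanned circuit map": who connects to whom
voltage = np.zeros(n)                    # each neuron's membrane potential
threshold, leak, dt = 1.0, 0.1, 1.0      # firing threshold, leak rate, time step

for step in range(1000):
    external_input = rng.normal(0.05, 0.1, n)      # stand-in for sensory input
    spikes = (voltage >= threshold).astype(float)  # which neurons fire this step
    voltage[spikes == 1] = 0.0                     # firing resets the potential
    # the "rule": potentials leak toward zero and are pushed by inputs and spikes
    voltage += dt * (-leak * voltage + external_input + weights @ spikes)

print("model neurons that fired on the final step:", int(spikes.sum()))
```

Even this cartoon hints at the computing-power problem: scaling it up from 100 crude model neurons to 86 billion far more detailed ones is not something today’s machines can do.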

We’re still pretty far away from meeting any of these conditions. Today scientists can’t even emulate the brain of a tiny worm called C. elegans, which has 302 neurons, compared with the human brain’s 86 billion neurons. Using models of expected technological progress on the three key problems, we’d estimate that we wouldn’t be able to emulate human brains until at least 2070 (though this estimate is very uncertain).
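To put that gap in numbers (using only the neuron counts above, and setting aside synapses, glia, and everything else that makes brains hard to model):

```python
# Back-of-the-envelope scale comparison, using only the neuron counts quoted above.
c_elegans_neurons = 302
human_neurons = 86_000_000_000
ratio = human_neurons / c_elegans_neurons
print(f"The human brain has roughly {ratio:,.0f} times as many neurons.")  # ~285 million
```

And the raw neuron count almost certainly understates the difficulty, since each human neuron connects to thousands of others.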

But would an emulation of your brain be you, and would it be conscious? Such questions quickly get us into thorny philosophical territory, so we’ll sidestep them for now. For many purposes, such as estimating the economic impact of brain emulations, it suffices to know that they would have humanlike functionality, whether or not they would also be conscious.

So, how scientifically realistic is Transcendence? From the trailers and the original screenplay, it seems likely to be wrong about many of the details, but it might be right about the eventual technological feasibility of whole brain emulation. But we must remember this isn’t “settled science”—it’s more akin to the “exploratory engineering” of pre-Sputnik astronautics, pre-ENIAC computer science, and contemporary research in molecular nanotechnology, interstellar travel, and quantum computing. Science fiction often becomes science fact, but sometimes it does not. The feasibility of whole brain emulation is an open question, and will likely remain so for decades.

As for Johnny Depp, he was born in 1963, and by 2070 he’d be 107 years old. In the United States, life expectancy for males is about 76 years. By our estimate, Jack Sparrow is unlikely to be digitally preserved so that he can star in Pirates of the Caribbean 16: Digital Sparrow. Whether that is a tragedy or a relief is a question beyond the scope of this piece.

This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.