Future Tense

What Comes After the Computer Chip? More Mind-Blowing Biomedical Breakthroughs.

Intel co-founder Gordon Moore famously wrote about how the number of transistors on silicon chips would double roughly every two years—an observation now known as Moore’s Law. But even as Intel pushes into nanotechnology, computing is now reaching the limits of that law. On Thursday, March 21, former Intel CEO Craig R. Barrett and Arizona State University President Michael Crow will be at the Phoenix Art Museum to answer the question, “What comes after the computer chip?” Ahead of the event, which is being hosted by Zócalo Public Square, we’ll be publishing a series of blog posts in which experts weigh in. For more information, visit the Zócalo Public Square website. (Zócalo Public Square is a partnership of the New America Foundation and Arizona State University; Future Tense is a partnership of Slate, New America, and ASU.)

The 10 fingers, the abacus, mechanical cash registers, the vacuum-tube-based ENIAC, the transistor, the integrated circuit, the billion-transistor “computer chip”… then what? I suppose that was the line of thinking when this question was posed. Rather than fixating on whether a new “transistor” or a new “integrated circuit” will be invented, it is more useful to focus on two key observations: “It will be a long time before we reach the fundamental limits of computing,” and “the technologies we use to build the computer chip will impact many fields outside of computing.”

Advances in computing are reined in by the energy consumption of the computer chip. Today’s transistor consumes more than 1,000 times the kT·ln(2) limit, the minimum energy required to erase one bit of information per logical step of computing. Reversible computing, as described by Rolf Landauer and Charles Bennett, will reach below the kT·ln(2) limit once a practical implementation is devised. There is plenty of room at the bottom! We will continue to get more computational power for less energy consumed.
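To make that limit concrete, here is a minimal Python sketch that computes the Landauer bound kT·ln(2) at an assumed room temperature of 300 K and compares it with an energy 1,000 times larger, the rough multiplier cited above; the temperature and the factor of 1,000 are illustrative assumptions taken from the text, not measurements.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joules per kelvin
T = 300.0           # assumed room temperature in kelvin (illustrative)

# Landauer limit: minimum energy to erase one bit of information at temperature T
landauer_limit_joules = K_B * T * math.log(2)

# The essay's figure: today's transistors dissipate over 1,000 times this limit
# (the factor of 1,000 comes from the text, not from measurement)
transistor_energy_joules = 1_000 * landauer_limit_joules

print(f"Landauer limit at {T:.0f} K: {landauer_limit_joules:.2e} J per bit erased")
print(f"~1,000x the limit:       {transistor_energy_joules:.2e} J per switching event")
```

Run as written, this prints roughly 2.9 × 10⁻²¹ joules for the limit itself, which is why there is still so much headroom before energy per operation becomes a hard floor.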

Now that I have put to rest the inkling that there may be an end to the rapid progress we expect from the computer chip, let’s talk about what else the “computer chip” will bring us in addition to computing and information technology. The semiconductor technology and design methodology employed to fabricate the computer chip have already made their power felt in other fields. Tiny cameras in cellphones that let us take pictures wherever we go, digitally projected 3-D movies, and LED lighting that is substantially more energy efficient than the incandescent light bulb are all examples of “computer chip” technologies that have already made an impact on society. Enabling technologies that will transform the field of biomedical research are in the offing. The cost of sequencing a genome has dropped faster than Moore’s Law would predict, thanks to techniques borrowed from computer chip manufacturing. Nanofabrication techniques developed for the semiconductor industry have enabled large-scale probing of neural signals, which will eventually lead to a sea change in our understanding of neuroscience. Nanofabricated sensors and actuators in the style of Fantastic Voyage are beginning to be developed and are no longer pure science fiction. Emulation of the brain, whether by brute-force supercomputers or by innovative nanoscale electronic devices, is becoming possible and will reach human scale if the present rate of progress continues.

I am optimistic that the technological progress we have experienced so far is just the beginning. The “computer chip,” and the basic technologies that form its foundation, will continue to advance knowledge in many fields beyond computing.

More answers to the question “What comes after the computer chip?”

Better brain-computer interfaces, writes Sethuraman Panchanathan.
Nature-inspired computing, according to Stephen Goodnick.
The end of the “La-Z-Boy era” of sequential programming, writes Konstantin Kakaes.