Future Tense

What Will Come After the Computer Chip? Nature-Inspired Computing.

Intel co-founder Gordon Moore famously wrote about how the number of transistors on silicon chips would double roughly every two years—an observation now known as Moore’s Law. But even as Intel pushes into nanotechnology, computing is now reaching the limits of that law. On Thursday, March 21, former Intel CEO Craig R. Barrett and Arizona State University President Michael Crow will be at the Phoenix Art Museum to answer the question, “What comes after the computer chip?” Ahead of the event, which is being hosted by Zócalo Public Square, we’ll be publishing a series of blog posts in which experts weigh in. For more information, visit the Zócalo Public Square website. (Zócalo Public Square is a partnership of the New America Foundation and Arizona State University; Future Tense is a partnership of Slate, New America, and ASU.)

We are rapidly reaching the end of the doubling of transistor density every two years described by Moore's Law, as we are literally running out of atoms with which to make individual transistors. Nanotechnology has recently produced many new and exciting materials, such as semiconductor nanowires, graphene, and carbon nanotubes. But as long as computing is based on digital logic (ones and zeros) that moves electronic charge around to switch individual transistors on and off, these new materials will extend Moore's Law only two or three more generations. The fundamental size limits remain, not to mention the limits imposed by heat generation. New paradigms of non-charge-based computing may emerge that could, for example, use the spin of electrons or nuclei to store or encode information. However, there are many obstacles to building a viable, scalable "spintronics" technology that can keep us on the path of Moore's Law.
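To make the "running out of atoms" point concrete, here is a rough back-of-the-envelope sketch in Python. The starting feature size, the atomic spacing, and the two-year cadence are illustrative assumptions rather than figures from the essay; it simply counts how many more density doublings fit before transistor features approach the spacing of individual silicon atoms, ignoring the heat and variability problems that bite far sooner.

```python
# Rough sketch: how many more density doublings before features reach atomic scale?
# Assumed numbers (illustrative only, not from the essay):
#   - current minimum feature size ~22 nm
#   - atomic spacing in silicon ~0.2 nm
#   - density doubles every 2 years, so linear feature size shrinks by sqrt(2) per doubling
import math

feature_nm = 22.0        # assumed current feature size
atom_nm = 0.2            # assumed atomic spacing
years_per_doubling = 2

doublings = 0
while feature_nm / math.sqrt(2) > atom_nm:
    feature_nm /= math.sqrt(2)
    doublings += 1

print(f"~{doublings} doublings (~{doublings * years_per_doubling} years) "
      f"before features shrink to ~{atom_nm} nm, the scale of single atoms")
```

Even this generous atom-counting exercise yields only a modest number of doublings; practical limits such as heat generation arrive well before the atomic wall does.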

It’s important to remember, though, that Moore’s Law can be viewed not merely as a doubling of transistor density every two years, but as a doubling of information-processing capability. While bare number-crunching is most efficiently performed with digital logic, new applications in digital imagery, video, speech recognition, artificial intelligence, and the like require processing vast amounts of data. Nature has much to teach us about processing vast amounts of sensory information in a highly parallel, analog fashion, as the brain does, which is fundamentally different from conventional digital computation. Such “neuromorphic” computing systems, which mimic neural-biological functions, may be realized more efficiently with new materials and devices that are not presently on the radar screen. Similarly, quantum computing may offer a way to address specialized problems that involve large amounts of parallel information processing. The most likely scenario is that the computer chip of the future will marry a version of our current digital technology to highly parallel, specialized architectures inspired by biological systems, with each doing what it does best. New computational paradigms and architectures, together with improved materials and device technologies, will likely allow a continued doubling of our information-processing capability long after we reach the limits of scaling conventional transistors.
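The “neuromorphic” idea, computing with analog, neuron-like units rather than with digital logic gates, can be illustrated with a toy model. The sketch below uses a standard leaky integrate-and-fire neuron; the time constant, threshold, and input levels are arbitrary illustrative values, not tied to any hardware discussed in the essay. It shows how such a unit integrates a noisy input in continuous quantities and emits a spike only when a threshold is crossed, rather than evaluating ones and zeros on every clock tick.

```python
# Toy leaky integrate-and-fire neuron: an analog-style unit that accumulates
# input over time and "fires" only when its potential crosses a threshold.
# All constants are arbitrary illustrative values.
import random

tau = 20.0          # membrane time constant (ms): how quickly potential leaks away
v_threshold = 1.0   # firing threshold
v_reset = 0.0       # potential after a spike
dt = 1.0            # simulation time step (ms)

v = 0.0
spikes = []
random.seed(0)

for t in range(200):                            # simulate 200 ms
    input_current = random.uniform(0.0, 0.12)   # noisy analog input
    # Leak toward zero, then integrate the input (simple Euler step).
    v += dt * (-v / tau + input_current)
    if v >= v_threshold:                        # threshold crossed: emit a spike
        spikes.append(t)
        v = v_reset

print(f"{len(spikes)} spikes at times (ms): {spikes}")
```

In a neuromorphic chip, vast numbers of such units would operate in parallel in analog hardware; the serial Python loop here is only a stand-in for that behavior.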

More answers to the question “What comes after the computer chip?”

Biomedical breakthroughs, says Stanford’s H.-S. Philip Wong
The end of the “La-Z-Boy era” of sequential programming, writes Konstantin Kakaes
Better brain-computer interfaces, writes Sethuraman Panchanathan