What Will Come After the Computer Chip? The End of the "La-Z-Boy" Programming Era.

Future Tense
The Citizen's Guide to the Future
March 18, 2013, 12:31 PM

Intel co-founder Gordon Moore famously wrote about how the number of transistors on silicon chips would double roughly every two years—an observation now known as Moore’s Law. But even as Intel pushes into nanotechnology, computing is now reaching the limits of that law. On Thursday, March 21, former Intel CEO Craig R. Barrett and Arizona State University President Michael Crow will be at the Phoenix Art Museum to answer the question, “What comes after the computer chip?” Ahead of the event, which is being hosted by Zócalo Public Square, we’ll be publishing a series of blog posts in which experts weigh in. For more information, visit the Zócalo Public Square website. (Zócalo Public Square is a partnership of the New America Foundation and Arizona State University; Future Tense is a partnership of Slate, New America, and ASU.)

The important question for the end user is not what comes after the chip, but how chips can be designed and integrated with enough ingenuity that processing speed keeps improving even as physics constrains the speed and size of circuits.

Ever since John von Neumann first enunciated the architecture of the modern computer in 1945, processors and memory have gotten faster more quickly than the connection that carries data between them, leading to an ever-worsening “von Neumann bottleneck”: the link between memory and the CPU (or central processing unit).

Because chip features can no longer simply be made smaller, the only way forward is through increasing parallelism: doing many computations at once instead of, as in a classic von Neumann architecture, one at a time. (Each computation is essentially a logical operation like “AND” or “OR” executed in the correct order by hardware; this is the basis for how a computer functions.)
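
To make the contrast concrete, here is a minimal sketch in Python (the language and every name in it are our illustrative choices, not anything from the chipmakers’ toolkits): the same sum of squares computed one number at a time, von Neumann style, and then split across four cores working at once.

```python
# A minimal sketch: one computation done sequentially, then in parallel.
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    numbers = range(1_000_000)

    # Sequential: one computation at a time, as in a classic von Neumann machine.
    sequential_total = sum(square(n) for n in numbers)

    # Parallel: the same work divided among four cores running simultaneously.
    with Pool(processes=4) as pool:
        parallel_total = sum(pool.map(square, numbers, chunksize=10_000))

    assert sequential_total == parallel_total  # same answer, computed many at once
```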

Though the first multiprocessor architecture debuted in 1961, the practice didn’t become mainstream until the mid-2000s, when chip companies started placing multiple processing units, or “cores,” on the same microprocessor. Chips often have two or four cores today; within a decade, a chip could have hundreds or even thousands. A laptop or mobile device might have one chip with many cores, while supercomputers will, as they do today, be built from many such chips working in parallel, so that a single computer could have as many as a billion processors before the end of the decade, according to Peter Ungaro, the head of supercomputing company Cray.

Figuring out how best to interconnect the many cores on a single chip, and the many chips to one another, is a major challenge. So is keeping a computation moving forward when it is no longer possible to synchronize all of a chip’s processors with a signal from a central clock, as is done today. New techniques like “transactional memory” will let different processes share memory efficiently without introducing errors; a sketch of the idea follows.
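
Python has no transactional memory built in, so the following is only a toy sketch of the idea (every name in it is ours): a “transaction” takes a snapshot of shared data, computes a new value, and commits only if no other transaction has committed in the meantime; if one has, it simply retries. No programmer-managed lock ever surrounds the whole computation.

```python
import threading

class TVar:
    """A toy transactional variable: a value plus a version counter."""
    def __init__(self, value):
        self.value = value
        self.version = 0
        self._lock = threading.Lock()  # guards only the brief snapshot/commit steps

def atomically(tvar, update):
    """Optimistically apply update(value), retrying if another thread commits first."""
    while True:
        with tvar._lock:                      # take a consistent snapshot
            snap_value, snap_version = tvar.value, tvar.version
        new_value = update(snap_value)        # compute outside any lock
        with tvar._lock:
            if tvar.version == snap_version:  # nobody committed in the meantime
                tvar.value = new_value
                tvar.version += 1
                return new_value
        # Conflict: another transaction won the race. Loop and retry.

# Eight threads each add 1,000 to a shared counter; no update is lost.
counter = TVar(0)

def worker():
    for _ in range(1000):
        atomically(counter, lambda v: v + 1)

threads = [threading.Thread(target=worker) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter.value == 8000
```

Real transactional memory systems apply this commit-or-retry discipline to arbitrary groups of reads and writes, in hardware or in the language runtime, but the basic idea is the same.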

The overall problem is so difficult because the hardware is only as good as the software, and the software only as good as the hardware. One way around this chicken-and-egg problem will be “autotuning” systems that replace traditional compilers. A compiler translates a program written in a high-level language into one specific set of low-level instructions; an autotuner instead tries out many different possible translations of the high-level program and measures which works best, as in the sketch below.
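
A real autotuner searches over machine-level choices such as loop orders and memory layouts; this toy Python sketch (ours, with made-up function names) captures only the core loop: generate several candidate versions of the same computation, time each one on the machine at hand, and keep the fastest.

```python
import timeit

# Three candidate "translations" of one computation: summing the squares of 0..n-1.
def loop_version(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

def generator_version(n):
    return sum(i * i for i in range(n))

def closed_form_version(n):
    return (n - 1) * n * (2 * n - 1) // 6  # same answer, no loop at all

CANDIDATES = [loop_version, generator_version, closed_form_version]

def autotune(n=100_000):
    """Time every candidate on this machine and return the fastest one."""
    best_fn, best_time = None, float("inf")
    for fn in CANDIDATES:
        elapsed = min(timeit.repeat(lambda: fn(n), number=10, repeat=3))
        if elapsed < best_time:
            best_fn, best_time = fn, elapsed
    return best_fn

print("Fastest variant on this machine:", autotune().__name__)
```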

Autotuning and transactional memory are just two of many new techniques being developed by computer scientists to take advantage of parallelism. There is no question the new techniques are harder for programmers. One group at Berkeley calls it the end of the “La-Z-Boy era” of sequential programming.

More answers to the question “What comes after the computer chip?”

Future Tense is a partnership of Slate, New America, and Arizona State University.

Konstantin Kakaes is a Schwartz fellow at the New America Foundation and the author of the e-book The Pioneer Detectives: Did a Distant Spacecraft Prove Einstein and Newton Wrong? Follow him on Twitter.
