The End of Moore's Law
Microchips are getting smaller—and that's the problem.
Until recently, Moore's Law, the observation that the number of transistors on a microchip doubles every 18 months to two years, seemed a self-fulfilling prophecy. When Intel co-founder Gordon Moore issued his famous prediction 40 years ago, a chip could hold a few dozen transistors. Today, Intel can cram almost 1 billion transistors, each of which is less than 100 nanometers in size, on a single microchip. (One nanometer is 1 millionth of a millimeter—the equivalent of about 10 hydrogen atoms.) The transistors on Intel's chips are so tiny that they're not visible to the naked eye.
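The arithmetic behind that growth is easy to check. A quick back-of-envelope sketch, assuming an illustrative starting count of 50 transistors (a "few dozen") and the article's two doubling rates, shows that 40 years of doubling comfortably brackets the 1-billion figure:

```python
# Back-of-envelope check of Moore's Law. The starting count is an
# assumption ("a few dozen" transistors on a 1965-era chip).
start_transistors = 50
years = 40  # from Moore's 1965 prediction to this article

for months_per_doubling in (18, 24):
    doublings = years * 12 / months_per_doubling
    projected = start_transistors * 2 ** doublings
    print(f"{months_per_doubling}-month doubling: ~{projected:.1e} transistors")
```

A 24-month doubling period projects roughly 50 million transistors, while an 18-month period projects several billion—so the observed count of almost 1 billion sits squarely inside the range the law predicts.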
Given the state of today's technology, chips can only get so small. When Intel churned out a 90-nanometer chip called "Prescott" last year, it went from pushing the boundaries of miniaturization to the realm of nanotechnology. Unfortunately for the chipmakers, this level of shrinkage has side effects. Not only was Prescott slower than its predecessor, but it also generated more heat—the mortal enemy of laptop motherboards. The smaller the chips, the hotter they run. The heat created by so many transistors stuffed onto a tiny sliver of silicon has pushed the copper interconnects that wire them together to their thermal limits. When they overheat, the interconnects can fail.
Since it has hit this "thermal wall," Intel can no longer simply shrink its chips to increase speed and computing power. And it isn't alone. The International Technology Roadmap for Semiconductors, which reflects the consensus of the major microchip trade organizations in the United States, Europe, and Asia, reports that the interconnect issue is a huge stumbling block. Unless chip manufacturers figure out some new techniques, the march to miniaturization could stall.
You might think, so what? Who needs a toenail-sized laptop anyway? Those of us who don't use computers to decode data from the Human Genome Project or design rocket ships have more than enough processing power at our disposal. Besides, as anyone who has used a desktop computer knows, these spiffy microchips don't often get to strut their stuff. The latest edition of Microsoft Word, for instance, probably works no faster than the version you used a decade ago.
If you are a researcher crunching DNA data or an astrophysicist charting the galaxy, however, you probably want—and need—faster, more powerful microchips. So do businesses. Consider the increasingly ubiquitous RFID tags used to track products and people in real time. All of those tags yield tremendous amounts of data that must be stored somewhere. A product on the move—trucked from the factory to a warehouse to a store shelf to the cash register—can require millions of calculations. Multiply that by the number of products in a typical Wal-Mart location and you're talking some serious computing power.
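To make that multiplication concrete, here is a minimal sketch of the back-of-envelope math. Both figures are illustrative assumptions, not numbers from the article:

```python
# Hypothetical RFID workload estimate. Both inputs are assumed values:
# "millions of calculations" per tracked item, and a rough item count
# for one large retail store.
calcs_per_product = 2_000_000
products_in_store = 100_000

total_calcs = calcs_per_product * products_in_store
print(f"~{total_calcs:.1e} calculations to track one store's inventory")
```

Even with these conservative guesses, a single store's inventory implies on the order of hundreds of billions of calculations—and a retailer runs thousands of stores.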
Intel CEO Craig Barrett has said, somewhat self-servingly, that microchips are "the most complicated things that human beings have ever built." Let's assume that, at the very least, they're really, really complicated. Now, consider having to fabricate chips using completely unproven manufacturing methods. Retooling for nanotechnology will cost the industry billions, and there is no guarantee it will pan out.
Perhaps the most promising research involves carbon nanotubes, which are 20 times stronger than steel and conduct electricity 1,000 times better than copper. The nanotubes also don't suffer from copper's heat-leakage problems. But working with them is exceedingly difficult because—you guessed it—they are so damn small. The primary strategy these days is to grow nanotubes directly on a chip by flowing hydrocarbon gas over a catalyst such as iron. The problem with this "direct growth" technique, though, is that it requires extreme heat, on the order of 800 degrees Celsius. Those kinds of temperatures would make mass production a challenging and potentially dangerous proposition.
Another technique, spearheaded by a Duke University chemist, involves placing nanotubes in a solution and basically gluing them to the microchip. A third, practiced by engineers from Cambridge University and Samsung, uses lithography to seed silicon wafers and create "nanoelectromechanical system switches." Finally, Nantero, a Massachusetts-based startup, has created a prototype consisting of carbon cylinders that lie on ridges "like planks on sawhorses." Electrical pulses can bend these tubes into positions that represent the zeros and ones of digital data. The company promises that billions of bits of data could be stored on a chip the size of a dress-shirt button, allowing battery-thrifty portable devices to play music and video for days on end and PCs to boot up without delay.
Will any of this stuff work? Your guess is as good as mine.
In the meantime, the semiconductor industry is hedging its bets. The latest trend in the microchip biz is the "dual-core" processor—two separate units with two memory caches melded into one piece of silicon. In a lot of ways, this is preferable to the faster, smaller chip designs of the recent past. Instead of pushing the speed limit on each single processor, chips with two, four, or even eight cores will ensure that no single processor has to work too hard or fast. For one thing, that helps put a chill on the excess heat problem. It also increases the chip's throughput: One core can transmit instructions while another crunches numbers and a third accesses memory. This is not unlike Google networking 170,000 computers together to process search requests instead of relying on one or two supercomputers.
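The throughput argument above—several cores each handling an independent chunk of work instead of one processor racing through everything—can be sketched in a few lines. This is a toy illustration using Python's standard library; the pool size and task are assumptions chosen for demonstration:

```python
# A minimal sketch of multi-core throughput: independent CPU-bound tasks
# run side by side in a pool of worker processes instead of queuing
# behind a single processor.
from multiprocessing import Pool

def crunch(n):
    # Stand-in for a CPU-bound job, e.g. one core "crunching numbers."
    return sum(range(n))

if __name__ == "__main__":
    # Four workers play the role of four cores sharing eight tasks.
    with Pool(processes=4) as pool:
        results = pool.map(crunch, [10_000] * 8)
    print(len(results), results[0])
```

The speedup comes from the same principle the article credits to Google: many modest processors working in parallel can outpace one very fast one, without any single unit running hot.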