
When America First Met the Microchip

At the time, few knew it would change the world.

The following article is adapted from Chapter 10 of Fred Kaplan’s new book, 1959: The Year Everything Changed. To learn how he came to write about that year, see “1959: I swear, it really is the year everything changed.”

On March 24, 1959, at the Institute of Radio Engineers’ annual trade show in the New York Coliseum, Texas Instruments, one of the nation’s leading electronics firms, introduced a new device that would change the world as profoundly as any invention of the 20th century—the solid integrated circuit, or, as it came to be called, the microchip.

Without the chip, the commonplace conveniences of modern life—personal computers, the Internet, anything involving digital technology and displays, even something as simple as the handheld calculator—would be the stuff of science fiction.


But few detected the invention’s significance at the time. It was an era of wondrous technological advances—rockets, jet passenger planes, computers, and seemingly magical pills that altered human chemistry. Who could tell whether some new gizmo—one of dozens in development—would be a transformation or a fizzle?

A story in the New York Times about that year’s trade show highlighted three new inventions on display. The integrated circuit was one of them, but it was mentioned last and took up just two paragraphs. The bulk of the story was devoted to a radar system designed by Westinghouse that would let motorists drive coast to coast with their hands off the steering wheel. Thin foil strips, coded in dots and dashes, would line the nation’s highways. Transmitter-receivers, placed on every car’s front bumper, would decode the strips, signaling the steering wheel to go straight or to turn. On paper, at a time when the interstate highway system was still in the early stages of construction, the idea seemed appealing and very futuristic. In the real world, of course, it went nowhere.

Another much-touted marvel that year was “missile mail.” On June 8, 1959, at 9:10 a.m., the submarine USS Barbero surfaced nearly 100 miles off the Florida coast and fired a Regulus I guided cruise missile toward the shore. Fitted with retractable landing gear, so it could be recovered and reused for testing, the missile touched down at the Mayport Naval Auxiliary Air Station, near Jacksonville, 21 minutes later.

Packed inside the missile’s nose cone were two small metal boxes containing 3,000 envelopes, each stamped with a logo that read: “First Official Missile Mail.”

Inside each envelope was a letter, addressed to officials from President Dwight Eisenhower on down, in which the U.S. postmaster general, Arthur Summerfield, hailed the achievement as “an historic milestone” in speeding “communications between the peoples of the earth.” Speaking to reporters after the flight, he proclaimed, “I believe we will see missile mail developed to a significant degree before man has reached the moon.”

The dream turned out to be no more than that. The Navy wasn’t interested in the project as anything more than a one-time PR gimmick. The costs were too high, the benefits too meager.

The microchip might have gone the same way. Its inventor, Jack St. Clair Kilby, and his boss, Texas Instruments’ president, Patrick Haggerty, understood its significance. At a press conference, held in the New York Athletic Club on Central Park South, around the corner from the Coliseum, Haggerty said the device’s greatest potential lay in the rapidly growing fields of computers, rockets, missiles, satellites, and space-vehicle instrumentation, where weight, size, and reliability were critical. But he added, with remarkable prescience, that it might also revolutionize telephones, televisions, radios, radar, hearing aids, medical instruments—anything and everything involving automation.

Still, in the beginning, the chips were very expensive. To make a dent in the marketplace, they would have to be much cheaper; but to be much cheaper, they would have to make a big dent in the marketplace—there would have to be high demand so that they could be produced in mass quantity.

That wouldn’t happen until the beginning of the ’60s, when President John F. Kennedy ordered production of the Minuteman II missile—which required tiny, reliable circuits for its guidance system—and, especially, when he declared his goal of landing a man on the moon by the end of the decade.

It was government that created the large demand that facilitated mass production of the microchip. (This isn’t a universal principle. The birth-control pill, another wonder of 1959, was financed entirely by private philanthropists—feminist crusaders Margaret Sanger and Katharine McCormick—and, when it hit the market, popular demand was instant and enormous.) In 1961, a single chip cost $32. By 1971, thanks to the economies of large-scale production, the cost had plunged to $1.25. By 2000, after the consumer market had vastly expanded, the price of a much more powerful chip would be less than a nickel.

As with many of the breakthroughs converging on the eve of the ’60s, the space race and the arms race—which held out the twin prospects of infinite expansion and instant annihilation—spurred America and the world into a lightning-flash new era.