A couple months after the turn of the millennium, Utah Sen. Bob Bennett took to the floor of the Senate and declared victory over Y2K. "The record is fairly clear that had we as a nation not focused on this issue and dealt with it, we would have had very significant problems," he said. But by then, virtually nobody was listening, and Bennett, who chaired the Senate's Special Committee on the Year 2000 Technology Problem, pretty much knew it.
The clocks had turned over on Jan. 1, 2000, with few major disruptions, and cable news chatterers quickly concluded that we'd been duped. Y2K and the years of doomsaying it had inspired now looked like a fin de siècle affectation, the sort of problem people invent when the economy's booming and they've got nothing else to worry about. After a short round of thank-yous to his staff, Bennett closed down the special committee—and with that, the federal government wound down its involvement in the millennium bug.
Almost 10 years later, it's remarkable how little we think about Y2K. Today and tomorrow, I'm going to do my best to fix that oversight. In the first half of my two-part Y2K retrospective, I'll try to evaluate whether our millennial preparations were a good idea or a huge waste. On Thursday, I'll look at the lessons Y2K provides when it comes to planning for future disasters.
How big a deal was Y2K? In the run-up to the new century, the United States spent about $100 billion combating the bug—around $9 billion by the federal government, and the rest by utility companies, banks, airlines, telecommunications firms, and just about every other corporate entity with more than a few computers. The rest of the world was no slouch, either; estimates for global Y2K-readiness spending range from about $300 billion to $500 billion.
Yet despite all that spending, the world quickly forgot about it. The Senate Committee's final report (PDF) avoids any deep inquiry into whether the money was well-spent, and no other government, private, or academic agency has since looked into the bug. It's hard to avoid the conclusion that we're all a little embarrassed about the whole thing. Just about everyone who'd been worried about Y2K before Jan. 1, 2000, slouched away in shame afterward, less interested in assessing what went right and what went wrong than in distancing themselves from a perceived boondoggle.
That's unfortunate. Our response to Y2K is remembered as an overreaction—and there's probably a good case to be made that some of what we spent wasn't necessary. But that's not the only way to look at Y2K. The computer bug reshaped the tech industry, and the rest of corporate America, in lasting ways. Y2K helped bring tech managers to greater prominence within their organizations, and it arguably sparked the boom in tech outsourcing.
What's more, it's the only recent example of something exceedingly rare in America—an occasion when we spent massive amounts of time and money to improve national infrastructure to prevent a disaster. Typically, we write checks and make plans after a catastrophe has taken place, as we did for 9/11 and Hurricane Katrina. Y2K, by contrast, was a heroic feat of logistical planning; within just a couple years, small and large companies were able to completely review and fix computer code that had been kicking around in their systems for decades. Some experts argue that the systems built in preparation for Y2K helped New York's communication infrastructure stay up during the terrorist attacks a year and a half later. The 9/11 Commission Report says that the Y2K threat spurred a round of information sharing within the government unlike any other in recent times. The last few weeks of December 1999 were "the one period in which the government as a whole seemed to be acting in concert," the commission reported. It added: "After the millennium alert, the government relaxed."
Y2K was a simple bug. Computers had long represented years as two-digit numbers as opposed to four—that is, 99 instead of 1999—and experts predicted all kinds of misfortune as the year switched to 00 and computers were forced to puzzle out what year it really was. This would seem an easy problem to fix: Just patch the software! But that meant tackling the real problem that Y2K posed for all organizations—cataloging all of the different, overlapping computer systems within a company and devising ways to fix each and every one of them. If all these systems weren't fixed in time, experts feared that utilities would shut down, the airlines would be grounded, Social Security checks would be delayed, and the IRS would lose our tax records. We also heard that we'd be screwed even if we fixed the problem: In a cover story, BusinessWeek reported that diverting resources to staving off the Y2K bug would depress the economy as much as the East Asian financial crisis.
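To see why two-digit years were so dangerous, here's a minimal sketch (my own illustration, not code from the article) of how date arithmetic on stored "YY" values breaks at the rollover, along with "windowing," one common style of fix in which a pivot year decides whether YY means 19YY or 20YY:

```python
def age_buggy(birth_yy: int, current_yy: int) -> int:
    """Legacy-style arithmetic on raw two-digit years.
    Works fine through 1999, then fails when the year becomes 00."""
    return current_yy - birth_yy

def age_windowed(birth_yy: int, current_yy: int, pivot: int = 50) -> int:
    """A "windowing" repair: two-digit years below the pivot are read
    as 20YY, the rest as 19YY. The pivot value (50 here) is an
    arbitrary choice for this example."""
    def expand(yy: int) -> int:
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(current_yy) - expand(birth_yy)

# Someone born in 1960, evaluated in the year 2000 (stored as 00):
print(age_buggy(60, 0))     # 0 - 60 = -60, a negative age
print(age_windowed(60, 0))  # 2000 - 1960 = 40, as intended
```

Windowing was only a patch, of course—it buys decades, not a permanent fix—which is one reason the real work lay in finding every system that did arithmetic like this, not in the repair itself.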
Before Y2K, IT managers were often shunted off to dark corners, and many firms had no general list of all the technology that affected their operations. "Sometimes there were hundreds of different ways that dates were stored and processed within companies," says Leon Kappelman, a professor of information systems at the University of North Texas who worked on several technical committees in preparation for Y2K.
The bug changed all that. For the first time, top executives had to defer to tech people, who were called upon to take on management duties in companies—to find all the systems vulnerable to Y2K and look for the cheapest ways to solve them. But the American tech industry—preoccupied with the billions that could be made on the emerging Web—couldn't easily satisfy the demand for programmers. The economists Devashish Mitra and Priya Ranjan argue that the search for cheap coders led American firms to India, which had legions of programmers who'd long been trying to get a foot in the American economy. Indian outsourcing firms—including Infosys, Wipro, and TCS—booked billions in business from American companies looking to fix their Y2K woes.
But here's the interesting part: After Y2K was over, American companies retained their taste for Indian programmers. "Outsourcing kept increasing well after the Y2K problem became a thing of the past," Mitra and Ranjan write. In this way, Y2K has parallels to the oil shortages of the 1970s, which helped popularize Japanese cars—a classic example of a temporary economic shock that produces a permanent change. Mitra and Ranjan, like many economists, are in favor of outsourcing and see Y2K as increasing the net benefit to American firms; people wary of the rise of the Indian IT industry (like many American programmers who are understandably worried about their job security) might feel otherwise.
Was it all worth it? All these years later, it's difficult to determine if every single cent spent on Y2K was really necessary, especially considering that the government has been so reluctant to go back and look. "Sure, things were replaced that didn't need to be replaced, and probably some people used it as an opportunity to upgrade systems when the better option would have been to repair what they had," Kappelman says. Still, he estimates that 80 percent to 90 percent of the money spent on Y2K was right on target. Who's to say whether that guess is accurate?
Y2K detractors often argue that other countries spent far less to prevent disaster and didn't see any more problems than we did. Aside from a few isolated reports of power outages and dead phone lines, there were few major problems anywhere in the world. Doesn't that just mean we'd have been OK if we'd simply done nothing to prepare? Not really. First, America's tech infrastructure was bigger and more complex than that of other countries—in other words, there was a much greater chance that something could have gone catastrophically wrong here. American utility companies spent hundreds of millions working on the problem, and in testimony before the Senate, several reported that the measures they put in place prevented widespread failures. What's more, it isn't even really true that other countries skimped on Y2K; the Senate committee cites experts who noted that per capita, the United Kingdom, Canada, Denmark, and the Netherlands spent about as much combating Y2K as the United States did.
Indeed, looking back at the record, this remains one of the most interesting facts about Y2K—the whole world worked together to prevent an expensive problem. When people first became aware of the computer bug in the early 1990s, Y2K was easy to dismiss—it was a far-off threat whose importance was a matter of dispute, and which would clearly cost a lot to fix. Many of our thorniest problems share these features: global warming, health care policy, the federal budget, disaster preparedness. So what made Y2K different? How did we manage to do something about it, and can we replicate that success for other potential catastrophes? For answers, stay tuned for Part 2.
Farhad Manjoo is Slate's technology columnist and the author of True Enough: Learning To Live in a Post-Fact Society. You can email him at email@example.com and follow him on Twitter.