The United States of Inequality

Did Computers Create Inequality?
What's causing America's growing income inequality?
Sept. 8 2010 11:07 PM



ENIAC computer, late 1940s or early 1950s

"What you earn," Bill Clinton said more than once when he was president, "is a function of what you can learn." That had always been true, but Clinton's point was that at the close of the 20th century it was becoming more true, because computers were transforming the marketplace. A manufacturing-based economy was giving way to a knowledge-based economy that had an upper class and a lower class but not much of a middle class.

The top was occupied by a group that Clinton's first labor secretary, Robert Reich, labeled "symbolic analysts." These were people who "simplify reality into abstract images that can be rearranged, juggled, or experimented with" using "mathematical algorithms, legal arguments, financial gimmicks, scientific principles, psychological insights," and other tools seldom acquired without a college or graduate degree. At the bottom were providers of "in-person services" like waitressing, home health care, and security. The middle, once occupied by factory workers, stenographers, and other moderately skilled laborers, was disappearing fast.

Did computerization create the Great Divergence?

Our story begins in the 1950s, at the dawn of the computer age, when Homo sapiens first began to worry that automation would bring about mass unemployment. Economic theory dating back to the 19th century said this couldn't happen, because the number of jobs isn't fixed; a new machine might eliminate jobs in one part of the economy, but it would also create jobs in another part. For example, someone had to be employed to make these new machines. But as the economists Frank Levy of MIT and Richard J. Murnane of Harvard have noted, computers represented an entirely different sort of new machine. Previously, technology had performed physical tasks. (Think of John Henry's nemesis, the steam-powered hammer.) Computers were designed to perform cognitive tasks. (Think of Garry Kasparov's nemesis, IBM's Deep Blue.) Theoretically, there was no limit to the kinds of work computers might eventually perform. In 1964 several eminent Americans, including past and future Nobel laureates Linus Pauling and Gunnar Myrdal, wrote President Lyndon Johnson to warn him about "a system of almost unlimited productive capacity which requires progressively less human labor."

Such a dystopia may yet one day emerge. But thus far traditional economic theory is holding up reasonably well. Computers are eliminating jobs, but they're also creating jobs. The trouble, Levy and Murnane argue, is that the kinds of jobs computers tend to eliminate are those that require some thinking but not a lot—precisely the niche previously occupied by moderately skilled middle-class laborers.


Consider the sad tale of the bank teller. When is the last time you saw one? In the 1970s, the number of bank tellers grew by more than 85 percent. It was one of the nation's fastest-growing occupations, and it required only a high school degree. In 1970, bank tellers averaged about $90 a week, which in 2010 dollars translates into an annual wage of about $26,000. But over the last 30 years, people pretty much stopped stepping into the lobby of their bank; instead, they started using the automatic teller machine outside and eventually learned to manage their accounts from their personal computers or mobile phones.

Today, the job category "bank teller" is one of the nation's slowest-growing occupations. The Bureau of Labor Statistics projects a paltry 6 percent growth rate during the next decade. The job now pays slightly less than it did in 1970, averaging about $25,000 a year.

As this story plays out in similar occupations—cashiers, typists, welders, farmers, appliance repairmen (this last already so obsolete that no one bothers to substitute a plausible ungendered noun)—the moderately skilled workforce is hollowing out. This trend isn't unique to the United States. The Japanese have a word for it: kudoka. David Autor, an MIT economist, calls it "job polarization," and he has demonstrated that it's happening to roughly the same extent within the European Union as it is in the United States. But Autor readily concedes that computer-driven job polarization can't possibly explain the entire trend toward income inequality in the United States, because income inequality is much greater in the United States than it is in Europe.

Another problem that arises when you try to attribute the income-inequality trend to computers is that the Great Divergence began in the late 1970s (see Figure 2), well before most people had ever seen a personal computer. By the late 1990s, as businesses stampeded to the Internet, inequality slackened a bit. If computers were the only factor driving inequality, or even the main factor, the opposite should have happened. A final problem is that the income premium for college or graduate-level education gradually slackens off at higher incomes, even as income inequality intensifies. If computers required ever-higher levels of education to manipulate ever-growing quantities of information in ever-more rococo ways, then we'd expect the very richest people to be the biggest nerds. They aren't.

Here, then, is a dilemma. We know that computers put a premium on more highly educated workers, but we can't really demonstrate that computers caused the Great Divergence. What is it that's so special about computers? Harvard economists Claudia Goldin and Lawrence Katz offer an interesting answer: Nothing!

Yes, Goldin and Katz argue, computer technology had a big impact on the economy. But that impact was no larger than that of other technologies introduced throughout the 20th century, starting in 1900 with the dynamo that Henry Adams famously swooned over at the Paris Exposition. Between 1909 and 1929, Katz and Goldin report in their 2008 book, The Race Between Education and Technology, the percentage of manufacturing horsepower acquired through the purchase of electricity rose sixfold. From 1917 to 1930, the proportion of U.S. homes with electricity increased from 24 percent to 80 percent. By contrast, from 1984 to 2003, the proportion of U.S. workers using computers increased from 25 percent to 57 percent. Computer use has spread quickly, but not as quickly as electric power did during the early part of the 20th century. "Skill-biased technological change is not new," Katz and Goldin wrote in a 2009 paper, "and it did not greatly accelerate toward the end of the twentieth century."

Contemporary culture is so fixated on the computer revolution that the very word "technology" has become an informal synonym for "computers." But before computers we witnessed technological revolutions brought on by the advent of the automobile, the airplane, radio, television, the washing machine, the Xerox machine, and too many other devices to name. Most of these earlier inventions had much the same effect as the computer—that is, they increased demand for progressively higher-skilled workers. But (with the possible exception of radio) none of these consumer innovations coincided with an increase in inequality. Why not? Katz and Goldin have a persuasive answer that we'll consider later in this series.


Correction, Sept. 9, 2010: An earlier version of this story misstated Kasparov's first name as "Boris."
