There’s a disturbance in the force of the U.S. economy. An airline canceled flights because it couldn’t find enough pilots to steer them. Despite high demand, homebuilders in Colorado are throttling back activity because they can’t find the workers to erect frames. Farmers in Alabama are fretting that crops may rot in the ground for a lack of workers to bring in the harvest.
This state of affairs has left analysts stumped. For nearly a decade, the Federal Reserve has kept interest rates at extraordinarily low levels in order to stimulate growth and, with it, rising demand, inflation, and ultimately higher wages. But the higher wages have been stubbornly slow to materialize. Americans are working, but they're not making more; adjusted for inflation, that really means they're making less. Average hourly earnings have risen just 2.5 percent in the past 12 months.
By many measures, workers are in a good position to command higher pay: a record 82 straight months of jobs growth, an unemployment rate of 4.4 percent, and a record 146.4 million Americans with payroll jobs. There are a whopping 5.7 million job openings (well over twice the level of eight years ago). Meanwhile, baby boomers are aging out of the workforce at a rapid clip, and Mexicans, many of whom crossed the border to work, have been leaving the U.S. for years. The demand for workers is high.
Given these conditions, wages should be rising sharply. But look at this chart from the Atlanta Federal Reserve: They haven't been, and they're not. Every month when the government releases its latest employment data, newspapers call up small businesses and large companies, usually in the Midwest or Sun Belt, who testify to their frustration. Last week, the New York Times featured a Columbus, Ohio, cleaning company owner mystified that he couldn't find applicants for his $9.25-per-hour jobs ("I sometimes wish there was actually a higher unemployment rate," he said) and a Nebraska roofer who couldn't figure out why nobody applied for the $17-an-hour jobs she was offering. "The pay is fair," she said.
Actually, if not a single person applies for your job, the pay probably isn't fair. But that's where America remains stubbornly stuck: Employers won't pay enough, and workers either won't or can't demand more. There are likely a lot of reasons, but the biggest, or at least the most fixable, may be psychological: From an economic perspective, both sides of the hiring market should have the power to increase overall wages in the current climate, but neither is exercising it.
It's easy to laugh, of course, at the cluelessness of employers who are supposed to understand the fundamentals of supply and demand. But their inability to fill jobs is really bad economic news for several reasons. Every unfilled position is a personal tragedy: Imagine what your life would be like if someone in your household lost (or couldn't find) a payroll job, or hadn't received a raise in eight years, even as your family's spending power had shrunk and its costs had grown. And unfilled jobs are likewise a serious economic issue. Americans tend to spend most of what they make: on consumer goods, rent and mortgage payments, cars, investments, and so on. If just 1 million of those 5.7 million job openings were filled at a median pay of $47,000, that would mean an extra $47 billion annually moving from corporate balance sheets into the wallets of Americans. And from there, quickly into the economy.
Why isn’t the labor market functioning the way we would expect? Why aren’t employers bidding more aggressively to fill open positions? And why are American workers, at a time of low unemployment and high job openings, settling for the crappy or nonexistent raises they’re getting?
There could be a skills gap, in which the workers out there simply don't have the training necessary to fill the open jobs. Or it could be that, as Binyamin Appelbaum of the New York Times ventured on Twitter, "a lot of American businesses have lost the muscle memory of how to compete for workers." That is to say, they have simply forgotten the words to use, and the tools to deploy, when workers aren't lining up in droves to fill their positions.
But these aren’t really reasons. They’re symptoms of something deeper at work. The stock market crash of 1929 and the Great Depression left deep scars and influenced consumer and investor behavior for decades. Americans remained risk-averse and shied away from stocks for whole generations after the breadlines and Hoovervilles had faded into sepia-hued memory. We haven’t completely come to grips with it, but the financial panic of 2008 and the Great Recession that followed inflicted similarly deep wounds on both businesses and workers that have changed behavior and norms—and it’s those norms that are depressing wages more than anything else.
How did we get here?
We live in an age of long business cycles and rare and shallow stumbles: In the 25 years between November 1982 and December 2007, there were only two recessions, each of which lasted just eight months. But the 18-month recession of 2008 to 2009, and the remarkably destructive financial crisis that fell in the middle of it, led to near-death experiences for many companies (and real-death experiences for large chunks of the banking and auto industries).
Corporate America went into survival mode and took an axe to what, in most instances, was its largest single cost: labor. Between January 2008 and February 2010, private-sector companies slashed 8.8 million jobs. At the same time, they slashed the wages and benefits of the workers they continued to employ. This was the playbook for getting through the worst financial downturn since the Great Depression. And as the economy began to expand, companies remained parsimonious on wages and benefits, and continued to push the obligation and cost of training onto workers.
They were able to do that in part because there was immense slack in the labor market. The unemployment rate peaked at 10.0 percent in October 2009. And so for a long period of time, companies became accustomed to getting all the labor they needed at the (low) price they wanted to pay, and managed to hold onto staff despite not raising wages. This habit hardened into something like a permanent mindset, incorporated into business models and pro forma projections. In an era when overall economic growth was slowing, companies simply couldn't countenance raising wages consistently.
Businesses encountered surprisingly weak countervailing forces. Only 6.4 percent of private-sector workers are represented by unions that can bargain on their behalf, and many of those unions are in a permanent defensive crouch. While several states raised the minimum wage, the federal wage floor remained stuck at $7.25.
At the same time, a similar scarring was happening on the other side of the market, to workers at every skill level, in every profession, at every rung of the income ladder. Given the weak safety net and low level of savings, the massive job losses and long-term unemployment suffered in 2007–2009 were devastating. Foreclosures and bankruptcy filings spiked.
In the wake of these losses, the mindset changed. What became paramount to the traumatized was simply having a job with a steady paycheck, and worrying less about wages and the potential for raises. Someone out of work for a year, who had difficulty making mortgage payments as a result, is more likely to take the first job he can get, try to hold on to it for dear life, and accept poorer wages, because the alternative, having no job at all, is too painful to contemplate. Eight years after the recession, Americans are quitting jobs at a relatively low rate despite the apparent abundance of positions. Out of fear or an abundance of caution, they are sticking it out where they are, even if the pay is worse.
This amounts to a mutually reinforcing feedback loop. Companies are psychologically and emotionally geared not to raise wages as a matter of course. And many people who work are reluctant to aggressively ask for higher wages, or to quit and seek a better opportunity.
What can be done about this? In theory, time heals all wounds. People are now entering the workforce, and making hiring decisions, who didn't suffer through the 2007–2009 crisis (though their older siblings and parents certainly did). After several years of being unable to fill positions at the wages they offer, companies are starting to become slightly more willing to loosen the purse strings.
But we desperately need new norms surrounding pay. Policy can certainly help: The $15 minimum wage movement we're seeing in many cities and states will push some employers to raise wages. It would be more effective if more states and the federal government followed the lead of New Jersey and Alaska, which index the minimum wage to rise with inflation. That way, businesses at the lower end of the wage scale know that they have to expect wage increases every year.
But what's really needed, for Americans' sake, is for businesses to stress-test and question their own assumptions. Is it realistic to assume that wages will lag inflation for a decade? Is it reasonable to expect that you can fill open posts, or retain workers, amid high employment without offering wages that are above market? If the answer to both those questions is "no," and it is, then you need to redesign your business model and working assumptions.
What’s worked until now won’t work much longer. You need to pay more. You need to do it now.