Only Weirdos Respond to Opinion Surveys. Can We Still Trust Them?

By Will Oremus
May 17, 2012, 7:58 PM

Minority Opinions

Hardly anyone responds to public opinion surveys anymore. Can we still trust them?

Polling station workers in Washington, D.C. Has public opinion polling grown less reliable?

Richard Pohle/Getty Images.

When you get a call on your cellphone from an unfamiliar number, do you answer it? If the person on the other end of the line immediately tries to assure you they’re not trying to sell you anything, do you believe them? If they tell you they’re conducting a public opinion survey that will only take a few minutes of your time, do you go ahead and start sharing your views on religion, gay marriage, the economy, and the upcoming election?


If you answered “yes” to all those questions, congratulations! You’re among the 9 percent of Americans whose opinions stand in for those of the nation as a whole in public opinion surveys.

The nonprofit Pew Research Center is one of the least biased, most reliable polling organizations in the country. When they tell you that only half of Americans believe Mormons are Christians, that one in five U.S. adults still doesn't use the Internet, and that a majority of Americans now support gay-marriage rights, you'd like to think you can trust them.

Recently, though, Pew decided to turn the spotlight on the reliability of its own research. What it found was alarming: Fewer than one in 10 Americans contacted for a typical Pew survey bothers to respond. That’s down from a response rate of 36 percent just 15 years ago.

Why should we care? Two of the top stories in the presidential campaign at the moment are Mitt Romney’s “woman problem” and Barack Obama’s move to support gay marriage. But do women really dislike Romney? And was Obama boldly taking an unpopular stand, or capitalizing on a shift in public attitudes? A new CBS/New York Times poll has raised eyebrows (and the Obama Administration’s ire) by suggesting that Romney actually leads among women—and that most people still oppose gay marriage.

If such polls aren’t reaching a representative subset of the populace, it’s hard to know what to believe. (Disclosure: Slate has teamed up with SurveyMonkey for a series of monthly political surveys. These surveys are different from Pew’s polls in that they’re intended to provide a snapshot of the electorate, not a scientific reading of the nation’s preferences.) This isn’t just a Pew problem. Response rates to telephone surveys—which since the late 1980s have been the standard for polls attempting to reach representative samples of Americans—have been sliding ever since people started ditching their landlines. For many possible reasons—including mobile phones’ prominent caller ID displays and the “vibrate” option—far fewer people these days pick up when a stranger calls at 8 p.m. on a weeknight. Those who do answer their cellphones are often teens too young to be eligible for the polls. And when they do pick up, they’re less likely to hand the phone off to an adult in the household.

Survey outfits’ initial response to the cord-cutting trend in the early 2000s was to ignore it. But the response rates of even those who still have landlines have also dropped off of late. And besides, it soon became clear that calling only landlines created serious problems with their data. Landline surveys, for example, reach more Republicans than Democrats. Given that polls are often judged on their resemblance to actual election results, such findings gave organizations plenty of incentive to bring cellphones into the mix, despite the added hassle and expense. The best pollsters now carefully weight their calls between landline and mobile phones to match their prevalence in the population as a whole. (Though there’s no public cellphone directory, wireless providers make their lists of active numbers available to polling organizations.)

But that hasn’t fully solved the survey nonresponsiveness problem. To understand why, consider the difference between these two statements:

1) One in five U.S. adults doesn’t use the Internet.

2) Of the 9 percent of U.S. adults who respond to telephone opinion surveys, one in five doesn’t use the Internet.

The second sounds less definitive, right? But how much less definitive? We don’t know. And that’s the root of the problem.
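To put rough numbers on that fuzziness, here is a back-of-the-envelope, worst-case calculation of my own, ignoring sampling error and taking the 9 percent response rate at face value. On their own, the respondents pin down very little:

```python
# Worst-case arithmetic for the "one in five is offline" figure (illustration only,
# ignoring sampling error and treating the 9 percent response rate as exact).

response_rate = 0.09              # share of contacted adults who complete the survey
offline_among_respondents = 0.20  # "one in five doesn't use the Internet"

# If every silent nonrespondent happens to be online:
lower_bound = response_rate * offline_among_respondents
# If every silent nonrespondent happens to be offline:
upper_bound = response_rate * offline_among_respondents + (1 - response_rate)

print(f"Without assumptions about nonrespondents, the true offline share could be "
      f"anywhere from {lower_bound:.1%} to {upper_bound:.1%}.")
```

In practice, pollsters close that gap by assuming that, after weighting, the people who never answer look like the people who do. Whether that assumption still holds at a 9 percent response rate is the question Pew is raising about its own research.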

A lower response rate, on its own, doesn’t necessarily imply flawed results. In a widely cited 2006 paper, University of Michigan professor Robert Groves—now director of the U.S. Census Bureau—explained how efforts to increase response rates can actually lead to less reliable data. Groves cited a 1998 study in which exit pollsters offered some voters a free pen if they participated. That increased the response rate, but for some reason, Democrats were more enticed by the pens than Republicans, skewing the results.
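To see how a higher response rate can go hand in hand with worse data, consider a toy calculation in the spirit of that pen study. The numbers below are invented for illustration, not the actual 1998 results:

```python
# Toy version of the pen-incentive effect with invented numbers
# (not the actual 1998 exit-poll data Groves cited).

true_split = {"Democrat": 0.50, "Republican": 0.50}  # assume an even electorate

no_pen = {"Democrat": 0.30, "Republican": 0.30}    # equal response rates: unbiased sample
with_pen = {"Democrat": 0.45, "Republican": 0.35}  # pens lift response, but unevenly

def estimated_dem_share(response_rates):
    """Democratic share among those who actually fill out the exit poll."""
    dem = true_split["Democrat"] * response_rates["Democrat"]
    rep = true_split["Republican"] * response_rates["Republican"]
    return dem / (dem + rep)

print(f"No pens:   {estimated_dem_share(no_pen):.1%} Democratic (matches the true 50%)")
print(f"Free pens: {estimated_dem_share(with_pen):.1%} Democratic (higher response, more bias)")
```

The pens raise the overall response rate, yet the sample drifts away from the true 50-50 split, which is exactly the pattern Groves was warning about.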