Jan. 11 2008 3:41 AM

Bad Calls

Lessons of the New Hampshire polling fiasco.

Illustration by Robert Neubecker.

Based on what you've heard this week about the performance of pollsters in New Hampshire, are you somewhat likely, quite likely, or very likely to ignore political surveys for the foreseeable future?

William Saletan

Will Saletan writes about politics, science, technology, and other stuff for Slate. He’s the author of Bearing Right.

I recommend "none of the above." You can learn a lot from polls, especially when they're wrong. The New Hampshire pollsters are full of excuses, and their excuses are full of lessons. Let's look at a few of them.


1. It's standard polling error. The polls' "margin of sampling error … made an eight-point error in either direction possible," argues Janet Elder, editor of news surveys for the New York Times. If there were only one poll, that explanation might fly. But in this case, nine polls converged on an Obama advantage of 5 to 13 points. The probability of sampling error producing that kind of convergence is infinitesimal.
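How infinitesimal? If you want to check the math yourself, here's a rough Monte Carlo sketch in Python. The nine polls and the 5-point threshold come from the paragraph above; the per-poll sample of 600 respondents and the "true" split (set to roughly the actual result) are assumptions chosen for illustration, not anyone's actual methodology.

import numpy as np

rng = np.random.default_rng(0)
true_split = [0.39, 0.37, 0.24]  # Clinton, Obama, everyone else -- roughly the actual result
n_respondents = 600              # assumed sample size per poll
n_polls = 9
trials = 200_000

# Draw nine independent polls per trial and compute Obama's lead in each.
counts = rng.multinomial(n_respondents, true_split, size=(trials, n_polls))
obama_lead_pts = (counts[:, :, 1] - counts[:, :, 0]) / n_respondents * 100

# How often does sampling error alone put Obama up by 5 or more points in all nine polls?
# Expect the share to come out as zero even over 200,000 trials -- that's the point.
converged = (obama_lead_pts >= 5).all(axis=1)
print(f"Share of trials where all nine polls show Obama +5 or better: {converged.mean():.1e}")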

2. Privately, we had it right. "My polling showed Clinton doing well on the late Sunday night and all day Monday—she was in a 2-point race in that portion of the polling," pleads pollster John Zogby. "But since our methods call for a three-day rolling average, we had to legitimately factor the huge Obama numbers on Friday and Saturday—thus his 12 point average lead."

Zogby had Hillary pulling even? Let's check the headline on the press release he issued Tuesday morning: "Obama, McCain Enjoy Solid Leads As Election Day Dawns." Here's his first sentence: "The big momentum behind Democrat Barack Obama … continued up to the last hours before voters head to the polls." And here's the first quote from Zogby: "Obama's margin over Clinton has opened up."

Let's be real. Zogby, like most of us, expected an Obama blowout. His Sunday-Monday subsample didn't match that expectation or anyone else's polling. He decided it was too small to report. Now that it matches the election returns, he's touting it.

Lesson: Tell us your private numbers before the election, including breakdowns of your rolling sample by day. Give us your warnings about sample size, and let us do the judging.
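To see why those daily breakdowns matter, here's a minimal sketch of how a three-day rolling average can bury a late swing. The daily margins below are hypothetical, loosely patterned on Zogby's description (big Obama leads early, a two-point race Sunday and Monday); they are not his actual numbers.

# Hypothetical daily Obama-minus-Clinton margins, in points.
daily_margin = {"Friday": 18, "Saturday": 16, "Sunday": 2, "Monday": 2}

days = list(daily_margin)
last_three = days[-3:]   # the final rolling window: Saturday through Monday
last_two = days[-2:]     # what the Sunday-Monday subsample alone would show

rolling_avg = sum(daily_margin[d] for d in last_three) / len(last_three)
late_avg = sum(daily_margin[d] for d in last_two) / len(last_two)

print(f"Published three-day rolling average: Obama +{rolling_avg:.0f}")
print(f"Sunday-Monday subsample alone:       Obama +{late_avg:.0f}")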

3. We told you, sort of. The president of the American Association for Public Opinion Research points out that on Monday, "CBS News Polls cited that '28% of Democratic voters say their minds could still change.'" But that warning, which was featured high in the polling unit's six-page data report, was buried in the network's press release. "CBS Poll: Obama Leaps Ahead In N.H.," the release shouted. The first sentence said Obama had "opened up a seven-point lead" on Clinton.

Lesson: Don't oversimplify the data.

4. We misjudged the turnout. Rasmussen Reports, another firm that blew the primary, speculates that "polling models used by Rasmussen Reports and others did not account for the very high turnout." For instance, "Rasmussen Reports normally screens out people with less voting history and less interest in the race. This might have caused us to screen out some women who might not ordinarily vote in a Primary but who came out to vote due to the historic nature of Clinton's candidacy." The firm allotted 54 percent of its final weighted sample to women. In reality, women cast 57 percent of the votes.
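As a rough illustration of what that gender mix does to a topline, here's a sketch that reweights hypothetical support rates by the two female shares quoted above. The support rates themselves are made up for illustration; how much the topline moves depends entirely on the gender gap you assume.

# Hypothetical support rates by gender (illustrative only).
clinton = {"women": 0.46, "men": 0.30}
obama = {"women": 0.34, "men": 0.42}

def topline(support, female_share):
    # Weight the two gender subgroups into one statewide estimate.
    return support["women"] * female_share + support["men"] * (1 - female_share)

for female_share in (0.54, 0.57):   # Rasmussen's weighted sample vs. actual turnout
    c = topline(clinton, female_share)
    o = topline(obama, female_share)
    print(f"Women at {female_share:.0%}: Clinton {c:.1%}, Obama {o:.1%}, "
          f"Clinton margin {100 * (c - o):+.1f} pts")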
