McCain's Secret Polls
Why do a campaign's internal numbers look so different from the public data?
Most public polls find Barack Obama with double-digit leads in Iowa and Pennsylvania, but members of the McCain campaign cite internal numbers showing a tight race. A campaign might have its own reasons for releasing numbers that favor its cause, but is there any other reason why internal polls might differ from the ones produced for the public?
Not really. In general, media organizations and private campaign pollsters compile their numbers in the same way. But there are a few key differences. First, a newspaper or TV station might be more likely to find its respondents with random-digit dialing—calling any phone number that works and then asking whoever picks up whether he or she is registered to vote. Campaign pollsters often save time by pulling their samples from a list of all registered voters. In theory, the pollsters who use the voter files run the risk of missing voters whose information isn't up-to-date, but a study by two Yale professors (PDF) using data from the 2002 elections suggested that these samples were actually a little more accurate than those collected via random phone calls. Still, the difference between the two methods probably wouldn't affect the outcome all that much, particularly given all the challenges in figuring out who counts as a "likely" voter, anyway.
Campaign polls may differ more in their specific focus. At the state level, public polls tend to look primarily at the "top-line" numbers—which candidate is winning overall. An internal poll may ask more questions about voters' demographics, their political leanings, and how they feel about the issues or the candidates. So even if the top-line numbers suggest that a candidate is losing, a campaign pollster could find data that suggest a shift in the race is imminent. A memo released by McCain's pollster Bill McInturff this week falls into that category: McInturff never mentions national head-to-head numbers but, instead, argues that McCain is gaining support among rural voters, non-college-educated men, and "Wal-Mart women." (Internal polls sometimes go on to test a campaign message—for example, by giving a series of statements about the candidates and seeing how voters react. A pollster following ethical standards is obligated to say so if those messages skew the results.)
Even if public polls and internal polls were conducted in exactly the same way, the results we hear about might still be different for a simple reason: Campaigns like to release only good news. Given that there will be a certain amount of random noise from poll to poll, the same methodology could produce several polls with different outcomes. In that case, a media organization would release all the numbers, while the campaign pollster might leak only the data that show his candidate winning. Two different analyses (PDF) using data from the early 2000s found a bias of a few percentage points among partisan polls that had been released to the public—although it's worth noting that outside pollsters can also have a so-called "house effect," favoring one party or another throughout a given year.
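The selective-release effect described above is easy to see with a toy simulation. This is only an illustrative sketch with made-up numbers (a hypothetical 4-point deficit and 3 points of poll-to-poll noise), not data from any actual race: averaging every poll recovers the true margin, while averaging only the "good" polls a campaign chooses to leak produces a rosier figure.

```python
import random

random.seed(0)

TRUE_MARGIN = -4.0   # hypothetical: the candidate trails by 4 points
NOISE_SD = 3.0       # assumed poll-to-poll sampling noise, in points
N_POLLS = 10_000

# Simulate many polls of the same race with identical methodology.
polls = [random.gauss(TRUE_MARGIN, NOISE_SD) for _ in range(N_POLLS)]

# A media organization reports every poll it conducts...
media_average = sum(polls) / len(polls)

# ...while a campaign leaks only polls showing a near-tie or better
# (here, an arbitrary cutoff of a 1-point deficit).
leaked = [p for p in polls if p > -1.0]
leaked_average = sum(leaked) / len(leaked)

print(f"true margin:        {TRUE_MARGIN:+.1f}")
print(f"all polls average:  {media_average:+.1f}")
print(f"leaked polls avg:   {leaked_average:+.1f}")
```

With these assumed numbers, the leaked-only average runs several points friendlier to the candidate than the full average, even though every individual poll was conducted honestly.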
Got a question about today's news? Ask the Explainer.
Explainer thanks Whit Ayres of Ayres, McHenry & Associates Inc., Charles Franklin of the University of Wisconsin, Adam Geller of National Research Inc., Alex Lundry of TargetPoint Consulting, Thomas Riehle of RT Strategies Inc., Michael Traugott of the University of Michigan, and Doug Usher of Widmeyer Communications.
Jacob Leibenluft is a writer from Washington, D.C.