In the run-up to last year’s Italian elections, the country’s senate did not—I repeat: did not—pass a bill giving legislators 134 billion euros “to find a job in case of defeat.” But a satiric story along those lines spread on social media, and not everyone who passed it along understood that it was a spoof. In just one day, 36,000 people signed a petition against the alleged law. Soon it was being invoked at anti-government protests.
Their confusion caught the eye of a quintet of scholars, who were observing how a large sample of Italian Facebook users engaged with different sorts of stories: articles from the mainstream media, articles from alternative outlets, articles from political activists, and fake news crafted by satirists and trolls. In March, MIT’s Technology Review covered the researchers’ work in a piece headlined “Data Mining Reveals How Conspiracy Theories Emerge on Facebook.” The article began with the tale of that imaginary Italian bill and the people who believed it was real, wrapping up the anecdote with the line, “Welcome to the murky world of conspiracy theories.”
This was an odd way to frame the issue. The rumor involved a bill that had supposedly been passed by the legislature, not a secret plan being hatched by some invisible cabal; it was not in any meaningful sense a story about a conspiracy. The larger study was concerned with the transmission of false stories, whether or not they involve conspiracies; the word conspiracy and its variants appear only four times in the paper. Yet the Technology Review piece brushed past this distinction, then compounded the problem by generalizing rather expansively from the research. "Conspiracy theories," the writer speculated, "seem to come about by a process in which ordinary satirical commentary or obviously false content somehow jumps the credulity barrier. And that seems to happen through groups of people who deliberately expose themselves to alternative sources of news." Evidently more than one credulity barrier has been breached.
If Technology Review defined the phrase “conspiracy theory” too broadly, other outlets adopt definitions that are too narrow. In 2013, Fairleigh Dickinson University’s PublicMind Poll concluded that 63 percent of America’s registered voters “buy into at least one political conspiracy theory.” The press duly reported that exact-sounding number, though it wasn’t really accurate: What the survey actually found was that 63 percent of voters believed at least one of the four theories featured in the poll. The number who believe in “at least one” conspiracy is surely far higher.
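The arithmetic behind that last point is worth spelling out: a poll covering only four theories sets a floor, not a ceiling, on how many people believe "at least one." A quick sketch shows how fast the "at least one" figure climbs as a survey asks about more theories. The numbers here are illustrative, not the poll's own data, and the independence assumption is a deliberate simplification.

```python
# Illustrative sketch, not the Fairleigh Dickinson poll's data.
# Assumes each theory is believed independently with the same
# probability p -- an oversimplification, used only to show the trend.

def at_least_one(p: float, n: int) -> float:
    """Probability of believing at least one of n theories,
    if each is believed independently with probability p."""
    return 1 - (1 - p) ** n

# Choose p so that four theories reproduce the poll's 63 percent:
# 1 - (1 - p)**4 = 0.63  =>  p = 1 - 0.37**0.25  (about 0.22)
p = 1 - 0.37 ** 0.25

for n in (4, 10, 25):
    print(n, round(at_least_one(p, n), 2))
```

By construction the four-theory case lands at 0.63, but the same per-theory belief rate pushes the ten-theory figure above 90 percent. The real world is messier (beliefs correlate, theories differ in popularity), which is exactly why a four-item poll cannot measure belief in "at least one" conspiracy in general.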
These aren’t the only times researchers or the reporters who cover them have made this sort of mistake. For decades, psychologists and social scientists have been studying conspiracy theories and the people who believe them. They have unearthed a lot of interesting data, and they have sometimes theorized thoughtfully about the results. But they have repeatedly run into a problem: The world they’re studying is not the same size and shape as the world of conspiracy belief.
Conspiracy theories feature a wide range of masterminds. In The United States of Paranoia, my history of paranoid American folklore, I divided those conspirators into five categories. There is the Enemy Outside, an alien force based outside the community’s borders; the Enemy Within, fellow citizens who cannot be easily distinguished from friends; the Enemy Above, plotting at the top of the power structure; the Enemy Below, conspiring in the underclass; and the Benevolent Conspiracy, which isn’t an enemy at all.
Needless to say, this is hardly the only way conspiracy stories can be sorted. And in practice, those five types frequently overlap with one another: The Enemy Outside, for example, might be accused of pulling the Enemy Below’s strings, as when various prominent Americans blamed the Communist bloc for the urban riots of the ’60s. But it’s a useful typology, with plenty of historical examples of each kind.
In these studies, though, Enemy Above stories tend to be overrepresented. And that in turn can skew the results. When researchers draw conclusions about people who are especially prone to seeing conspiracies, they might actually be telling us about people prone to seeing a particular kind of conspiracy.
Sometimes this bias is stated baldly. In 2010, for example, the Rutgers sociologist Ted Goertzel wrote an article for EMBO Reports, a journal of molecular biology, that said conspiracy logic tends to “question everything the ‘establishment’—be it government or scientists—says or does.” He backed this up on the rather thin grounds that a recent pop text, The Rough Guide to Conspiracy Theories, mostly discusses theories about “political, religious, military, diplomatic or economic elites.”
But that “establishment” has conspiracy theories of its own, even if the Rough Guide overlooked them. At moments of moral panic, it is common for the government and the mainstream media to blame a folk devil—frequently cast in conspiratorial terms—for a real or alleged crisis. Examples range from the white slavery panic of a century ago, when a vast international syndicate was believed to be conscripting thousands of girls into sexual service, to the Satanism scare of the 1980s and early ’90s, when politicians, prosecutors, juries, and the press were persuaded that devil-worshipping cabals were molesting and killing children. Often the conspiracy stories believed by relatively powerless people are mirrored by conspiracy stories believed by elites. At the same time that American slaves were afraid that white doctors were plotting to kidnap and dissect them, the planter class was periodically seized by fears of slaves secretly plotting revolution. While the Populist Party was denouncing East Coast banking cabals, many wealthy Easterners were wondering whether a conspiracy was behind Populism.
With that in mind, consider the academic literature on conspiracy believers. In 1992 Goertzel surveyed 348 residents of New Jersey about 10 conspiracy theories that were circulating at the time. Seven of the 10 were Enemy Above theories, in which the government was guilty of murdering Martin Luther King, deliberately spreading AIDS, covering up UFO activity, or otherwise injuring the public interest. Two more—one where a conspiracy killed John F. Kennedy, one where Anita Hill was part of a plot against Clarence Thomas—could take either an Enemy Above form or another shape, depending on the version of the story the person surveyed believed. Only one of the 10 was definitely not an Enemy Above theory: "The Japanese are deliberately conspiring to destroy the American economy." (That one was, interestingly, one of the most popular items on the list, with 46 percent of respondents declaring it either definitely or probably true.)
This does not mean that Goertzel’s data are useless or that he didn’t produce an interesting paper. But when he writes, say, that conspiratorial beliefs are correlated with anomie and insecurity about unemployment, has he really uncovered a couple of conspiracist traits? Or has he simply been asking about conspiracy theories that people experiencing anomie and economic insecurity are more likely to believe?
Goertzel also noted, "People who believed in one conspiracy were more likely to also believe in others." This idea has become a staple of the literature: As Michael Wood, Karen Douglas, and Robbie Sutton put it in a 2012 paper for Social Psychological and Personality Science, "the most consistent finding in the work on the psychology of conspiracy theories is that belief in a particular theory is strongly predicted by belief in others—even ostensibly unrelated ones." It has become a staple of pop-science coverage too, appearing in venues ranging from Bloomberg to Newsweek.
Anecdotally speaking, it’s a plausible idea: While everyone is capable of conspiracy thinking, some people do seem more prone to it than others. But are they really more likely to embrace conspiracy theories in general, or just conspiracy theories of a certain sort?