To believe that the U.S. government planned or deliberately allowed the 9/11 attacks, you’d have to posit that President Bush intentionally sacrificed 3,000 Americans. To believe that explosives, not planes, brought down the buildings, you’d have to imagine an operation large enough to plant the devices without anyone getting caught. To insist that the truth remains hidden, you’d have to assume that everyone who has reviewed the attacks and the events leading up to them—the CIA, the Justice Department, the Federal Aviation Administration, the North American Aerospace Defense Command, the Federal Emergency Management Agency, scientific organizations, peer-reviewed journals, news organizations, the airlines, and local law enforcement agencies in three states—was incompetent, deceived, or part of the cover-up.
And yet, as Slate’s Jeremy Stahl points out, millions of Americans hold these beliefs. In a Zogby poll taken six years ago, only 64 percent of U.S. adults agreed that the attacks “caught US intelligence and military forces off guard.” More than 30 percent chose one of two darker conclusions: that “certain elements in the US government knew the attacks were coming but consciously let them proceed for various political, military, and economic motives,” or that these government elements “actively planned or assisted some aspects of the attacks.”
How can this be? How can so many people, in the name of skepticism, promote so many absurdities?
The answer is that people who suspect conspiracies aren’t really skeptics. Like the rest of us, they’re selective doubters. They favor a worldview, which they uncritically defend. But their worldview isn’t about God, values, freedom, or equality. It’s about the omnipotence of elites.
Conspiracy chatter was once dismissed as mental illness. But the prevalence of such belief, documented in surveys, has forced scholars to take it more seriously. Conspiracy theory psychology is becoming an empirical field with a broader mission: to understand why so many people embrace this way of interpreting history. As you’d expect, distrust turns out to be an important factor. But it’s not the kind of distrust that cultivates critical thinking.
In 1999 a research team headed by Marina Abalakina-Paap, a psychologist at New Mexico State University, published a study of U.S. college students. The students were asked whether they agreed with statements such as “Underground movements threaten the stability of American society” and “People who see conspiracies behind everything are simply imagining things.” The strongest predictor of general belief in conspiracies, the authors found, was “lack of trust.”
But the instrument the researchers used to measure “trust” was more social than intellectual. It asked the students, in various ways, whether they believed that most human beings treat others generously, fairly, and sincerely. It measured faith in people, not in propositions. “People low in trust of others are likely to believe that others are colluding against them,” the authors proposed. This sort of distrust, in other words, favors a certain kind of belief. It makes you more susceptible, not less, to claims of conspiracy.
A decade later, a study of British adults yielded similar results. Viren Swami of the University of Westminster, working with two colleagues, found that beliefs in a 9/11 conspiracy were associated with “political cynicism.” He and his collaborators concluded that “conspiracist ideas are predicted by an alienation from mainstream politics and a questioning of received truths.” But the cynicism scale used in the experiment, drawn from a 1975 survey instrument, featured propositions such as “Most politicians are really willing to be truthful to the voters” and “Almost all politicians will sell out their ideals or break their promises if it will increase their power.” It didn’t measure general wariness. It measured negative beliefs about the establishment.
The common thread between distrust and cynicism, as defined in these experiments, is a perception of bad character. More broadly, it’s a tendency to focus on intention and agency, rather than randomness or causal complexity. In extreme form, it can become paranoia. In mild form, it’s a common weakness known as the fundamental attribution error—ascribing others’ behavior to personality traits and objectives, forgetting the importance of situational factors and chance. Suspicion, imagination, and fantasy are closely related.