Human Nature

Liberal Interpretation

Rigging a study to make conservatives look stupid.

Are liberals smarter than conservatives?

It looks that way, according to a study published this week in Nature Neuroscience. In a rapid response test—you press a button if you’re given one signal, but not if you’re given a different signal—the authors found that conservatives were “more likely to make errors of commission,” whereas “stronger liberalism was correlated with greater accuracy.” They concluded that “a more conservative orientation is related to greater persistence in a habitual response pattern, despite signals that this response pattern should change.”

Does this mean liberal brains are fitter? Apparently. “Liberals are more responsive to informational complexity, ambiguity and novelty,” the authors wrote. New York University, which helped fund the study, concluded, “Liberals are more likely than are conservatives to respond to cues signaling the need to change habitual responses.” The study’s lead author, NYU professor David Amodio, told London’s Daily Telegraph that “liberals tended to be more sensitive and responsive to information that might conflict with their habitual way of thinking.”

Habitual way of thinking. Informational complexity. Need to change. Those are sweeping terms. They imply that conservatives, on average, are adaptively weaker at thinking, not just button-pushing. And that implication has permeated the press. The Los Angeles Times told readers that the study “suggests that liberals are more adaptable than conservatives” and “might be better judges of the facts.” Agence France-Presse reported that conservatives in the study “were less flexible, refusing to deviate from old habits ‘despite signals that this … should be changed.’ ” The Guardian asserted, “Scientists have found that the brains of people calling themselves liberals are more able to handle conflicting and unexpected information.”

These reports convey four interwoven claims. First, conservatives cling more inflexibly to old ways of thinking. Second, they’re less responsive to information. Third, they’re more obtuse to complexity and ambiguity. Fourth, they’re less likely to change when the evidence says they should.

Let’s take the claims one by one.

1. Habitual ways of thinking. Here’s what the experiment actually entailed, according to the authors’ supplementary document:

[E]ither the letter “M” or “W” was presented in the center of a computer monitor screen. … Half of the participants were instructed to make a “Go” response when they saw “M” but to make no response when they saw “W”; the remaining participants completed a version in which “W” was the Go stimulus and “M” was the No–Go stimulus. … Responses were registered on a computer keyboard placed in the participants’ laps. … Participants received a two-minute break halfway through the task, which took approximately 15 minutes to complete.

Fifteen minutes is a habit? Tapping a keyboard is a way of thinking? Come on. You can make a case for conservative inflexibility, but not with this study.

2. Responsiveness to information. Again, let’s consult the supplementary document:

Each trial began with a fixation point, presented for 500 ms. The target then appeared for 100 ms, followed by a blank screen. Participants were instructed to respond within 500 ms of target onset. A “Too slow!” warning message appeared after responses that exceeded this deadline, and “Incorrect” feedback was given after erroneous responses.

An “ms”—millisecond—is one-thousandth of a second. That means participants had one-tenth of a second to look at the letter and another four-tenths of a second to hit the button. One letter, one-tenth of a second. This is “information”?
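To make the arithmetic concrete, here is a minimal sketch, in Python, of how a single trial would be scored under the timing quoted above. The constants simply record the supplementary document’s numbers; the function and variable names are my own illustration, not the authors’ code.

FIXATION_MS = 500   # fixation point shown for 500 ms before each letter
TARGET_MS = 100     # the "M" or "W" is visible for only 100 ms
DEADLINE_MS = 500   # responses must arrive within 500 ms of target onset
# (FIXATION_MS and TARGET_MS are listed only to record the quoted timing;
# scoring depends on the deadline alone.)

def score_trial(target, go_letter, response_latency_ms):
    """Classify one Go/No-Go trial.

    target: the letter shown ("M" or "W").
    go_letter: the letter this participant was told to press for.
    response_latency_ms: milliseconds from target onset to the button press,
        or None if the participant withheld a response.
    """
    should_press = (target == go_letter)
    if response_latency_ms is None:
        # No press: correct on a No-Go trial, a missed response on a Go trial.
        return "correct" if not should_press else "miss"
    if response_latency_ms > DEADLINE_MS:
        return "Too slow!"
    # A press within the deadline: correct on Go, an error of commission on No-Go.
    return "correct" if should_press else "Incorrect"

# A participant whose Go letter is "M" presses 420 ms after a "W" appears:
# the error of commission the study counts.
print(score_trial("W", "M", 420))   # prints "Incorrect"

That is the entire decision a participant gets to make, letter by letter, half-second by half-second.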

3. Complexity and ambiguity. Go back and look at the first word of the excerpt from the supplementary document. The word is either. Participants were shown an M or a W. No complexity, no ambiguity. You could argue that showing them a series of M’s and then surprising them with a W injects some complexity and ambiguity. But that complexity is crushed by the simplicity of the letter choice and the split-second deadline. As Amodio explained to the Sacramento Bee, “It’s too quick for you to think consciously about what you’re doing.” So, why did he impose such a brutal deadline? “It needs to be hard enough that people make a lot of errors,” he argued, since—in the Bee’s paraphrase of his remarks—“the errors are the most interesting thing to study.”

In other words, complexity and ambiguity weren’t tested; they were excluded. The study was designed to prevent them—and conscious thought in general—because, for the authors’ purposes, such lifelike complications would have made the results less interesting. Personally, I’d be more interested in a study that invited such complications—examining, for instance, whether conservatives, having resisted doubts about the wisdom of the status quo, are more likely than liberals to doubt the wisdom of change.

4. Maladaptiveness. The scientific core of the study is a hypothesized brain function called “conflict monitoring,” or CM. Liberals scored better than conservatives, the authors argued, because the brain area responsible for this function was, by electrical measurement, more active in liberals than in conservatives.

The authors described CM as “a general mechanism for detecting when one’s habitual response tendency is mismatched with responses required by the current situation.” NYU’s press release called it “a mechanism for detecting when a habitual response is not appropriate for a new situation.” Amodio told the press that CM was “the process of detecting conflict between an ongoing pattern of behavior and a signal that says that something’s wrong with that behavior and you need to change it.”

The indictment sounds scientific: CM spots errors; conservatives show weaker CM activity; therefore, conservatives make more errors. But the original definition of CM, written six years ago by the researchers who hypothesized it, didn’t presume that the habitual response was wrong, inappropriate, or objectively mismatched with current requirements. It presumed only that a stimulus had challenged the habit. According to the original definition, CM is “a system that monitors for the occurrence of conflicts in information processing.” It “evaluates current levels of conflict, then passes this information on to centers responsible for control, triggering them to adjust the strength of their influence on processing.”

In experiments such as Amodio’s, the habit is objectively wrong: You tapped the button, and the researcher knows that what you saw was a W. But real life is seldom that simple. Maybe what you saw—what you think you saw—will turn out to require a different response from the one that has hitherto served you well. Maybe it won’t. Maybe, on average, extra sensitivity to such conflicting cues will lead to better decisions. Maybe it won’t. Extra CM sensitivity does make you more likely to depart from your habit. But that doesn’t prove it’s more adaptive.

Frank Sulloway, a Berkeley professor who co-authored a damning psychological analysis of conservatism four years ago, illustrates the problem. Appearing in the Times as a researcher “not connected to the study”—despite having co-written his similar 2003 analysis with one of its authors—Sulloway endorsed the study and pointed out, “There is ample data from the history of science showing that social and political liberals indeed do tend to support major revolutions in science.” That’s true: When new ideas turn out to be right, liberals are vindicated. But when new ideas turn out to be wrong, they cease to be “revolutions in science,” so it’s hard to keep score of liberalism’s net results. And that’s in science, where errors, being relatively factual, are easiest to prove and correct. In culture and politics, errors can be unrecoverable.

The conservative case against this study is easy to make. Sure, we’re fonder of old ways than you are. That’s in our definition. Some of our people are obtuse; so are some of yours. If you studied the rest of us in real life, you’d find that while we second-guess the status quo less than you do, we second-guess putative reforms more than you do, so in terms of complexity, ambiguity, and critical thinking, it’s probably a wash. Also, our standard of “information” is a bit tougher than the blips and fads you fall for. Sometimes, these inclinations lead us astray. But over the long run, they’ve served us and society pretty well. It’s just that you notice all the times we were wrong and ignore all the times we were right.

In fact, that’s exactly what you’ve done in this study: You’ve manufactured a tiny world of letters, half-seconds, and button-pushing, so you can catch us in clear errors and keep out the part of life where our tendencies correct yours. And now you feel great about yourselves. Congratulations. You haven’t told us much about our way of thinking. But you’ve told us a lot about yours.