If I could ensure that kids come away from science class with one thing only, it wouldn’t be a set of facts. It would be an attitude—something that the late physicist Richard Feynman called “scientific integrity,” the willingness to bend over backward to examine reasons your pet theories about the world might be wrong. “That is the idea that we all hope you have learned in studying science in school,” Feynman said in a 1974 commencement speech. “We never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation.”
Teaching that spirit is easier said than done. “The hardest thing is convincing teenagers they can be wrong,” a high school science teacher from Phoenix lamented to me recently in a conversation about scientific integrity. But to be fair, it’s not just teenagers. We’re all captives of one of the best-established errors in human reasoning, called confirmation bias: our tendency to focus on evidence that confirms our prior expectations. Once our minds alight on a theory, our impulse is to reassure ourselves it’s true, not set out to disprove it.
For example, researchers have demonstrated that our perception of a speaker depends on whether we’ve been told ahead of time that he’s confident or shy. Our judgment of a child’s academic skill depends on whether we’ve been led to believe that she’s from a rich family or a poor one. When we serve on a jury, we quickly form an impression about whether the defendant is guilty, and then disproportionately interpret new evidence as supporting that impression.
In other words, we need to actively look for signs that our assumptions are wrong, because we won’t do so unprompted. One such sign, scientists have suggested, is the feeling of surprise. “Brains are continuously making predictions,” psychologist Daniel Gilbert explains in his book Stumbling on Happiness—about how a friend is likely to react when you greet her, about what will happen after you knock a glass off the table, even about what sort of word you’re going to see at the end of a sentence. We’re generally not conscious of those predictions, until the world violates them. “Surprise tells us that we were expecting something other than what we got, even when we didn’t know we were expecting anything at all,” Gilbert says.
Surprising observations push science forward. Philosopher of science Thomas Kuhn sometimes called them “anomalies,” observations that don’t make sense under the current paradigm. Eventually they help replace that paradigm with a new one. In the 16th century, an inexplicable kink in the path of Mars across the night sky helped overturn the geocentric model of the solar system in favor of the heliocentric one. And in the 19th and 20th centuries, the unexpectedly fast precession of Mercury’s orbit helped overturn Newtonian mechanics in favor of general relativity.
Scientists are human, of course, so they feel the temptation to ignore or explain away anomalies. But the ability to resist that temptation is what produces good science. Psychologist Kevin Dunbar has spent his career studying how scientists think, and he’s found that the more experienced a scientist is, the more likely she is to investigate surprising results rather than ignore them. Scientists help one another fight confirmation bias, too. “Individual scientists out of a group context usually attributed inconsistent evidence to error of some sort, and hoped that the finding would go away,” Dunbar reported after observing the operations of several molecular biology laboratories over the course of a year. “However, when the finding was presented at a laboratory meeting, the other scientists tended to focus on the inconsistency to dissect it, and either (a) suggested alternate hypotheses, or (b) forced the scientists to think of a new hypothesis.”
Inspired by the research on the role of surprise in science, I’ve started to keep my own “surprise journal,” to help me notice moments of surprise or confusion, and use them as a cue to examine my assumptions.
After I gave a talk about surprise and confirmation bias in July, the teacher I mentioned, named Charlie Toft, emailed me to tell me he was borrowing my surprise journal idea to test out in his science classes. Over a school quarter, each of his students had to record at least 15 moments in which they experienced surprise, and for each one, ask themselves two questions: Why was this surprising? And what does that tell me about myself?
The results have been encouraging. Toft collected more than 1,000 moments of surprise in total, and he shared some with me. In many cases, students’ surprise stemmed from a mistake that they realized they could have prevented. For example:
Moment of surprise: When we thought we were early to the airport, but we were late.
Why it was surprising: Because we planned ahead before the night of the flight, but we had the time wrong.
What this tells me: that before planning ahead make sure the information you have is correct.
It was also common for students to be surprised at how well or poorly they performed at a task:
Moment of surprise: Found my quince [quinceañera] dance very easy when I thought it was going to be hard.
Why it was surprising: b/c it was new, never seen it before therefore I thought it was going to be hard when it wasn’t.
What this tells me: just b/c it’s new doesn’t meant it’s going to be hard.
And parents of teenagers everywhere will be envious to hear that another recurring theme was the realization that the student’s mother or father wasn’t so wrong after all:
Moment of surprise: I was making french fries, and had forgotten to listen to my mom about lowering the heat, so I burned them.
Why it was surprising: That same day I had been very confident with my cooking, and told my brother I didn’t make cooking mistakes.
What this tells me: I should probably start listening to my mom, when it comes to cooking.
After the exercise, Toft asked his students whether they expected their surprise journaling experience to change their behavior in the future. Seventy percent said yes. “I learned that all of my surprises occurred because I came to the situation with assumptions fixed into my mind,” one student wrote. “I am wrong more often than I think,” another wrote. “Of course I don’t feel wrong. I just realize it after it.”
For Toft himself, the biggest surprise was how the experiment changed the way his students reacted to their own mistakes in class. “In the class culture, acknowledgement that you are mistaken about something has become dubbed a ‘moment of surprise’ (followed by a student scrambling to retrieve their journal to record it),” he wrote to me. “As this is much more value-neutral than ‘I screwed up,’ the atmosphere surrounding the topic is less stressful than in previous years.”
Which, actually, didn’t surprise me to hear. Partly because that’s also been my experience keeping a surprise journal, but also because it echoes what scientists have discovered so far about how to fight confirmation bias. “It’s absolutely threatening to admit that you’re wrong,” says Brendan Nyhan, a political scientist who studies denialism. Many studies by Nyhan and others show that if you help people feel more secure via a “self-affirmation”—such as prompting them to reflect on a personal value they’re proud of—they’ll be more willing to consider arguments that threaten their worldview. Reframing errors as “surprises,” and getting a figurative thumbs-up for noticing them, is just another way to take the sting out of the process.
Toft’s experiment is merely a first step, of course, but it’s a step in the right direction. With the right effort, future science classes could produce a new generation of graduates who react to signs of their own error not with defensiveness or denial, but with curiosity. Or even excitement. As Isaac Asimov once said, “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny ...’ ”