Science

Laboratory Confidential

Should scientists be more forthcoming about their flaws?

Michael Brooks, author of Free Radicals. Photograph by Michael Brooks.

“Doubt is our product,” a tobacco executive observed in the late 1960s, infamously summing up the effort to undercut the link between smoking and cancer. Since then, many other industries have exploited routine uncertainties or small inaccuracies in research results, creating just enough public doubt to delay—or prevent—government regulation. In recent years, scientists have debated whether and how to respond to this strategy. Should they just let the data speak for themselves? Should they drop their habitual caveats—the very-likelys and not-entirely-certains that seem to hang off their every sentence—and start speaking out with more confidence?

Two new books, Free Radicals: The Secret Anarchy of Science by Michael Brooks and Ignorance: How It Drives Science by Stuart Firestein, propose another strategy: Instead of trying to hide their uncertainties, scientists should assert them. It’s time they showed their foibles and flaws for all to see.

In Free Radicals, Brooks proposes that the tobacco strategy derives much of its strength from scientists’ tendency to present themselves as dispassionate and robotlike researchers, and their findings as the product of purely objective experimentation. In reality, he says, scientists are regular people—people who massage their data, brawl with colleagues, or pursue other surprising shortcuts on the way to discovery. Nobel laureate Barry Marshall demonstrated the cause of stomach ulcers by drinking a glassful of bacteria; Kary Mullis, an LSD enthusiast and another Nobel laureate, invented a standard technique for copying DNA by trusting a late-night vision.

By acknowledging these often-slapdash methods and exposing their humanity, Brooks argues, scientists will win more of the public’s trust. He claims that in the wake of “Climategate,” when hacked emails from the University of East Anglia exposed some scientists as intemperate and suggested that they were prone to using so-called “tricks” in their analyses, most people who changed their views as a result of the scandal became more convinced of the reality of global warming, not less. “People who were unsure about whether to trust scientists got a glimpse of scientists being human—and thought that was OK,” Brooks reasons.

Free Radicals by Michael Brooks. Courtesy of Profile Books.

As a science journalist, I’m all too familiar with the ways scientists hide behind PowerPoint presentations and the passive voice. They shy away from personal stories. It’s my job to tease out the source of their inspiration, the messy reality of the research process, and the fallible humans behind their statistically significant results. Those are the unexpected stories that make science interesting and comprehensible to the public. Would we all be better served if researchers were more forthcoming about their foibles? Yes.

Brooks, however, takes his argument further. Scientists practice a form of “secret anarchy,” he says, a label he insistently—and approvingly, for the most part—applies to everything from recreational drug use in the lab to outright scientific fraud, departmental power plays, and general jerkitude. Science is a creative process, and sure, creativity often leads to some rule-breaking. But despite what Brooks would have us believe, not all misbehavior is creative. While both Einstein and Galileo fudged the calculations that initially supported their theories, they were the lucky ones. If they’d been wrong, their so-called “anarchism” could have misled science for years. (Consider the discredited British medical researcher Andrew Wakefield, who was so enthusiastic about his theory linking vaccines and autism that he distorted children’s medical histories in order to support it.) Plenty of other, far less brilliant researchers have perpetrated frauds for personal gain, or stolen credit from their hardworking graduate students. That’s not creative expression; it’s cheating.

Stuart Firestein, author of Ignorance

Stuart Firestein’s Ignorance offers a pithier and more nuanced look at the fallibility of science. Like Brooks, he says that researchers don’t like to follow convention. The much-advertised “scientific method”—observation, hypothesis, and manipulation, followed by further observation and a new hypothesis—rarely turns up in practice. “ ‘Let’s get the data, and then we can figure out the hypothesis,’ I have said to many a student worrying too much about how to plan an experiment,” he writes. While data sometimes give an answer, they always lead to more questions—or, as Firestein puts it, more ignorance.

The tobacco strategy is a clever distortion of this habit: Scientists are inveterate doubt-makers themselves, taking intellectual (and sometimes personal) pleasure in finding flaws in their colleagues’ work. The difference is that within scientific circles, doubt is rarely a death sentence. While some very large questions remain about the exact effects of climate change, data collected over the past generation have all but eliminated earlier uncertainties about the human role in rising atmospheric carbon dioxide levels and their effect on temperature. But in the public arena, doubt of any kind can be more dangerous, providing a convincing excuse to wait and see.

Ignorance: How It Drives Science by Stuart Firestein.

According to Firestein, the kind of uncertainty that drives research isn’t random groping about, as the doubt-makers might have us believe. He quotes the physicist James Clerk Maxwell in describing the practice of science as a form of “thoroughly conscious ignorance.” It’s about choosing the right questions and making educated guesses about where to dig for data. It’s also about a lot of dead ends and some rare, exhilarating discoveries.

The accumulation of such insights eventually leads to things we call facts, and facts are what we can reasonably expect from science: No taxpayer or grant-maker wants to pay for the production of ignorance. But the fine-tuned ignorance Firestein describes is a natural and necessary part of science, so scientists might as well embrace it. It’s not easy to do that in a sound bite, but it’s possible. Instead of downplaying their doubts for public consumption, scientists could tell more stories about the trial and error behind their results. They could talk about both the doubts they’ve eliminated and the doubts that continue to drive them. By carefully defining their ignorance, they could win respect for both what they do know and what they don’t.