Future Tense

Strain Game

Could information about a lab-made virus really help evildoers create a biological weapon?

Do we need to fear bioterrorism?

Photograph by Chad Baker/Ryan McVay.

Here is one of the scariest things you’ll ever read:

atggagagaataaaagaattaagagatctaatgtcacagtcccgcactcgcgagatactaacaaaaaccact
gtggaccatatggccataatcaagaaat

These are the first 100 units of a gene in an influenza virus. This particular flu virus belongs to a strain called H5N1. It breeds and spreads among birds, but on rare occasion, it can infect people. And when it does, it is frighteningly fatal, with a mortality rate of about 60 percent. Since the virus was first spotted in Hong Kong in 1997, birds have spread it to many countries. On Dec. 19, it claimed its latest victim, a 29-year-old Egyptian man who probably contracted it from the chickens in his backyard. The only consolation for such deaths is that there are not more of them. The virus has proved unable to spread from person to person since it first emerged 14 years ago.
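
If you want to poke at that fragment yourself, a few lines of Python are enough to confirm that it really is 100 letters long and to tally how often each base appears. (This is a purely illustrative sketch, not part of any of the research described here.)

from collections import Counter

# The fragment quoted above, rejoined across the line break.
fragment = (
    "atggagagaataaaagaattaagagatctaatgtcacagtcccgcactcgcgagatactaacaaaaaccact"
    "gtggaccatatggccataatcaagaaat"
)

print(len(fragment))      # 100 nucleotides
print(Counter(fragment))  # counts of each of the four bases: a, c, g, and t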

In the decade and a half since, scientists have put great effort into understanding the virus, worried that it might evolve into a pandemic strain that could cause a worldwide disaster. But now some new experiments on bird flu have plunged the scientific community into a debate about the risks of learning—and sharing—the virus’s secrets. Researchers have produced a variant of H5N1 that reportedly can spread from one mammal to another. So far they’ve documented its spread only in ferrets; no one knows what it would do if it got out of the lab. But fearing that the information could be used to create a biological weapon, a federal advisory board has taken the unprecedented step of calling for some of the results to be withheld from any published papers about the work.

A number of scientists agree with the board, saying that publishing too much information about this research could make it possible for terrorists to weaponize the flu. But others warn that withholding data won’t make us safe, and that the best defense against biological threats, whether man-made or natural, is transparency. Steffen Mueller, a virologist at Stony Brook University, says, “I much prefer dealing with the devil I know over the devil I don’t.”

Transparency is one of the most cherished values in modern science; it allows scientists to build on one another’s research—and check each other’s work. In 2004, for example, an international team of scientists isolated an H5N1 virus and sequenced all 10 of its genes. They described the virus in the journal Nature, and they uploaded the entire genome to a website run by the National Library of Medicine. And so, if you’re a fan of this genre of virus horror nonfiction, you can read the whole thing for yourself. (Unlike the new H5N1 strain, this one can’t be transmitted from one mammal to another.)

But over the past decade, scientists have gotten worried that this kind of research could help someone trying to build biological weapons. In 2001, someone—we still don’t know who—unleashed terror by sending anthrax spores through the mail. In 2002, Eckard Wimmer, a Stony Brook University virologist, and his colleagues made headlines by synthesizing the genome of a poliovirus from scratch, and then making live viruses from it.* Their work raised the prospect of designer pathogens being created expressly to cause harm. The cost of sequencing and synthesizing DNA has crashed year after year, making biological engineering easier for people to do. All of these swift changes in biology led to the formation of the National Science Advisory Board for Biosecurity in 2004, made up of some of the country’s leading experts on microbiology and biological warfare.

Part of the board’s charge was to review potentially dangerous research projects—what’s often called “dual use” research for its two-sided potential to be used for good or evil ends. For the first seven years of its existence, the board approached its mission with a very light touch. In 2005, for example, it examined a study in which U.S. government scientists revived the 1918 pandemic flu virus. The board had no objection to the scientists publishing the details of their research. In fact, you can see that virus’s genome on the Internet, too.

But now a pair of new studies has roused the board to action. Two teams of researchers, one in the Netherlands and one at the University of Wisconsin, have run experiments to find mutations that can turn H5N1 from a bird flu into a mammal flu. They’ve carried out their experiments on ferrets, which respond to flu viruses much like humans do. What few details we know of the unpublished research come from a talk Dutch virologist Ron Fouchier gave in August at a virology conference, along with subsequent news reports. Fouchier began the experiment by altering the H5N1 virus’s genes in two spots. Then he passed the virus from one ferret to another, allowing the virus to mutate and evolve on its own inside the animals. After several rounds, Fouchier ended up with an H5N1 virus that could spread through the air from one ferret to another. If unleashed—and if proven capable of spreading from human to human with the same high mortality rate—it could make the deadly 1918 pandemic look like a pesky cold.

The research is funded by the National Institutes of Health, and according to Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, NIH staffers were alarmed by the results. The studies were passed to the National Science Advisory Board for Biosecurity, which hashed out the matter for weeks. On Dec. 20, the panel urged that “conclusions of the manuscripts be published but without experimental details and mutation data that would enable replication of the experiments.”

The board has no legal, binding authority, Fauci points out. But it’s clear that the authors, the journals, and the NIH are taking the board’s unprecedented vote very seriously and are working on a way to follow its recommendations. They’re figuring out a system that will allow qualified—and fully vetted—researchers to see the full data.

A number of scientists outside the board have applauded its decision as a way to keep information that bioterrorists could use out of their hands. Some researchers also question whether the experiments yielded enough useful information to justify the risks. Ian Lipkin, the director of the Center for Infection and Immunity at Columbia University, believes there’s no reason to assume that the mutations that arose in Fouchier’s experiments would be the ones that would arise out in the real world. “On the other hand,” Lipkin says, “publishing this information would give people a roadmap to creating Frankenstein viruses.”

The scientists who are concerned about this information getting out aren’t claiming that terrorists could simply read the genome of Fouchier’s virus, synthesize new ones, and unleash a biological attack. For one thing, Fouchier’s viruses appear able to spread only from one ferret to another. And ferrets, as good as they may be for studying the flu, are still ferrets. Released into the real world, the viruses might fail to spread from human to human.

Rather, the concern is that publishing the new studies in full would offer a useful starting point for someone who’s trying to turn H5N1 into a biological weapon. They could add more mutations to the ones Fouchier has identified, using the same methods of passing viruses from one animal to another until they hit the virological jackpot.

There are a lot of reasons to consider this scenario unlikely. The experiments that have the NSABB so concerned weren’t simple; they were carried out in some of the world’s most sophisticated virology labs. It’s conceivable that someone could try breeding flu viruses simply by transferring them from one animal to another in a low-tech experiment. But if H5N1 turned into a human flu along the way, the people who were breeding it might well be the first to die.

But let’s assume for the moment that the risks are big enough to worry about. Is holding back information the best way to eliminate them? Wimmer doesn’t think so, since the gist of the experiments has already escaped. If some villain has enough money, he or she can just run a similar experiment. “Whether details of the Dutch experiments will be published or not would not matter,” he says.

“Not publishing the sequence is a rather silly attempt of containment,” says Mueller. He argues that scientists should head in the other direction, to prepare both for any possible bioterror and for nature’s own attacks. “Not only should this be published completely, I think these experiments should be repeated numerous times,” he says.

For Wimmer, who built his polioviruses nine years ago, the controversy over the flu viruses is an intense case of déjà vu. His virus creations caused a huge uproar; there was even a movement in Congress to condemn the work. But Wimmer had a very good reason for synthesizing viruses from scratch. Working with Mueller and others, he’s turned synthetic viruses into promising vaccines against diseases such as influenza. We may fear the risks that come with scientific progress, but Wimmer’s work reveals the dangers of fear itself. When the next bird flu comes—and it will come—it may be Wimmer’s Frankenstein viruses that save us.

This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.

*Correction, Dec. 22, 2011: This article originally misspelled Eckard Wimmer’s first name.