Future Tense

The Fuzzy Regulations Surrounding DIY Synthetic Biology

It’s unclear exactly what’s legal and what’s not.

Genetically engineering something dangerous is high-level science, generally considered too advanced for community labs.

Jon Feingersh/Thinkstock

Josiah Zayner is the kind of citizen scientist whom journalists like to write about. He left his job as a NASA biologist to start his own company, the Odin, which sells DIY bio products like a beginner's kit for genetically modifying bacteria and a more specialized CRISPR kit. In photos, he sports piercings and dyed hair. And, more importantly, he makes glow-in-the-dark beer.

In early December, Zayner’s company started selling $160 kits for people to make their own glow-in-the-dark beer using a gene from a jellyfish. The Odin aimed to make genetic engineering an affordable part of everyday life. “We see a future in which people are genetically designing the plants they use in their garden, eating yogurt that contains a custom bacterial strain they modified or even someday brewing using an engineered yeast strain,” the website reads.

But the Food and Drug Administration soon caught wind of Zayner’s product. Shortly after the beta release of the fluorescent-beer brewing kit, BuzzFeed reported that the FDA had reached out to Zayner. The agency, which Zayner hadn’t contacted before selling his product, argued that the fluorescence could be classified as a color additive and therefore fall under FDA regulation. Zayner countered that the kits should be safe from regulation because he wasn’t directly selling a food product. And he kept selling them.

He still believes the products themselves don’t fall under FDA regulation, and even though the FDA informed him that the marketing of the kits could, he says he hasn’t received any warnings from the agency since. You can still purchase the kit on the Odin website. “The regulation is super amazing for biohackers, which is surprising,” he told me. “We can sell our CRISPR kits and things and nobody bothers us.”

At the moment, there is a lot of uncertainty surrounding the regulation of the DIY biology movement. DIY biologists, a subset of the biohacker community, mess around with DNA, often in communal lab settings where they share their techniques, knowledge, and discoveries with other science enthusiasts. As Alyssa Sims noted in Slate in January, “They’re creating spaces to support community engagement with, and the democratization of, science. In effect, then, they are questioning the power, authority, and hierarchy of academic institutions.”

But with that authority-questioning approach come sticky questions of legality, ethics, and regulation. According to Gizmodo, in November, officials from Germany’s consumer protection office gathered to discuss the cheap, easy-to-access genetic engineering kits of the type Zayner’s company sells. They concluded that they needed to issue a warning reminding biohackers of a law already on the books that bans genetic engineering experiments outside licensed labs. Violators, the government warned, could face a fine of roughly $55,000 or up to three years in prison.

It’s not uncommon for European countries to have these licensed-lab laws, said Todd Kuiken, a senior research scholar at North Carolina State University’s Genetic Engineering and Society Center. (Kuiken warned that Germany should not, however, be taken as a proxy for Europe in general: “In Germany, which has a unique and sordid past with scientific experimentation—you also have to take that into account when it comes to scientists or researchers manipulating genetics.”) There’s been no crackdown, and community labs can apply for licenses themselves, but the European approach is not as freewheeling as the U.S. regulatory system. In the U.S., practicing DIY biology is, for the most part, perfectly legal. As Sims noted, most DIY biologists abide by a code of ethics, drafted in 2011 by the North American DIYbio Congress, that emphasizes transparency, safety, and responsibility. But those ethics aren’t legally binding.

The U.S. regulatory system is set up to focus on the products of scientific endeavors rather than the processes behind them, according to Sarah Carter, a science policy consultant who has helped craft policies for the National Academies of Sciences, Engineering, and Medicine and the National Institutes of Health. Carter calls the product-focused model “the underlying philosophy of the U.S. biotech system.” (One notable exception: If a scientist uses Agrobacterium, a genus of bacteria that has long served as a vehicle for transferring DNA into plants, then she’s treading into regulatory territory, because the Agriculture Department classifies it as a plant pest. Gene guns, another common technique for genetically engineering plants, do not fall under this regulation.)

So what’s to keep deranged basement scientists or incompetent, unsupervised students from handling dangerous pathogens? For now, at least, there’s the limitation of the science itself. Genetically engineering something dangerous is high-level science, generally considered too advanced for community labs. Citizen scientists sometimes dream big, but occasionally the reality of scientific limits brings those dreams crashing down.

No story serves as a better warning about the dangers of overhyping biohacking than its most famous fizzle, the glowing plant project. Three biohackers garnered a lot of hype in 2013 when they vowed to develop genetically modified tobacco plants that glowed in the dark. Through Kickstarter, where they promised to eventually send plants to donors, they raised almost $500,000. People were excited. (Kickstarter was less enthusiastic. Soon after the glowing plant project was funded, Kickstarter updated its rules to say “Projects cannot offer genetically modified organisms as a reward,” though the plant project was grandfathered in.) But the scientific limitations caught up with the project, and in April, the one remaining founder announced that the money had run out and the endeavor was over. Nobody received glowing plants.

As Dan Grushkin, who runs GenSpace, a community biology lab in New York, wrote in Slate in 2013: “DIYers are a long way from engineering pandemics, which is only possible in sophisticated biolabs, or customizing viral assassins, which is only possible in science fiction.”

Not much has changed in the past four years. The labs that do handle contagious diseases are tied to larger institutions and funded by grants that stipulate security measures appropriate for the risk level of the pathogens being used, according to Matt Anderson, a biosafety officer at the University of Nebraska–Lincoln who serves as an adviser to DIY biology labs. Those familiar with the work done in the DIY biology community emphasize that these labs aren’t messing around with contagions. DIY biologists often practice self-restraint “because you don’t want too much government scrutiny,” Anderson said. The community biotech labs even follow standard biosafety rules you would find in any university lab.

Nor is community DIY bio scientifically advanced enough, yet, to fabricate gene drives, a gene-editing technique that ensures an edit is passed on to offspring and spreads through an entire population. Gene drives are spurring more conversations about the future of genetic engineering regulations. But for the biohacking community, the ability to successfully create gene drives still seems distant.

If independent scientists try to sell products created with genetic engineering, however, then regulators can step in. The FDA regulates genetic modifications to animals the way it regulates drugs, and any actual food or drug has to go through the standard approval process. Thus, a dog breeder who tried to genetically engineer healthier Dalmatians ran up against the FDA. But even then, as Zayner learned with his glowing beer kits, there’s some room for debate.

It’s unlikely any of the hazier areas of biohacking regulation will be ironed out in the near future. With any change in an administration, it can take a while for administrative policies to solidify. Biotechnology is an inherently difficult regulatory area. And according to Carter, the science policy consultant, the many unfilled administrative positions in the regulatory agencies have gridlocked policy development. “Everyone I know is in wait-and-see mode,” she said. “It’s hard to predict how the Trump administration will approach biotechnology. I’m starting to think the Trump administration won’t pay attention at all, and it’ll languish and we’ll be stuck with what we have for four years.”

This article is part of the synthetic biology installment of Futurography, a series in which Future Tense introduces readers to the technologies that will define tomorrow. Each month, we’ll choose a new technology and break it down. Future Tense is a collaboration among Arizona State University, New America, and Slate.