The SERDP request is part of a drive across the military to find defense applications for synthetic biology. Last year, the Defense Advanced Research Projects Agency, better known as DARPA, pledged $30 million through its Living Foundries program for what it deems high-value materials and devices, and the Office of Naval Research proposed using synthetic biology to produce TNT intermediates, presumably for weapons.
If successfully built, explosives-producing microbes will come with sticky problems beyond international law. The military has a checkered history of containing its technologies, and the touted environmental benefits won’t keep the new technologies from falling into undesirable hands. Losing track of rifles, Patriot missiles, or drones is bad enough; losing track of self-reproducing factories for explosives is another matter entirely. The quality that makes microbes so powerful will also make them difficult to contain: A single microscopic cell, acquired by a criminal or enemy, could in principle multiply to fill a vat within a few days.
Despite programs that will strike some as ethically equivocal, we should encourage military funding for synthetic biology. Perhaps most importantly, the technology can strengthen our national security by providing alternatives to foreign oil. But we must also be wary. Not all military projects are worth their price—morally, financially, or otherwise. The Biological Weapons Convention already delineates how biology may be used in military applications. The U.S. government should carefully consider research funding that may confuse the issue, since other countries, and potential adversaries, might take a cue and aggressively employ biology in decidedly unsavory ways. Proceeding without adequate reflection risks undermining four decades of international moral consensus about appropriate uses of biology. It also threatens our national security.
In writing this, we asked ourselves where on the spectrum appropriate uses of biotech become inappropriate. It’s a shifting line that may fall into the unsatisfying category of “you know it when you see it.” If as a society we educate ourselves on the potential of the technology and actively monitor its development, we may not know exactly where the line sits, but with vigilance and dialogue we’ll be able to recognize research that crosses it.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.