Future Tense

The Lessons of the Crypto Wars

The right to strong encryption almost became law in the ’90s. Here’s what happened.


Rep. Bob Goodlatte, who in 1996 introduced the SAFE Act in Congress, and Rep. Zoe Lofgren, who has helped to introduce the current amendments protecting the privacy of internet users.


If recent tech policy debates are any indication, the old axiom is true: History really does repeat itself sometimes.

Earlier this month the House of Representatives overwhelmingly voted in favor of an appropriations amendment to defund any government attempts to require encryption backdoors. Privacy advocates hailed it as an indication of how lawmakers feel about the FBI’s recent assault on encryption. But what some observers may have forgotten is that a similar pro-encryption measure—the Security and Freedom Through Encryption (SAFE) Act—gained an equally impressive show of support as a stand-alone bill in the late 1990s, toward the end of a period known as the Crypto Wars. The story of the SAFE Act serves as a powerful reminder of how the climate surrounding technology policy can change—and not always for the better.

Ever since Apple and Google announced that they were moving to smartphone encryption by default, technology companies, privacy advocates, and members of the law enforcement and intelligence communities have engaged anew in a fierce debate over the right to use and distribute products that contain strong encryption technology. Many of the arguments that have been made in the past year are reminiscent of those from two decades ago, when the growing availability of commercial encryption technology in the U.S. and around the world prompted a lengthy public conversation about the economic and security trade-offs of limiting its use. My colleagues and I discuss the details in a new report from New America’s Open Technology Institute, which tells the full story of the Crypto Wars of the 1990s and shows how the lessons from that debate are still relevant today. (Disclosure: Future Tense is a partnership of Slate, New America, and Arizona State University.)

The fight began in 1993, when the Clinton administration unveiled the “Clipper Chip”: a state-of-the-art microchip that supposedly offered a way to let the American public use encryption technology while ensuring that law enforcement and intelligence agencies could access encrypted communications for criminal investigations. Officials hoped the proposal would be seen as a sensible solution that balanced two competing interests: national security and personal privacy. They also maintained strict export controls on products containing encryption, which at the time were classified as munitions under U.S. regulations because encryption had historically been used almost exclusively for military and intelligence purposes. But by the end of the decade, the White House faced broad support for unrestricted access to strong encryption, at home and abroad, including from both chambers of Congress.

Privacy advocates, technical experts, business leaders, and politicians rebuffed the Clipper Chip proposal in 1994, demonstrating that, in addition to raising significant concerns about individual privacy and the economy, the technology itself contained a serious flaw. But many policymakers still believed that key escrow proposals—in which a “trusted” third party would hold copies of decryption keys in escrow so that authorities could unlock encrypted communications when authorized—were a viable option. They hoped to convince U.S. companies that key escrow was a good idea and to build support for the policy among allies in Europe so that it might eventually become the global norm.

At the same time, the conflict over U.S. export controls on encryption technology started to heat up in 1996. For decades, products that contained strong encryption had been controlled under the International Traffic in Arms Regulations and subject to strict export requirements. Practically speaking, these policies either prevented companies from selling software and hardware that relied on encryption overseas or forced them to export a weaker “export-grade” version of their products, which would be easier for U.S. intelligence agencies to crack. (The problems with “export-grade” crypto resurfaced this year after security researchers discovered two separate flaws, FREAK and Logjam, in implementations of the TLS protocol that secures HTTPS connections; both exist because of forgotten weaknesses created by 1990s-era export restrictions.) By the mid-1990s opposition to these restrictions had grown fairly strong. The software industry argued that the controls hampered technology development and undermined U.S. competitiveness, while privacy advocates and technical experts opposed them because they weakened security and raised significant free speech and privacy concerns.

In March 1996, Rep. Bob Goodlatte introduced the SAFE Act in the House of Representatives. The bill’s overall goal was simple: “to affirm the rights of United States persons to use and sell encryption and to relax export controls on encryption.” It included provisions that would have barred the government from creating a mandatory key escrow system and would also have removed export restrictions on most generally available software and hardware containing encryption.

But the legislation never actually made it to the House floor for a vote. Multiple versions of the SAFE Act were reintroduced in Congress throughout the late 1990s, gaining support each time. Administration officials had initially defended their encryption policies on the grounds that they helped law enforcement and protected national security. But those arguments grew considerably weaker as the decade wore on, especially in light of growing evidence that the policies were ineffective and hurt the U.S. economy to boot. As Goodlatte explained in the Washington Post in 1997, “Strong encryption prevents crime. … [It] allows people to protect their digital communications and computer systems against criminal hackers and computer thieves.”

During a hearing on the SAFE Act in 1999, congressional representatives called the administration’s existing encryption export control policies “unworkable” and “a failure.” Members of the Senate Armed Services Committee even wrote a “Dear Colleague” letter in May 1999 urging passage of the bill. “If we thought controlling encryption exports worked towards [protecting national security], we would be its strongest proponents,” they wrote. “Unfortunately, export controls on encryption software simply disadvantages the United States software industry.” Technical experts agreed—for instance, the Institute of Electrical and Electronics Engineers endorsed the bill.

Eventually, the SAFE Act gained sponsorship from a majority of the members of the House of Representatives: 258, to be exact. Its proponents argued that the widespread, bipartisan support reflected a broad consensus that the U.S. government should promote access to strong encryption tools, rather than seeking to limit it.

Yet the SAFE Act isn’t law today. That’s because the Clinton administration—in the face of overwhelming pressure from privacy advocates, technology companies, and politicians on both sides of the aisle—abruptly changed course in the fall of 1999 and adopted a policy that did nearly everything the bill had intended to do. In a sweeping shift, the White House announced on Sept. 16, 1999, that it would update its encryption policies to remove virtually all restrictions on the export of retail encryption products. Goodlatte called the reversal “huge” and a “tremendous victory” for encryption advocates. As journalist Steven Levy, who literally wrote the book on the Crypto Wars, described the White House shift: “It was official: public crypto was our friend.”

After the White House announcement in 1999, the bill probably seemed superfluous: The war was over, and the pro-encryption advocates had won. But they may not have realized that we would be on the brink of a similar battle over the right to use strong encryption some 15 years later. That’s why the key takeaway from the conflict bears repeating: Weakening or undermining encryption is bad for the U.S. economy, internet security, and civil liberties—and we’d be far better off remembering why the Crypto Wars turned out the way they did than repeating the mistakes of the past.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.