Forget Information Sharing

If Congress is worried about cybersecurity, it should start by changing these three laws.

Aaron's Law—named after Aaron Swartz, pictured above in San Francisco in 2008—would fix some of the issues in the law, but it has languished in committee since it was introduced. Photo by Noah Berger/Reuters

Before it adjourned in August, Congress delayed a vote on the controversial and dangerous Cybersecurity Information Sharing Act. This is definitely for the best—CISA is deeply flawed legislation that would do more to put citizens’ private data in the hands of the government than to actually make progress on critical cybersecurity issues. Information sharing—that is, sharing of information about cybersecurity threats among private companies and between companies and the government—was supposed to be the low-hanging fruit that Congress could tackle quickly, before turning to larger issues. But the information-sharing bill has been hanging around for five years. And CISA isn’t just worrisome for privacy rights advocates—it’s not even clear that information sharing would have prevented or ameliorated many of the large breaches we’ve seen lately.

But there are much more effective strategies for protecting valuable data. One of our greatest cybersecurity assets is “white hats”—security engineers who try to break the software we use every day so they can discover weaknesses that ought to be fixed. They’re like a squad of friendly locksmiths who walk around neighborhoods performing free lock audits on everyone’s front doors. (Not to mention helping develop new types of locks, working with lock companies to fortify the ones out there … it’s an imperfect analogy.) You’d probably thank the locksmiths—maybe even give them cookies.
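To make the locksmith analogy concrete, here is a minimal sketch in Python of one of the simplest audits a researcher might run: grabbing a service banner to see whether a server advertises an outdated, known-vulnerable version. The host name and version string here are hypothetical placeholders, and a real researcher would point this only at systems she owns or has permission to test.

    import socket

    def grab_banner(host: str, port: int = 22, timeout: float = 3.0) -> str:
        """Connect to a service and read the banner it announces (e.g., an SSH version string)."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            return sock.recv(1024).decode(errors="replace").strip()

    if __name__ == "__main__":
        banner = grab_banner("server.example")  # hypothetical host you control
        print(banner)
        # Old releases with published vulnerabilities show up right in the banner.
        if "OpenSSH_5.3" in banner:  # hypothetical outdated version string
            print("This server may be running a known-vulnerable SSH daemon.")

Something this simple is plainly on the "friendly locksmith" side of the line, yet as we'll see, the laws below don't always distinguish it from an attack.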

But several well-meaning federal laws have had the unintended consequence of stifling this kind of cybersecurity work. Let's follow a hypothetical security researcher through the three laws that Congress ought to amend right now if we want to encourage, rather than dissuade, her research:

1. The Computer Fraud and Abuse Act (18 U.S.C. 1030)

Ironically, the federal anti-hacking provision can discourage security research. The problem here is that the untrained eye has a very difficult time distinguishing security research from malicious attacks. Both actions involve trying to break into computers or networks—the differences are in what motivates the actor, and in what she does with the information after discovering a security flaw. Those are admittedly difficult questions to base a criminal statute on (though the concept of mens rea, or the mental state of an accused criminal, has been a crucial piece of criminal law for centuries), but the CFAA doesn’t even attempt to take them into account.

Congress could improve this law by adding a threshold test that recognizes the legitimacy of security research and carves out behaviors that are clearly aimed at improving security. Aaron's Law—named after Aaron Swartz, who tragically killed himself while under what many believed was an unwarranted indictment for violations of the CFAA—would fix some of the issues in the law, including clarifying that simple terms-of-service violations do not rise to the level of a criminal act. Unfortunately, Aaron's Law has languished in committee since it was introduced. The law should also grant immunity to any security researcher who reports a discovered vulnerability to the owner or maintainer of a piece of software, or attempts to sell it to them, instead of selling it on the black market.

If our hypothetical researcher gains unauthorized access to a computer network or system (which is the entire point of her exercise), she will have violated the CFAA. Violations of the CFAA can carry sentences of up to 10 years in prison.

2. The Digital Millennium Copyright Act (17 U.S.C. 1201)

Under the DMCA, it's unlawful to break a protection measure put in place to prevent a person from accessing copyrighted material. The law was originally designed to combat copyright infringement by preventing people from doing things like descrambling DVDs to make bootleg copies. But in the 17 years since its passage, the DMCA has proved susceptible to abuse by companies that don't want people tampering with their products, even in ways that have nothing to do with copyright infringement. For example, just a few months ago, a researcher who exposed severe security flaws in supposedly high-security electronic locks made by a company called CyberLock got a letter from the company's lawyers threatening legal action under the DMCA. Threats like these may never materialize into actual lawsuits or prosecutions, but they could nevertheless make a well-meaning engineer think twice before engaging in security research or reporting the security flaws she discovers.

If our friendly researcher subverts a security system protecting copyrighted material (and given how copyright works today, most if not all Web servers host some form of copyrighted material), she will have violated the DMCA's anti-circumvention provisions. Violations of the DMCA are civil infractions and carry statutory damages of up to $2,500 per act of circumvention.

3. The Electronic Communications Privacy Act (18 U.S.C. 2701)

ECPA does a whole swath of good things to protect our electronic communications (though it still needs some work to bring it up to speed with the way the modern Internet works). On the downside, though, ECPA—like the CFAA and DMCA—does not seem to have been written with security researchers in mind. One particular provision of the law, Section 2701, prohibits breaking into a computer network or system and accessing, altering, or denying access to private communications. This helps prevent unlawful spying, but just like the CFAA, ECPA doesn’t have any exceptions for researchers who might be doing similar work. Congress ought to fix ECPA to carve out behavior we want to incentivize by requiring a certain intention on the part of the defendant or by granting immunity to those who report their findings to help fix the vulnerability.

If our security researcher comes across the private communications of users, even if she doesn’t read them or otherwise tamper with them, she will have violated ECPA. Violations of ECPA can carry up to five years in prison for a first offense and 10 years per subsequent offense.

Bonus: The Wassenaar Export Control Arrangement

The Wassenaar Arrangement isn't a law that Congress can amend, but the United States is party to this multinational agreement, and it too could threaten security research. The Wassenaar Arrangement aims in part to limit the sale of software and technology that could be used for surveillance to repressive regimes around the world. This agreement, which the Department of Commerce is working to implement in regulations right now, could be important for the preservation of human rights and civil liberties, particularly in those places where those protections are most needed.

Unfortunately, Commerce's first implementation proposal was broad enough to raise concerns for security research. The proposed rules would have forbidden the export of a fairly large category of software that is routinely used to investigate systems for vulnerabilities. Fortunately, the Commerce Department recognized that the proposed rules could chill research, and it is now seeking comment and revising them. New America's Open Technology Institute, where I work, submitted comments on the regulations, and we're hopeful that the next round of rules will strike a balance between limiting repressive regimes' access to surveillance software and encouraging robust security research. (New America is a partner with Slate and Arizona State University in Future Tense.)

So, in total, our researcher may find herself facing up to 20 years in prison (10 under the CFAA plus up to 10 more under ECPA) and $2,500 in statutory damages for each act of circumvention, which can stack up depending on how many violations an overly litigious company decides it can plausibly claim. That's not even counting whatever violations might accrue under the new regulations soon to come out of the Department of Commerce's Wassenaar rulemaking. It's an intimidating prospect if you're a small independent researcher. If we want to solve our nation's cybersecurity problems, we could start by encouraging those who go out of their way to find and report vulnerabilities with an eye toward making us all more secure.
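To put rough numbers on that exposure, here is a back-of-the-envelope tally in Python. The count of claimed circumventions is purely hypothetical, and real sentences and damages depend on prosecutors and the courts; this is just the arithmetic from the paragraph above.

    # Maximum criminal exposure described above (illustrative only)
    CFAA_MAX_YEARS = 10            # CFAA: up to 10 years
    ECPA_MAX_YEARS = 10            # ECPA: up to 10 years per subsequent offense

    # DMCA statutory damages: up to $2,500 per act of circumvention
    DMCA_DAMAGES_PER_ACT = 2500
    claimed_circumventions = 40    # hypothetical count a litigious company asserts

    max_prison_years = CFAA_MAX_YEARS + ECPA_MAX_YEARS
    dmca_damages = DMCA_DAMAGES_PER_ACT * claimed_circumventions

    print(f"Up to {max_prison_years} years in prison")          # 20
    print(f"Plus ${dmca_damages:,} in DMCA statutory damages")  # $100,000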

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.