There are two basic problems with the so-called Cybersecurity Information Sharing Act, which is scheduled for possible amendment in the Senate on Tuesday. The first is everything the bill, generally approved by the Senate last week, does. The second is everything it doesn’t do.
The bill is so obviously badly written—with overly broad, ill-defined language—that the privacy and consumer groups that long have opposed it increasingly are finding allies in tech companies like Apple, Twitter, and Google, which have gone public with their own opposition. (Disclosure: My employer, R Street Institute, is on record as opposing CISA. So are many of my previous employers and colleagues, including the Electronic Frontier Foundation and the Wikimedia Foundation.)
In effect, the bill aims to sidestep search warrants and other pesky due-process limitations on government by giving technology companies a motive to “share” what it calls “cyber threat indicators” with the Department of Homeland Security. S. 754 gives tech companies—which receive troves of data from Internet users—huge incentives (like protection from legal liability) for “voluntarily” sharing these potential “cyber threat indicators” with government agencies.
What's a “cyber threat indicator”? Section 2 of the bill (full text here) offers a definition so broad that it’s hard to be certain, even after multiple rereadings, what this term doesn’t include. It appears to cover any “information” that would “describe or identify” any “method of causing a user with legitimate access to an information system or information that is stored on, processed by, or transiting an information system to unwittingly enable the defeat of a security control or exploitation of a security vulnerability.”
This language could apply to almost anything. Example: I already have lawful access to my own computers. But suppose someone writes up a cautionary note about how I might be deluded, perhaps through a phone call, into voluntarily handing over my passwords to those systems, then sends it to me by private email so I can check whether she’s right. Isn’t she describing or identifying a method to cause me, with my legitimate access, to defeat my own security-control tools? The law would allow Google (my email provider) to voluntarily share that private email with DHS. That seems like a bad, unintended outcome.
And as Robyn Greene of New America’s Open Technology Institute explains in detail, other provisions extend the scope of this new kind of surveillance well beyond “cybersecurity”:
[T]he bill would also let law enforcement and other government agencies use information it receives to investigate, without a requirement for imminence or any connection to computer crime, even more crimes like carjacking, robbery, possession or use of firearms, ID fraud, and espionage. … While some of these are terrible crimes, and law enforcement should take reasonable steps to investigate them, they should not do so with information that was shared under the guise of enhancing cybersecurity.
(Disclosure: New America is a partner with Slate and Arizona State University in Future Tense.)
So CISA is expansive about what kinds of information companies would be motivated to share with government agencies, but it provides scant justification for this “sharing.” And “sharing” is plainly a euphemism for surveillance: it enables bulk collection of Internet data that, because it begins with private companies, wouldn’t even require the kind of broad legal authorities that, for example, the Foreign Intelligence Surveillance Act provides.
What CISA doesn’t clearly do is actually prevent or deter Internet crime or espionage. National security expert Patrick Eddington at Cato has pointed out that there’s little evidence that “sharing cyber threat indicators” will enhance Internet security.
And how would the “shared” information be protected? Given the massive data breach at the federal Office of Personnel Management, maybe it’s not a great idea to give “cyber threat indicators” in bulk to government agencies that have failed to put good security measures in place.
Nor are private companies necessarily any better. Even the Brookings Institution’s Richard Bejtlich, who has testified in previous years in favor of this kind of “voluntary” surveillance legislation, has been walking back his pro-“sharing” boosterism this year. As Bejtlich wrote in January:
A company with little to no security, focused only on its core business functions, is not going to put threat intelligence to effective use. Until [a] company invests in sound defensive strategy, processes, people and technology, no amount of information sharing will help it.
Worse still, CISA’s liability protections may actually reduce a company’s incentives to clean up its security. As Mike Masnick at Techdirt puts it: “[F]or many companies, the bill just looks like a ‘get out of court free’ bill—because the entire focus is on protecting those companies from liability.”
Drew Mitnick of Access Now underscores that companies cooperating under CISA aren’t just protected from prosecution; they’re also protected from regulators, in exchange for working collaboratively to collect and report information on user behavior. “The transparency requirement,” Mitnick told the Guardian, “is so narrow that, if you met the requirements within the bill to get protection, it would give [participating companies] broad range to collect data and then send it to the government.”
But if CISA is so pro-company, why have some big companies, first through their trade associations and now more directly through their own public statements, decided to come out against it? “We don't support the current CISA proposal,” Apple said in a public statement last week, adding that “the trust of our customers means everything to us and we don’t believe security should come at the expense of their privacy.” And Dropbox’s head of global public policy told the Washington Post that even though “it’s important for the public and private sector to share relevant data about emerging threats, that type of collaboration should not come at the expense of users’ privacy.” The Post’s Brian Fung provides a rundown of tech-company opposition to CISA. As Masnick puts it, “[s]ome companies take a more long-term, customer- or public-centric view of things and recognize all [CISA's flaws].”
Since this summer, the growing opposition to CISA among all sectors has been so remarkable that it’s difficult to explain how the bill has advanced so far with so little effort to address its problems. Part of the answer has been the unstinting, consistent support for CISA from the U.S. Chamber of Commerce, which perceives the bill as a big win for American companies on the liability front.
CISA is one of the few occasions when stakeholders on all sides of these issues—not just tech companies, privacy groups, and consumer groups, but even cautious critics within DHS itself—find themselves in agreement that CISA’s hastily crafted, overbroad language needs to be reconsidered and revised. DHS’s criticism, produced in response to a formal query from Sen. Al Franken, is particularly damning:
While the Cybersecurity Information Sharing Act seeks to incentivize non-federal sharing through a DHS portal, the bill’s authorization to share with any federal agency ‘notwithstanding any other provision of law’ undermines that policy goal, and will increase the complexity and difficulty of a new information sharing program. …
The authorization to share cyber threat indicators and defensive measures with ‘any other entity or the Federal Government,’ ‘notwithstanding any other provision of law’ could sweep away important privacy protections, particularly the provisions in the Stored Communications Act limiting the disclosure of the content of electronic communications to the government by certain providers.
If DHS, of all agencies, thinks your legislation may undermine other privacy laws, including the Stored Communications Act (part of the Electronic Communications Privacy Act), isn’t that a good reason to slow down, Congress? More importantly, DHS has put its finger on precisely the issue that has always bugged me about CISA: Why have we focused so hard on reforming a three-decades-old email privacy law and the Patriot Act if we’re going to pass a wholly separate law that erodes those reforms? That “notwithstanding” provision is the clearest thing in an unclear bill; it says that, regardless of what other laws Congress has passed to limit surveillance, Congress now plans to undo those limitations.
The Senate amendments process, set to begin Tuesday, is a chance to raise some of these issues, but hardly all of them. The Center for Democracy and Technology has provided a handy guide to possible CISA amendments that will be brought up on Tuesday, but even CDT seems implicitly to have acknowledged that CISA as a whole holds a first-class ticket on a bullet train toward passage.
Still, Tuesday’s amendments debate will give us a few more opportunities to raise objections to CISA’s overall surveillance-friendly language and design. And even if the Senate bill passes in some form, it still will need to be harmonized with its House counterpart in conference committee. For those who have problems with CISA’s text or its larger framing of cybersecurity problems, decidethefuture.org will let you email, tweet, or call your senator, while faxbigbrother.com offers a similar service via a decidedly less-fashionable communications platform. Given sufficient outcry, perhaps lawmakers will send CISA back to the drawing board.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.