Is there any part of the U.S. government that is not looking to ramp up its involvement in cybersecurity issues? The latest organization to join the likes of the Department of Defense, Department of Homeland Security, Department of the Treasury, Federal Trade Commission, and National Institute of Standards and Technology in the quest to secure cyberspace is the Federal Communications Commission, whose chairman, Tom Wheeler, gave a speech at the American Enterprise Institute on June 12 outlining the need for a “new paradigm for cyber readiness.” There’s a cybersecurity turf war in the making among government departments and agencies, but no one seems to be offering concrete actions, steps, or plans for addressing these issues.
Take the FCC’s “new paradigm”—a “private sector–led effort” that Wheeler said “must be more dynamic than traditional regulation and more measurably effective than blindly trusting the market or voluntary best practices.” He added that it “must be real and meaningful. It has to work.” In other words: It has to be better than what we have now. Now, Wheeler’s not wrong—addressing cybersecurity will require more dynamic defense and a better grasp of security metrics and measurement than we currently have—but so far there are no specifics, no direction, no clear vision for how he, or anyone else in the U.S. government, plans to achieve those goals. And that’s a pity, because he, the FCC, and the other government actors interested in helping industry protect against cybersecurity threats are actually in a position to add some pretty vital pieces to this puzzle. But those pieces have nothing to do with abstract concepts of paradigms and frameworks—they have to do with facts and data.
So what should the government’s role—or roles—be here? A government agency that collected data from firms about the threats they face, the security incidents they experience, the vulnerabilities they see exploited, and the techniques and tools they use—both successfully and unsuccessfully—could be a tremendous force for good. It could use that data to talk in very specific terms about what works and what doesn’t when it comes to preventing and detecting intrusions, about what engineering design decisions might help mitigate the most serious threats and vulnerabilities we see, and about where researchers should be focusing their attention.
It’s true that such data collection and analysis efforts could be industry-led, but for that to happen, companies would have to share data voluntarily with their competitors and release information about security incidents that could be embarrassing and even damaging to their reputations. The government could facilitate this in a number of ways—anonymizing reported data, acting as an aggregator for data from competing companies, setting a standardized template for what information should be reported and how. Existing and proposed data breach laws tend to focus on notifying and protecting consumers but often do not include measures to help us learn from those incidents about what went wrong and what could be done better next time. Protecting people from credit card fraud and identity theft is important in the wake of a security breach, but so is digging into the root cause of the breach and assessing what defenses were in place when it occurred and how they were bypassed. Companies may choose to do that analysis individually to learn from their mistakes, but that information has much broader value when it can be pooled with data about breaches at other firms to find trends and inform a wider audience.
There’s a tremendous amount that a government actor like the FCC could potentially contribute in this space, which is why it’s so frustrating to see it instead wielding policy and regulatory interventions as a threat rather than an opportunity. Of course, companies are always reluctant to be regulated, and nothing Wheeler says is likely to change that instinct. All the same, there’s something demoralizing about seeing that perspective so strongly reinforced by the regulators themselves—especially in an area where the government has so much to offer.
Wheeler calls out the “communications sector” as the target of his remarks but doesn’t specify exactly which kinds of firms are included in his message, or whether they go beyond the service providers typically regulated by the FCC. Many companies, across all sectors, are already subject to data breach notification laws in several states, and the Obama administration has pushed for a federal standard governing incident reporting in the wake of the Target breach, but the focus of those efforts has not been on learning about security threats and trends. Instead, they aim to help consumers receive prompt notification when their information may have been released. Speaking to the Senate Judiciary Committee in February to encourage a federal standard, Assistant Attorney General Mythili Raman said: “American consumers should know when they are at risk of identity theft or other harms because of a data security breach.” That’s a valuable role for the government to play, but it’s not the only one—and it’s not the only thing that needs to happen after a security breach.
In encouraging companies—and particularly the large telecommunications companies—to usher in this new paradigm, Wheeler warned: “It has to work. The commission’s commitment to market accountability will help ensure that it does work. And, while I am confident that it will work, we must be ready with alternatives if it doesn’t.” The implied message to industry: You fix this, or we’ll fix it for you. That strategy of frightening companies into action may play effectively on industry’s fear of regulation, but it doesn’t set the stage for a future in which problems are addressed with a mix of private and public solutions. It perpetuates the idea that the government has nothing positive to offer industry when it comes to addressing computer security threats, and it reinforces the feeling that industry, rather than being helped along, is being bullied.
Wheeler immediately followed those comments by cautioning headline writers not to start scrawling “FCC wants to regulate cyber” across the front page. And certainly that’s not the message of his talk—if anything, his comments suggest an FCC that desperately does not want to regulate in this area. But the press did pick up on the threat, and some of the headlines that resulted were not necessarily more encouraging: “FCC: Companies Must Step Up to Improve Cybersecurity or Else ...” or “FCC Throws Down Cyber Gauntlet to Communications Industry.”
There are lots of potentially promising hints in Wheeler’s speech as well—especially in his emphasis on a truly critical question that is often overlooked by public and private actors alike: “How will we measure success or failure” when it comes to assessing cybersecurity efforts?
“This is the toughest and most important question that our stakeholders have to answer,” Wheeler said. He’s right about how tough and how important it is, but he’s wrong that it’s the sole responsibility of the FCC’s stakeholders. Any serious measurement endeavor will require considerably more data than we currently have on the threats we face and the effectiveness of our defenses against them. If the FCC, or any other government body, wants to drive things forward in this space, it should be thinking about whether there’s an active role for it to play in collecting, aggregating, and analyzing that data—not as a way to terrorize firms, but as a way to help them.
Government participation in this huge national challenge should be a promise, not a threat, and instead of vague paradigms and fuzzy rhetoric, it should be focused on cold, clear data.
This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.