Future Tense

Bankers Use Encryption, Too

A (mostly) good congressional report highlights the need to broaden the encryption discussion beyond terrorism.

Texas Rep. Michael McCaul delivers an address in 2015 at the National War College on Fort McNair in Washington, D.C.

Alex Wong/Getty Images

It turns out banks rely on encryption. That’s one of the fairly obvious yet important observations in a report the House Homeland Security Committee put out in late June. Forty-four percent of the encryption market serves the financial services industry, the report explains, which in turn helps make it possible for 51 percent of U.S. adults to bank online.

In the last two years, as part of the Crypto War 2.0 over whether the government should have back doors into software, the degree to which our economy relies on encryption has often been forgotten. This report reminds policymakers that the encryption debate goes well beyond whether the FBI can break into iPhones: It affects whether we can bank, obtain health advice, or shop online.

Republicans on the House Homeland Security Committee completed the report—billed as a “primer” on the debate—over the past year. Based on interviews and meetings with more than 100 people, the report presents a broader spectrum of views, including those of affected industry, than most discussions of encryption have done in recent years. As committee chairman Rep. Michael McCaul explains, “Encryption is too central to our country’s future to answer without a robust dialogue with all the key stakeholders.”

The report presents three prospective ways forward: a bill from California Rep. Ted Lieu that would bar states from banning encryption; a bill from Senate Intelligence Committee leaders Richard Burr and Dianne Feinstein that would effectively mandate back doors nationally; and McCaul’s preferred solution—the creation of an encryption commission—as a “just right” lukewarm porridge between those competing options: “This approach recognizes that equities on all sides of the encryption debate should be taken into consideration.”

The call for consultation with stakeholders beyond the intelligence, law enforcement, and privacy communities is all well and good. But McCaul’s report-justifying-more-reports casts doubt on his promise that a commission will lead to real understanding about encryption.

That’s because from the very first 36 words, the report makes unsubstantiated, factually incorrect claims about terrorists’ use of encryption: “Public engagement on encryption issues surged following the 2015 terrorist attacks in Paris and San Bernardino,” the report claims, “particularly when it became clear that the attackers used encrypted communications to evade detection—a phenomenon known as ‘going dark.’ ”

The statement has grains of truth to it. It is true that engagement on encryption surged following the Paris attacks, largely because intelligence community sources ran around saying (and probably briefing the White House) that encryption must explain why no one realized the attack was coming. It surged further months later when the FBI chose to pick a fight with Apple over San Bernardino shooter Syed Rizwan Farook’s work phone, which—as was clear from the start—had no evidence relating to the attack on it. In truth, though, Crypto War 2.0 started when FBI Director Jim Comey raised the specter of “Going Dark” in October 2014, in response to Apple rolling out default encryption on its iPhones, meaning the surge in attention must be linked to a desire to withhold default encryption from average consumers.

It is also true that ISIS had been using the messaging app Telegram leading up to the Paris attack; in its wake, the company shut down a bunch of channels tied to the group. But there has never been a public claim that the plotters used Telegram to plan their attack.

It is also true that an ISIS recruit who was arrested and interrogated months before the Paris attack told French authorities he had been trained to use a TrueCrypt key and an elaborate dead-drop method to communicate back to Syria.

But it is not true that the Paris attackers used encryption to hide their plot. They used a great many burner phones, a close-knit network (and with it, face-to-face planning), and possibly an unusual Moroccan dialect. One plotter’s phone had an encryption product loaded on it, but he was not using that service.

It is also not true that the San Bernardino attackers used encryption to evade detection. They used physical tools to destroy the phones presumably used to plan the attack. They hid a hard drive via some other, still-unidentified means. But the only known use of encryption—the encryption that came standard on Farook’s work iPhone—was shown, after the FBI paid to bypass it, not to be hiding anything at all. (Now, it’s possible McCaul’s committee has been briefed on encryption involved in these attacks. But “[p]ublic engagement on encryption issues” couldn’t have “surged” after “it became clear that the attackers used encrypted communications to evade detection” if the public wasn’t told about it.)

The report also fails to recognize the degree to which the encryption debate is about criminal enforcement, not hunting terrorists, something Comey himself has admitted in congressional testimony. Indeed, one of the few pieces of data on encryption and law enforcement the report cites—an American Civil Liberties Union report on All Writs Act requests—notes that “investigations into drug related crimes appear to be the leading cause of AWA motions.”

The report also doesn’t consider what happens when law enforcement encounters encryption. “[T]he Office of the District Attorney for New York County reported that investigators struggled with more than 175 cases between September 2014 and March 2016 because they lacked access to digital information,” the report says, repeating one of the most-cited claims about encryption. It doesn’t unpack what “struggled” means in this context, even though other reporting shows that several of these cases still resulted in conviction.

These details matter for several reasons. If law enforcement still manages to find and convict criminals even as more consumers make use of Apple’s default encryption, then the encryption debate is not as urgent as Comey and others have suggested. Those attacking encryption emphasize terrorism so they can maximize fear about the technology. But in reality, encryption is used far more frequently for relatively mundane law enforcement cases or, more importantly, for banking and other legitimate purposes. That changes the debate dramatically.

By all means, the debate on encryption should include more stakeholders than have been included thus far, and this report is worthwhile for making that case. But by spooning up sensational claims about terrorism, the House committee undermines the very point it tries to make.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.