Future Tense

How Apple’s Stand Against the FBI Could Backfire

Tim Cook may have chosen the wrong time to fight the government on its request for access to a San Bernardino shooter’s iPhone.

Photo: Handout photos provided by the FBI show Syed Rizwan Farook and Tashfeen Malik at unspecified dates and locations. FBI via Getty Images.

The skirmish between Apple and the FBI is quickly escalating to a battle royal, a fight to the finish over lofty principles and national values, involving not just this company and this bureau but all of Silicon Valley and the entire realm of U.S. intelligence gathering.

Certainly the two combatants are presenting the case in these terms, Apple and its supporters declaring that the future of encryption and privacy rides on the outcome, the FBI (as well as the National Security Agency and several police chiefs) claiming that well-trod paths to capturing criminals and terrorists will be closed off if the computer giant gets its way.

Both sides are overstating the stakes; both sides are making disingenuous arguments. Yet I think Tim Cook, Apple’s swaggering or courageous chief executive, has miscalculated. The case he’s chosen to fight in the courts—which seems destined for appeal all the way to the Supreme Court—is a weak test case, from his vantage. And in the unlikely event that the justices hand him a victory, he will likely trigger a political fight that he, and every ideal he stands for, will probably lose.

The gist of the case is that the FBI wants full access to the iPhone 5C used by Syed Farook, one of the San Bernardino, California, shooters. The bureau isn’t asking Apple to unlock the phone. Because of the way Apple designed the software (users set their own security codes), the company’s engineers couldn’t unlock it even if they wanted to. In a clever workaround, the FBI has asked Apple to override the feature that wipes out all of the phone’s data after someone enters an incorrect passcode 10 times. With that obstacle lifted, the FBI (or some other government agency) could use “brute force” methods—applying software that can generate thousands of alphanumeric guesses per second—to break the code.
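
To make the mechanics concrete, here is a minimal Python sketch of the two pieces in play: a toy passcode lock that wipes itself after too many wrong guesses, and a brute-force loop run against it. Everything here is hypothetical. The HypotheticalPhone class and the 4-digit passcode are illustrations only, and a real iPhone adds escalating delays and hardware-bound key derivation that no simple sketch captures.

```python
from typing import Optional
import itertools

class HypotheticalPhone:
    """Toy model of a passcode lock with an auto-wipe feature.

    Purely illustrative: a real iPhone ties its passcode to a
    hardware key and imposes escalating delays between attempts,
    none of which is modeled here.
    """

    def __init__(self, passcode: str, wipe_after: int = 10):
        self._passcode = passcode
        self._wipe_after = wipe_after   # the feature the FBI wants overridden
        self._failures = 0
        self.wiped = False              # True once the data is destroyed

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self._wipe_after:
            self.wiped = True           # 10th wrong guess: wipe everything
        return False

def brute_force(phone: HypotheticalPhone) -> Optional[str]:
    """Try every 4-digit code, 0000 through 9999, in order."""
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if phone.try_unlock(guess):
            return guess
        if phone.wiped:
            return None                 # the wipe feature defeated the attack
    return None

# With the 10-try limit in place, brute force fails almost at once:
print(brute_force(HypotheticalPhone("7391")))                      # None
# With the limit effectively lifted, as the court order demands, it succeeds:
print(brute_force(HypotheticalPhone("7391", wipe_after=10 ** 7)))  # 7391
```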

Apple refused to cooperate with the FBI, so the bureau took it to court, where a magistrate judge ordered the company to comply. On Tuesday, Cook wrote an open letter to customers, arguing that the government’s campaign has gone too far. By ordering Apple to write new software that makes one of its own systems vulnerable, he argued, the government is making all Apple systems, everywhere, vulnerable. Someone could duplicate the software. “In the wrong hands,” Cook wrote, “this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.”

FBI officials have countered that they are not asking Apple to carve out a “back door” into all Apple phones. Their request would affect only this one phone. If Apple took precautions in applying the software, it wouldn’t necessarily leak out.

A retired intelligence official who dealt with software security outlined one way that this could be done. Apple’s engineers, he said, wouldn’t have to make elaborate changes to the code or operating system. Rather, they could simply reset the existing security feature so that, instead of wiping out the phone’s data after 10 incorrect passcodes, it did so after 1 million tries or 10 million or whatever number was necessary.
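
Some back-of-the-envelope arithmetic suggests why raising the threshold would be enough. The guess rate below is an assumption, extrapolated from the “thousands of guesses per second” figure above; the real rate would depend on the hardware and on whatever delays Apple’s software imposes.

```python
# Back-of-the-envelope timing for brute force once the wipe threshold
# is raised past the size of the passcode space. The guess rate is an
# assumption, not a measured figure.
GUESSES_PER_SECOND = 5_000  # "thousands of guesses per second"

for length in (4, 6):
    space = 10 ** length    # all numeric passcodes of this length
    worst_case = space / GUESSES_PER_SECOND
    print(f"{length}-digit passcodes: {space:,} codes, "
          f"worst case {worst_case:,.0f} seconds")

# 4-digit passcodes: 10,000 codes, worst case 2 seconds
# 6-digit passcodes: 1,000,000 codes, worst case 200 seconds
```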

More than that, he continued, Apple’s engineers could reset the feature in their own lab, through hard-wired controllers, with no government official present. They could even do the brute-force code-breaking themselves (password-cracking software is commercially available), after which they could hand over the contents that the FBI wants, then destroy the phone. Problem solved, no leakage.

I asked a few other retired intelligence officials, and some computer-security specialists, whether this plan was feasible. They thought it was. I presented the idea to a senior executive at a software-security company who fervently endorses Apple’s side in the case. He too thought this might be a reasonably safe way to handle the data.

However, this executive, like Cook, is still disturbed by the larger issues of the case. Being compelled to unlock a phone, so the government can look inside, is one thing, he said. Being compelled to alter or write new software, in order to undermine a device’s security, is another matter entirely.

But are the two so different? As Shane Harris reported in the Daily Beast, Apple has unlocked phones, at the government’s request or under court order, at least 70 times since 2008. In doing so, Apple implicitly accepted the principle that the government has the right, under court-approved circumstances, to get inside Apple-made phones.

Cook designed the iOS 8 operating system in 2014 precisely to evade further requests: Under the new system, the user sets the code, so if the government asks Apple executives to unlock a phone, they can honestly say they can’t. Now the FBI has devised a way around the problem by asking Apple to shut off the data-wipe feature, so the phone can be unlocked with brute force. The technique is different, but the outcome—letting the government into a phone designed by Apple—is the same. Cook may have changed his mind about the government’s right to his products’—his customers’—contents; he may regret ever cooperating in the first place. But that doesn’t negate the fact that Apple accepted the principle in the past, and the company’s identity and ownership haven’t changed in the interim.

Another weakness in Cook’s argument, from a legal point of view, is that the phone was bought by Farook’s employer, the San Bernardino County Department of Public Health. And county officials—the phone’s owners—have consented to an FBI search of the phone. In other words, as George Washington University law professor Orin Kerr argued in a balanced, well-reasoned analysis, “There are no Fourth Amendment rights in this case.”

There is one possible legal opening for Apple. The government’s case rests on the All Writs Act, passed in 1789, which has frequently been invoked to justify wiretap orders. Recent Supreme Court rulings have exempted companies from complying with these orders if they have “distance” from the case in question, if compliance would impose an “unreasonable burden,” or if their cooperation is unnecessary.

Apple’s attorneys have argued that the company does have “distance” in cases such as this, because it has no interest in the phone once it’s sold. This argument seems thin, given Apple’s past cooperation in unlocking 70 phones. But does rewriting a security feature impose an “unreasonable burden”? Hardly anyone outside Apple can say for certain. The term is deliberately loose. If Apple’s lawyers can’t prove a physical burden, they might argue that compliance would impose a burden on the company’s reputation—its brand and, therefore, its revenue—as a stern protector of customers’ privacy. Again, the 70 prior cases put a damper on the argument, but a good lawyer might make a more convincing case than I’ve anticipated for the distinction between unlocking a phone and undermining a security feature.

But the third exception is most interesting: Does the FBI really need this phone? As has been reported, the FBI already obtained the phone’s “metadata”—the record of phone numbers it called, on what dates, at what times, and for how long—from the cellular service provider, which stores this material routinely. If Farook talked with foreign terrorists from this phone, it would show up in the metadata. In fact, though, Adm. Michael Rogers, the NSA director, has said the metadata revealed no foreign connections. (Farook might have talked to foreigners on one of the two other phones he had, but he destroyed them before the shooting, so no one knows.)

It’s not clear what the FBI is looking for, but it doesn’t really matter. This was a phone used by a mass murderer who’d expressed allegiance to a foreign terrorist organization. A thorough search through every closet, back room, and, yes, telephone is justifiable.

In any case, the FBI doesn’t seem to be in a rush. If the matter were urgent, other government agencies, notably the NSA, could have obtained a Foreign Intelligence Surveillance Act court order or permission from the attorney general to crack the phone in a number of exotic ways. There are several companies that employ gray-hat hackers who can do amazing things to seemingly impenetrable hardware and software.

Richard Clarke, the former White House counterterrorism and cyber policy chief under Presidents Bill Clinton and (briefly) George W. Bush, wrote on his Facebook page Thursday:

On the Apple v FBI debate, two possibilities to consider: 1) NSA could break in to the phone, but the FBI is trying to make Apple do it to establish a legal precedent and 2) FBI does not really need this phone for its investigation, but thinks the San Bernardino attack was so well publicized that it will gain public sympathy for the FBI’s position on encryption by using the emotionally charged case to make its point.

There might be something to this. Bloomberg Businessweek reported Friday on a secret “decision memo,” drafted by the National Security Council in late 2015, instructing agencies to find ways to counter strengthened encryption systems—such as those in Apple’s new iPhones—including opportunities to change existing laws.

The Apple case on the docket now might well be a golden opportunity, and Tim Cook may have fallen into the administration’s trap. Cook invokes principles of privacy, and clearly, he believes in them deeply. On the other hand, Donald Trump, speaking of the Apple executives’ resistance to an FBI investigation of a mass-murdering terrorist, yelled, “Who do they think they are?” And it’s a fair bet that a majority of Americans, even those who think little of Trump, would agree with this sentiment.

Certainly a majority of politicians would, especially in an election year. If Apple wins this court case, someone in Congress will no doubt introduce a bill that greatly expands the government’s powers in this area and severely restricts the ability of hardware and software companies to protect their customers’ privacy. And in the present climate, there’s a good chance the bill will pass. This may be why so few of Cook’s colleagues and competitors—many of whom have government contracts—have supported his position more than tepidly, much less rallied to his battle cry.

One former cyber official, who agrees with Apple’s position, told me, “I don’t understand why Cook didn’t just cooperate very quietly. This case seems pretty clear-cut. His resistance doesn’t have what they call ‘good optics.’ ” Cook, or anyway his allies in the industry, may soon come to feel the same way.


This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture.