The Texas church shooting may revive the Apple-FBI iPhone encryption fight.

The FBI Can't Get Into the Texas Church Shooter's iPhone. Here's a Refresher on the Bureau's Fight With Apple.

Future Tense
The Citizen's Guide to the Future
Nov. 8, 2017, 5:44 PM

A Quick Refresher on the FBI's Fight With Apple Over Encryption

Apple CEO Tim Cook (R) and chief design officer Jonathan Ive (L) look at the iPhone X, which has new facial recognition capabilities.

Justin Sullivan/Getty Images

Round One of the FBI’s battle with Apple over iPhone encryption ended in 2016 when the bureau found a workaround. But we may be about to see Round Two.

This time, the battle is over unlocking the iPhone of Devin Kelley, the shooter at the Sutherland Springs, Texas, church. The Washington Post reports that, unable to get into the shooter’s phone to uncover more about his motive, the FBI flew the device to its forensics lab in Quantico, Virginia, where it is investigating alternate pathways to the phone’s data, such as cloud storage backups or linked devices. Like San Bernardino shooter Syed Farook, Kelley used an iPhone, and if the FBI’s initial efforts to access its contents come up empty-handed, it’s possible the government will relitigate the 2016 court battle over whether Apple has an obligation to help law enforcement break into a locked phone.

Fuzzy on the details of this whole government-Apple faceoff? Here’s a brief refresher on the encryption debate.

First of all, what is encryption? In a 2015 overview written for Slate, Danielle Kehl explained:

Encryption is the process of combining the contents of a message (“plaintext”) with a secret password (the encryption “key”) in such a way that scrambles the content into a totally new form (“ciphertext”) that is unintelligible to unauthorized users. Only someone with the correct key can decrypt the information and convert it back into plaintext.
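The scramble-and-recover cycle Kehl describes can be sketched with a toy cipher. This is a bare-bones XOR construction for illustration only, not the AES algorithm iPhones actually use; the point is simply that the same secret key turns plaintext into ciphertext and back:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    # Applying the same operation again with the same key reverses it.
    return bytes(b ^ k for b, k in zip(data, key))

plaintext = b"meet at noon"
key = secrets.token_bytes(len(plaintext))  # randomly generated secret key

ciphertext = xor_cipher(plaintext, key)    # scrambled "ciphertext"
recovered = xor_cipher(ciphertext, key)    # only the correct key converts it back

print(recovered == plaintext)
```

Without the key, the ciphertext is just noise; with it, decryption is trivial, which is exactly the asymmetry that makes a lost or withheld key such an obstacle for investigators.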

Using codes to communicate sensitive information is nothing new (it’s been around for millennia), but encryption breakthroughs in the 1970s cleared the way for today’s data protection. Today’s iPhones use 256-bit AES encryption, which means that each device has a randomly generated, unique key that is one of a nearly unfathomable number (the exact figure is 78 digits long) of possible patterns and therefore virtually impossible to guess. Apple doesn’t keep a copy of this key, so the only way to unscramble the data stored solely on your phone is to enter your personal passcode.
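The "78 digits long" figure is easy to verify: a 256-bit key can take any of 2^256 values. The guesses-per-second rate below is an arbitrary assumption chosen for illustration, but it shows why brute-forcing the key itself is off the table:

```python
# Number of possible 256-bit AES keys.
keyspace = 2 ** 256
digits = len(str(keyspace))
print(digits)  # 78 -- the figure cited in the text

# Even at a wildly optimistic trillion guesses per second,
# exhausting the keyspace would take on the order of 10**57 years.
seconds_per_year = 60 * 60 * 24 * 365
years_to_exhaust = keyspace // (10 ** 12 * seconds_per_year)
print(len(str(years_to_exhaust)))  # 58, i.e. a 58-digit number of years
```

That is why attacks target the short passcode that unlocks the key, not the key itself.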

iPhone users can set alphanumeric passcodes as well as four- or six-digit numeric ones. Apple has also added Touch ID (starting with the iPhone 5s in 2013) and now facial recognition, both of which let users forgo manually entering a passcode in many situations. The more complex the passcode, the harder a phone is to crack, and the attacker’s problem is compounded by tiered time delays that kick in after a certain number of incorrect passcode entries: a one-minute wait after five wrong attempts, a one-hour wait after nine, and so on. Users can also set their phones to erase all data after 10 consecutive wrong attempts. Together, these security measures present serious obstacles to brute-force attacks, in which an attacker tries every possible passcode.
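The tiered lockout just described can be sketched as a simple function. The thresholds below mirror the schedule mentioned in the text; the exact timings vary by iOS version, so treat them as illustrative rather than a specification of Apple's actual behavior:

```python
def delay_after(wrong_attempts: int) -> float:
    """Seconds to wait before the next passcode attempt is allowed.

    Illustrative sketch of the tiered lockout described in the text:
    one minute after five wrong attempts, one hour after nine, and a
    possible device wipe at ten if the user enabled that setting.
    """
    if wrong_attempts >= 10:
        return float("inf")  # device may erase all data at this point
    if wrong_attempts == 9:
        return 3600.0        # one-hour wait
    if wrong_attempts >= 5:
        return 60.0          # one-minute wait
    return 0.0               # early attempts incur no delay

# Even a 4-digit passcode (10,000 combinations) becomes impractical to
# brute-force once the delays start, and the 10-attempt erase option
# ends the attack almost immediately.
print(delay_after(4), delay_after(5), delay_after(9))
```

The escalating delays mean an attacker gets only a handful of free guesses, which is why the FBI wanted Apple to build software that disabled these protections rather than attacking the encryption directly.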

The impressive security of a standard iPhone poses a problem for law enforcement. When the bureau couldn’t crack Syed Farook’s iPhone 5c after he and his wife, Tashfeen Malik, killed 14 in a terrorist attack in 2015, a federal magistrate judge used the All Writs Act of 1789 to order Apple to build software that would make it easier for the FBI to unlock the device without risking erasure of its data. But Apple refused. As CEO Tim Cook wrote in a letter, “We fear that this demand would undermine the very freedoms and liberty our government is meant to protect,” because the “backdoor” that could potentially help law enforcement learn about any contact between Farook and ISIS could also, if stolen, be exploited by hackers. Civil liberties groups like the ACLU and Amnesty International voiced their support of Apple’s stance.

But before the case could end with a loaded legal precedent, the FBI paid $900,000 to an undisclosed third party that helped it bypass the phone’s iOS 9 security.

Phone encryption has remained a point of frustration for law enforcement, however. In October, FBI Director Christopher Wray said that in an 11-month period, his agency had been unable to access half of the 14,000 devices it had targeted. Deputy Attorney General Rod Rosenstein made similar remarks that month when he called for “responsible encryption” at the U.S. Naval Academy. Now, in the wake of another mass shooting and another so-far-inaccessible phone, it looks like the debate over encryption won’t end anytime soon.

Future Tense is a partnership of Slate, New America, and Arizona State University.