Can the Government Force Suspects To Decrypt Incriminating Files?
Courts must determine whether the Fifth Amendment applies to encryption.
In October 2010, law enforcement agents pursuing a child pornography investigation tracked a Florida man suspected of sharing illegal images to a hotel room in California. After obtaining a search warrant, they raided the room, seizing computers and hard drives with nearly five terabytes of total storage capacity. They soon hit a roadblock, however: Portions of the hard drives had been encrypted and were unreadable without a password. The suspect refused to decrypt the drives, and a federal district court in Florida held him in contempt and ordered him incarcerated. But late last month, a federal appeals court overturned the contempt holding, ruling that the suspect’s refusal was protected under the Fifth Amendment right against self-incrimination.
What happens when the government’s desire to access a suspect’s encrypted electronic documents runs up against the Fifth Amendment? As with so many of today’s technology-related constitutional questions, the answers are complex, evolving, and sometimes contradictory. However, across the relatively small set of court rulings that have directly addressed this issue, a few key things stand out.
Courts have consistently held that defendants cannot be forced to divulge passwords. In practice, though, that protection can matter less than it sounds: a defendant can sometimes be forced to use a decryption password—without divulging it—and then to provide the files in readable form. Whether the government can compel decryption in this manner depends on a legal doctrine called “foregone conclusion” that was first articulated in a 1976 Supreme Court ruling relating to paper documents in a tax fraud case.
Under the “foregone conclusion” doctrine as applied to digital documents, handing over files is not considered testimony if the government already knows that the files exist and what machines they live on. And when there is no testimony, the protection of the Fifth Amendment’s self-incrimination clause is not available. Prosecutors with specific information about the existence and location of files on encrypted hard drives are more likely to convince a court to order a suspect to decrypt them.
In another child pornography case, officials at a Vermont border crossing inspected a laptop in a car entering the United States from Canada. Upon seeing filenames suggesting illegal images, they seized the computer and arrested its owner. The laptop turned out to be encrypted, and in February 2009 a federal district court judge ordered the defendant to reveal its contents, largely on the grounds that the government already knew it contained incriminating files. The defendant complied and was later convicted.
By contrast, an ongoing mortgage fraud case in Colorado involved a more nuanced set of issues. Investigators seized an encrypted laptop and subsequently recorded a phone conversation in which the defendant suggested that it contained incriminating files. In January, a judge ordered the defendant to decrypt the laptop’s hard drive—but also acknowledged that the investigators did not know the “specific content of any specific documents” that might be found. The order became effectively moot in late February when authorities found a way to decrypt the drive without the defendant’s help.
Let’s return now to the Florida man who refused to decrypt his seized hard drives. In that case the government suspected, but did not know with certainty, that the hard drives contained incriminating files. As Judge Gerald Bard Tjoflat, writing for a three-judge panel of the 11th U.S. Circuit Court of Appeals, explained in the decision, “We find no support in the record for the conclusion that the Government, at the time it sought to compel production, knew to any degree of particularity what, if anything, was hidden behind the encrypted wall.”
This ruling has been hailed as a victory for constitutional rights, and in a sense, it is. But there is also a potential dark side that we would be remiss not to acknowledge. Do we really want to provide terrorists and human traffickers with an impenetrable legal shield for documents that might otherwise incriminate them? Is the greater good really served if a rape or murder suspect escapes conviction because he hid evidence—for example, digital maps of a victim’s address—behind encryption? Could this legal framework allow encrypted, illegal images of children to be stored and exchanged with impunity?
These questions illustrate the contemporary challenges of determining the scope of the Fifth Amendment. It was ratified in 1791 and now is being applied, with the aid of a 1970s-era legal precedent, to 21st-century digital encryption. In the pre-digital age, there was a distinct boundary between the information that resided only in our minds and the information that we committed to paper. The former was afforded strong constitutional protection; the latter, much less so. But modern encryption blurs that boundary by enabling the storage of essentially infinite amounts of information that can be unlocked only by passwords stored in our minds. (If only all criminals hid Post-Its with their passwords under their keyboards.) Put another way, encryption creates the possibility that our digital data and devices will be viewed, in the legal sense, as extensions of ourselves.
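The distinction the paragraph draws can be made concrete with a toy sketch. This is deliberately not real cryptography (a simple XOR stream cipher, with a hypothetical password and salt chosen for illustration): the point is only that a key derived from a remembered password lets the owner produce readable plaintext on demand without ever disclosing the password itself.

```python
import hashlib

def derive_keystream(password: str, salt: bytes, length: int) -> bytes:
    # Derive a byte stream from the remembered password using
    # PBKDF2 from the standard library.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000,
                               dklen=length)

def xor_cipher(data: bytes, password: str, salt: bytes) -> bytes:
    # Toy XOR stream cipher: encryption and decryption are the
    # same operation. Illustrative only; do not use in practice.
    keystream = derive_keystream(password, salt, len(data))
    return bytes(a ^ b for a, b in zip(data, keystream))

salt = b"fixed-demo-salt"          # hypothetical value for the sketch
password = "only-in-my-mind"       # hypothetical remembered password
secret = b"document stored only on the drive"

ciphertext = xor_cipher(secret, password, salt)
# Without the password, the ciphertext is unreadable noise.
# With it, the owner can hand over readable files while the
# password itself never leaves his or her head:
plaintext = xor_cipher(ciphertext, password, salt)
assert plaintext == secret
```

In other words, compelled decryption asks only for the second step's output, not for the password, which is why courts treat the two demands differently.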
It is, of course, too early to know what the Supreme Court will say on this matter. But at some point, it will weigh in. And when it does, what is the proper way to handle the intersection of encryption and the Fifth Amendment? The solution will probably require updating the “foregone conclusion” doctrine. In particular, its requirement related to the location of incriminating documents is not well matched to a world with billions of electronic devices, and in which cloud computing is rapidly becoming the norm. Instead, a requirement that the government must be able to show possession of incriminating documents before being able to compel their decryption might be more appropriate for the 21st century.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.
John Villasenor is a nonresident senior fellow at the Brookings Institution and a professor of electrical engineering and public policy at UCLA. Follow him on Twitter.