As federal law enforcement and intelligence agencies continue to lobby for access to encrypted information on the devices we use every day, they are coming up with more, and more radical, legal arguments. In the most brazen current example, the Justice Department asserts that since device “sellers” like Apple only license the software, they can be required to divulge the contents of data on the “buyers’ ” phones.
If law enforcement wins with this logic, the consequences will be far-reaching. But one foreseeable, if almost certainly unintended, effect would be to give a huge boost to the open-source and free software movements. Before I explain why, here's some background.
The case at hand, in which the government seized an iPhone during an investigation into alleged drug distribution, is not directly about law enforcement's ardent wish that technology companies build backdoors into their software and devices. In this case, a backdoor of sorts already exists: the seized iPhone runs Apple's older iOS 7 mobile operating system, and Apple can unlock those devices even if it doesn't know the passcode. (With newer iPhones running iOS 8 and above, and with many newer Android devices, neither Apple nor Google can unlock the phones without the customers' cooperation.) The company has said the government's frequent requests have imposed a burden on its staff and have harmed its reputation with customers. And a magistrate judge has pointedly asked the Justice Department to show that the ancient law under which it's seeking Apple's cooperation, the All Writs Act, even permits such orders.
The DOJ has insisted it does have the right and has escalated the battle. Its latest legal brief says that a) Apple licenses the software to customers even if it sells the phones; b) “this software is thwarting the execution of the warrant”; and c) the company “cannot reap the legal benefits of licensing its software in this manner and then later disclaim any ownership or obligation to assist law enforcement when that same software plays a critical role in thwarting execution of a search warrant.”
Think about the implications of this. Almost all modern technology devices include software that has this kind of license. Meanwhile, more and more of the objects we touch and use every day have software and networking built into them. Almost all of them grant licenses to use the software, not actual ownership of anything but the hardware.
As keen technology and security observer (and friend) Cory Doctorow observes: “If the DoJ establishes the precedent that a company's continued ownership interest in a product after it is sold obliges the company to act as agents of the state, this could ripple out to cars and pacemakers, voting machines and tea-kettles, thermostats and CCTVs and door locks and every other device with embedded software.”
Moreover, Jennifer Granick and Riana Pfefferkorn of the Stanford Center for Internet and Society note that “the blow to users’ trust in their encrypted devices, services, and products would be little different than if Apple and other companies were legally required to design backdoors into their encryption mechanisms (an idea the government just can’t seem to drop, its assurances in this brief notwithstanding).”
Maybe this is a stretch, but I can imagine a bright side to this latest overreach by the government—at least for people who believe in the value of software that doesn't have such onerous legal restrictions on its use. The more the government insists that it has special access rights to commercial software—and the more it lobbies for commercial vendors to install backdoors—the more likely it may be that technology users move to software and devices that by definition can't be owned this way.
It has been widely reported that the NSA asked Linus Torvalds, who's best known for his stewardship of the Linux kernel project, to install backdoors in Linux. By all accounts he refused. And because the Linux programming code is open to inspection, such a modification in a widely vetted system would almost certainly be spotted at some point.
I use the Ubuntu flavor of Linux on my laptop computer, the open-source CyanogenMod on my primary phone, and as much nonproprietary application software as possible on both. I do this in part because I believe in more control for the user and less control by the companies that sell me the hardware and software. But as governments insist on turning vendors into spies, I'll do what I can to use software that's owned by the community, not those vendors.
Open-source and free software is not a bulletproof solution, of course. The notorious “Heartbleed” security flaw in OpenSSL, which affected millions of users, apparently stemmed from a programming error that no one noticed for far too long. But it was noticed, eventually.
Nothing can be assumed to be perfectly secure, but the government can be assumed to be relentless in its drive to have access to everything we say and do. As law enforcement insists that vendors are its pawns because they own the software on our devices, I can't think of a better advertisement for community-built software.