Tech Companies Help Make NSA Surveillance Possible—and They Can Help Stop It, Too

The Citizen's Guide to the Future
Nov. 22, 2013, 3:13 PM


Google Executive Chairman Eric Schmidt speaks at the Chinese University in Hong Kong.

Photo by PHILIPPE LOPEZ/AFP/Getty Images

When Google chairman Eric Schmidt complained recently about the tapping of the company's internal network traffic by U.S. and U.K. intelligence agencies, he certainly had a point. America's National Security Agency and Britain's Government Communications Headquarters were clearly abusing their powers. But Schmidt's complaints had a hollow ring, because Google's own policies have left it—and its users—too open to abuse.

Unless Google, Facebook, Microsoft, and the rest of the Internet "cloud" providers stop using our data for their own commercial purposes, and start offering users a way to keep it truly private, we have to assume our information has been compromised. Unless they create a supplemental business model that allows us to pay directly for services in return for genuine security, they are making clear that their commercial priorities trump our privacy.


That's one reason why you should be highly skeptical of a piece from the New America Foundation’s Weekly Wonk that's gaining some traction. The author, Marvin Ammori, argues that the technology industry is getting unfair criticism for its part in government surveillance. (Disclosure: Ammori is a Future Tense fellow at the New America Foundation; Schmidt is the chairman of the NAF board; I’m a professor at Arizona State University; and Future Tense is a partnership of Slate, New America, and ASU. Life is complicated.) Ammori approvingly cites Schmidt's remarks and condemns the “vilification of the tech companies.” (Another reason you might be skeptical is that Ammori is "an attorney advising companies including Google and Dropbox on surveillance matters," as the tagline notes.)

No one I know in the civil liberties arena fully blames the tech companies for the government's shredding of our privacy and fundamental liberties. We do put the primary responsibility squarely where it belongs: on the government agencies—including the CIA, as we've just learned, and probably others—that treat the Bill of Rights like a doormat in their zeal to know everything at all times.

What we question is the industry's commitment to privacy and liberty when a) some companies, notably the telecoms, are complicit enablers of untrammeled surveillance, and others are outright arms merchants to the warriors against privacy; b) they don't fight back as loudly and visibly as we believe they should; and c) they continue to engineer their services and products in ways that make surveillance easier for governments.

Ammori is absolutely correct that we can't avoid creating a certain amount of data to use modern services. And I strongly agree that we need a combined push by everyone involved to reform the laws and policies that have created this situation. But when he implies, for example, that cloud storage services always need unencrypted access to what we store there, he's simply mistaken.

The NSA/GCHQ penetration of Google's internal network was possible because Google hadn't been encrypting the data moving between servers on what it had assumed—wrongly, as it turns out—to be a private network. By most accounts, the company is collectively furious about the spies' invasions and is now moving fast to create encrypted tunnels for future traffic.

That's useful, but it’s not enough, because Google, like other tech companies, has what's called "plain text" access to the data we create when we use its services. That means our information is always available to the company, and then to anyone with enough government clout. As Google noted in its latest vague transparency report, government demands are growing fast.

So here's where matters stand today. First, if you use services from Google and other cloud companies, you have to trust the companies with all the data they collect about you. (I trust Google more than the others, but there's no reason to believe some future management won't change policies.) Then you have to trust that these companies will successfully prevent government surveillance abuses—the ones they know about, anyway—even when they're not allowed to tell us what they're turning over, or why. Google and other tech companies are challenging the secrecy, but there's no guarantee Congress or the courts will do anything serious to limit current surveillance abuses.

What could Google et al. do to reassure me? Based on the way they do business, not much. Their revenue streams exist because, as the modern cliché goes, you and I aren't the customers; we're the product. These companies collect, store, and massage our information to provide us better experiences, yes, but their business models depend on using that information to get money from other businesses, mainly advertisers.

Could Google and the other companies create business models that would work for all of us? They could at least try. They could encrypt all information flow so that even they couldn't read what's stored on their servers and then charge us for using their services. Two problems: People tend to avoid paying for things they can get for "free" (I use the word advisedly, obviously), and some of the services themselves would be more difficult to provide. If Google made Gmail truly secure, it wouldn't be able to make it so easy for Gmail users to search quickly and efficiently through years’ worth of messages.

More than any of the other big tech companies, Google has worked on security and deserves credit for its efforts. But until it offers email, storage, and other services that are also secure from Google—and there's not a hint that anyone in Mountain View is interested in this—we should recognize where we exist in the ecosystem: as commodities, not customers.

That's not the case with services that put users' privacy and security first rather than treating them as an afterthought. Example: SpiderOak, an online storage service, says it can't decipher what people store there because it doesn't keep the encryption keys. That sounds like a feature, not a bug. If only Dropbox offered the same feature.
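The "provider never holds the keys" model that SpiderOak describes—and that a truly secure Gmail would require—can be sketched in a few lines. This is a toy illustration, not real cryptography: the key derivation uses Python's standard-library PBKDF2, but the XOR stream cipher is for demonstration only (a real client would use an authenticated cipher such as AES-GCM). The passphrase, filenames, and messages are all hypothetical.

```python
import hashlib
import hmac
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # The key is derived on the client from a passphrase the
    # provider never sees, so the provider cannot reconstruct it.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher: XOR the data against an HMAC-based
    # keystream. Applying it twice with the same key decrypts.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256)
        stream.extend(block.digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Client side: encrypt before upload.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
plaintext = b"years of private email"
ciphertext = keystream_xor(key, plaintext)

# The provider stores only the ciphertext (and the public salt),
# so a government demand on the provider yields nothing readable.
assert ciphertext != plaintext

# Only the client, holding the passphrase, can recover the data.
assert keystream_xor(key, ciphertext) == plaintext
```

The trade-off Gillmor notes falls directly out of this design: because the server sees only ciphertext, it cannot index your messages, which is why server-side features like fast search across years of mail become much harder to provide.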

Others are working on ways to help us disguise more of our communications. And Internet standards bodies are, at long last, taking seriously the need for end-to-end encryption of everything that moves online. There's a lot more to do, but at least there's some action.

I'd prefer to be a customer, a paying customer. Maybe that makes me an outlier in today's world, but in light of what we're discovering about the tech industry and the government it serves, willingly or not, I suspect I'm far from alone.

Future Tense is a partnership of Slate, New America, and Arizona State University.

Dan Gillmor teaches digital media literacy at Arizona State University. He is the author of Mediactive.