There is one secret Edward Snowden spilled that even the most hard-boiled anti-privacy security operative can’t ignore: The National Security Agency is broken. It is broken not just because it somehow let a contractor steal a voluminous portion of its secrets, but because many of those secrets paint a picture of a dysfunctional agency. The NSA not only can’t stop a leaker like Snowden; it’s uncertain that the NSA can do a good chunk of what’s in its job description.
Yet the agency has many brilliant projects under its belt. First and foremost, there is Stuxnet, the computer worm that infected over half of Iran’s computers and made its way into their Internet-isolated nuclear facilities—via some unknowing peon’s thumb drive—to sabotage Iran’s uranium-enrichment centrifuges and set their nuclear development back by months if not years. Based on White House leaks, David Sanger has reported that the NSA and Israel’s Unit 8200 co-developed Stuxnet. From a technical standpoint, it is the single most brilliant piece of malware I have ever seen, precision-targeted yet flexible enough to spread through multiple vectors as needed.
The NSA has always been on the cutting edge of encryption, designing the widespread Secure Hash Algorithms and driving standardization of Suite B. I entirely believe that they may have made a cryptographic breakthrough allowing them to crack most encryption in use today. And their intelligence work within local combat contexts, as with the Iraq surge in 2007, has frequently been respectable.
The question, then, is how an organization capable of such technical brilliance could also fall prey to organizational incompetence such as giving unaudited root access to a sysadmin contractor like Edward Snowden. In light of the agency’s achievements and the default rah-rah attitude toward issues of national defense, it is tempting to give the NSA the benefit of the doubt when a Snowden happens. That would be wrong. As I saw in my time at Microsoft, great technical skill can exist uneasily within organizational structures that kill innovation.
Poor internal security is nothing new for the NSA, though it’s never suffered the humiliation of someone putting a Mickey Mouse sticker over his ID picture and waltzing in, as its British counterpart, the Government Communications Headquarters, did. That story comes from James Bamford’s 1983 book The Puzzle Palace, which also chronicles the 1980s-era paranoia at the NSA when polygraph tests of employees were instituted. Thousands of employees were “strapped to a machine and asked whether [they] have been selling secrets to the Russians or leaking information to the press,” Bamford wrote. Amazingly, the NSA still uses polygraph tests, despite decades of research showing their unreliability and their inadmissibility in court; NSA whistle-blower Russell Tice demonstrated last year just how easy it is to beat them. That the NSA has stuck with the polygraph smacks of desperation, of a need to convince someone that security is being taken seriously—through pageantry rather than effectiveness. It’s the same reasoning that keeps us taking off our shoes at airport security.
What the NSA appears to be, then, is a sclerotic organization with individual pockets of brilliance. Agencywide infrastructure appears to be its most difficult challenge, which would account for its admitted inability to process the dragnet of data it sweeps up. From all indications, from Dana Priest and William Arkin’s 2011 book Top Secret America to Snowden’s documents, the NSA has no trouble collecting petabytes of data, but is unable to organize it effectively. In the Washington Post in 2010, Arkin and Priest wrote, “Every day, collection systems at the National Security Agency intercept and store 1.7 billion e-mails, phone calls and other types of communications. The NSA sorts a fraction of those into 70 separate databases. The same problem bedevils every other intelligence agency, none of which have enough analysts and translators for all this work.” In Top Secret America, a senior intelligence official complains, “The data was outdated by the time it had arrived.” (This very Slate article will contribute to clogging the NSA’s data pipes.) In his 2009 book The Secret Sentry: The Untold History of the National Security Agency, Matthew Aid describes modernization plans that are constantly put off and an ever-increasing flood of information that the NSA is forever trying to get under control, even as it eagerly gulps down more.
The difficulty in information retrieval, or IR, the general theoretical field underpinning search engines, is primarily in the analysis and indexing of the data. It will not do just to have the raw data sitting around: It must be preprocessed en masse—using immensely complex algorithms—and filed away in a form amenable to being searched in exactly the ways the NSA wants to search it. Searching the raw data would be like trying to find a name in an unalphabetized phone book. For those who think Google just keeps duplicate copies of every Web page in their original form and scans them when someone does a search—I assure you, this is not how it works, ever. Every piece of intelligence needs to be analyzed, annotated, and classified as it is obtained, so that large-scale comparisons and analyses can later be performed with ease.
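The preprocessing step can be made concrete with a toy example. Below is a minimal sketch, in Python, of an inverted index, the core data structure behind search engines: the documents are invented stand-ins for illustration, and real systems layer tokenization, ranking, and distributed storage on top of this basic idea. This is not how the NSA or Google actually implements anything; it simply shows why indexing up front beats scanning raw data at query time.

```python
from collections import defaultdict

# Toy corpus standing in for raw intercepts; contents are invented.
documents = {
    1: "centrifuge telemetry from the enrichment site",
    2: "routine telemetry and status reports",
    3: "enrichment schedule and site access logs",
}

# Build the index once, up front: map each term to the set of documents
# containing it. This is the "alphabetizing of the phone book."
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(*terms):
    """Return IDs of documents containing every query term."""
    results = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*results) if results else set()

print(sorted(search("enrichment")))         # documents 1 and 3
print(sorted(search("telemetry", "site")))  # document 1 only
```

A query now touches only the (small) sets of matching documents rather than every byte ever collected, which is why the indexing and analysis, not the raw storage, is where the hard work lies.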
Google innovated brilliantly in indexing and analysis while under the constant threat of capitalist competition. The NSA has no competitors as such, which is one of many reasons why it trails Google quite badly. As law professor and technologist James Grimmelmann puts it, “The NSA has mountains of data and no clear sense of how to manage it effectively. The attitude is that there must be needles in there, and we’ll figure out a way to find them later.”