Future Tense

The Real Lesson of the OPM Breach

President Barack Obama chats with Chinese President Xi Jinping on Sept. 24, 2015, in Washington, D.C. Photo by Rod Lamkey-Pool/Getty Images

As Chinese President Xi Jinping returns home from Washington, a cybersecurity “understanding” has been announced between China and the United States. Unfortunately, it’s unclear whether the understanding would have prevented cybershenanigans such as the likely Chinese copying of more than 21 million records from the U.S. Office of Personnel Management. In that one incident alone, a foreign government may have copied all personnel data for every U.S. federal employee and retiree: their families, psychological profiles, financial profiles, college roommates, and beyond. It also appears to have obtained fingerprints for more than 5 million people. The full implications of the breach may take years to understand.

In response, the U.S. government initiated a “cybersecurity sprint” to help federal agencies upgrade their digital security and pursued a diplomatic accord with the Chinese government. But the diplomatic discussions overlook a deeper management issue: Why was the Office of Personnel Management running its own servers in the first place?

The Office of Personnel Management has only roughly 7,000 employees, constituting less than 0.3 percent of the total federal government. Its mission is “recruiting, retaining and honoring a world-class force to serve the American people”—there’s nothing remotely high-tech in that. Why, then, was OPM asked to manage its own data servers? Why should it be tasked with developing deep in-house expertise on data security? Why was it put in a position where success was so difficult and where failure carried such sweeping consequences?

If the Office of Personnel Management were a company, this story would have played out very differently. OPM would almost certainly have outsourced its data management to a cloud provider that specializes in securely managing data, letting those professionals do what they do best. The buzzwords “cloud computing” and “managed servers” are all about companies moving to that model, which is typically both more secure and less expensive than running your own servers. Netflix and Dropbox, for example, generally do not maintain their own servers but rather run on computers managed by Amazon Web Services. OPM, likewise, would not have had to run its own core data-management operation, a mission for which it was not equipped, and could instead have handed that task to professionals.

Now, government has unique needs, and perhaps a commercial vendor like Amazon is not always the right fit for federal agencies. Private providers have been compromised in the past (though they have made great strides), and there’s an argument for the government to manage its critical data in-house. But this, too, is a solvable challenge.

A single government tech services team could support many federal agencies by managing their servers, protecting their data, and ensuring robust security practices. The work would be centralized, with unified standards and the ability to roll out improvements more easily across all participating agencies. For example, if an issue like the Heartbleed bug is uncovered, the centralized team could patch all of its managed servers at once, instead of waiting for each agency to update itself.

Such a tech team wouldn’t need to reinvent the wheel. Many federal agencies, including the Social Security Administration and the Commerce Department, already have some solid data-management techniques. One of those teams could perhaps be resourced and empowered to scale its work into a service for other agencies. Or a new “Office of Technology Management” could be built from scratch, with personnel carefully recruited from the most successful federal agency teams. Or a commercial provider could be certified and given responsibility for the task at scale.

Adopting this approach doesn’t mean an agency loses control of its own technology. Data management can be centralized while agencies still deploy their own applications to interact with the data, a common design pattern in Web development. Of course, this works only if the centralized bodies are thoughtful and solicit frequent feedback from participating agencies. And the consequences of a future breach would be even harsher with all the eggs in one basket, though the basket would be fortified as never before. After all, there’s a reason gold bars used to be stored, centralized, in Fort Knox.
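To make that design pattern a little more concrete, here is a minimal sketch in Python. The names (CentralRecordsService, OPMPortal) and data fields are entirely hypothetical and stand in for the idea, not for any agency’s actual system: one centrally managed, professionally secured data service, and a thin agency-specific front end that never runs its own servers or stores the underlying records itself.

from typing import Optional


class CentralRecordsService:
    """Stands in for a centrally managed, hardened data store.
    Storage, encryption, and patching would all be handled here, once, for everyone."""

    def __init__(self) -> None:
        self._records: dict = {}

    def put(self, employee_id: str, record: dict) -> None:
        self._records[employee_id] = record

    def get(self, employee_id: str) -> Optional[dict]:
        return self._records.get(employee_id)


class OPMPortal:
    """An agency-specific front end: it owns its own user experience,
    but every data access goes through the central service's API."""

    def __init__(self, backend: CentralRecordsService) -> None:
        self._backend = backend

    def onboard_employee(self, employee_id: str, name: str) -> None:
        self._backend.put(employee_id, {"name": name, "agency": "OPM"})

    def lookup(self, employee_id: str) -> str:
        record = self._backend.get(employee_id)
        return record["name"] if record else "not found"


if __name__ == "__main__":
    shared_backend = CentralRecordsService()   # one basket, heavily fortified
    opm = OPMPortal(shared_backend)            # one of many possible agency front ends
    opm.onboard_employee("e-001", "Jane Doe")
    print(opm.lookup("e-001"))                 # -> Jane Doe

The point of the sketch is the separation of duties: the agency code above controls how its people interact with the data, while everything about storing and protecting the data lives behind the central service.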

The government still has time to get this right. As far as we know, vast amounts of sensitive data remain private: data on our economic policies, our national security strategies, and domestic projects large and small. Federal agencies are responsible for managing those projects, but they do not need to be responsible for managing their own data servers. And, given the stakes, they shouldn’t be. It’s just a question of centralizing technology management.