A revolution is afoot in privacy regulation. In an assortment of white papers and articles, business leaders—including Microsoft—and scholars argue that instead of regulating privacy through limiting the collection of data, we should focus on how the information is used. It’s called “use regulation,” and this seemingly obscure issue has tremendous implications for civil liberties and our society. Ultimately, it can help determine how much power companies and governments have.
You are probably familiar with privacy laws that regulate the collection of data—for example, the military’s famous “don’t ask, don’t tell, don’t pursue.” When you interview for a job, the employer should not ask you about your religion, your plans to have children, or whether you are married. There’s also the national movement to “ban the box” to stop collection of arrest and old conviction data on job applications.
In a use-regulation world, companies may collect any data they wish but would be banned from certain uses of the data. In U.S. law, a good example of use regulation comes from credit reporting. Your credit report can be used only for credit decisions, employment screening, and renting an apartment. Or consider your physician: Her professional norms encourage expansive data collection, but she can use medical records only to advance patient care.
Bans on data collection are powerful tools to prevent institutions from using certain knowledge in their decision-making. But advocates of use regulations have some compelling points: Collection rules are too narrow by themselves. They ignore the real-life problem that we simply click away our rights for the newest free service. And, increasingly, technologies gather data with no realistic opportunity to give notice to the individual at all. Some of these technologies can be used to infer knowledge about the very matters that collection limitations attempt to protect. Consider, for instance, the Target Corporation’s ability to infer that a shopper was pregnant when she switched from buying scented to unscented lotion. Use regulations shift the pressure away from notice and choice, creating a more universal set of rules for data.
Craig Mundie is a longtime Microsoft technologist and a member of the President’s Council of Advisors on Science and Technology, which recently released a set of new recommendations for the use of big data. In July, Mundie wrote in Foreign Affairs that use regulations are a “pragmatic” solution to the “data exhaust” generated by new products and services. But such a label shifts critical focus away from the technology companies that define and control the space we have for “pragmatic” decisions. Technology companies have designed products to produce a data exhaust, often deliberately. Unreadable privacy notices and inconvenient choice mechanisms were created by the very companies that want to encourage data sharing. Technologists designed systems to make privacy impossible, and now they say it is “pragmatic” to accept their legal proposal to solve the privacy problem.
There are deep problems with protecting privacy through regulating use. When one takes into account the broader litigation and policy landscape of privacy, it becomes apparent that use-regulation advocates are actually arguing for broad deregulation of information privacy.
Initiatives to promote use regulations say they will maintain collection limitations, but the proposed restrictions are hollow. One leading proposal to reform privacy laws, promoted by Microsoft, would allow any collection of data that is not already illegal—giving businesses free rein to condition access to services on users giving up data. As companies sell devices for the home, they will be uninhibited in what they collect, even if the data are unnecessary for the service.
Freed from any kind of data diet, corporate databases become attractive to governments. Any data obtained by the private sector can be obtained by the government, sometimes through voluntary information sharing.
Government collection of data would be equally voracious under a use-regulation regime. This should come as no surprise, because technology companies benefit from big government data. Information-intensive companies promote government power through a form of data laundering: They lobby to encourage the government to collect more personal information, and then argue that the personal data should be released under open-government laws.