In other words, knowing exactly who you are is important to Shopkick’s honchos, and they will seek that information out.
Once Shopkick has your data, what does it do with it? More important, who does it share it with? Shopkick’s policy says, “We may also share this Non-Personally Identifiable Information with our Affiliated Partners.” The policy never says what an “Affiliated Partner” is.
Second, when an app or website shares or sells your data, that data passes to another company whose privacy practices you know nothing about. Those third parties might be stores that can make you useful offers. But what if one of them tells insurance companies how much time you spend in your local tobacco shop, liquor store, or marijuana dispensary? Since there are no clear restrictions on the companies that receive your data, you are left to wonder how they use it.
Finally, most privacy policies are deeply misleading about “non-personally identifiable information.” Bits of data that feel anonymous are anything but when taken together. The Electronic Frontier Foundation provides an excellent overview of academic research showing that a few such traits, combined, can uniquely identify the vast majority of the U.S. population. For example, the combination of ZIP code, birthdate, and gender (all “non-personal information” that Shopkick and many other services collect) is unique for about 87 percent of U.S. residents. In other words, an app that holds those three data points usually holds enough to identify you individually.
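A rough back-of-envelope calculation shows why those three “anonymous” traits are so identifying. The figures below are round assumed estimates, not numbers from the research cited above: there are far more possible ZIP-code/birthdate/gender combinations than there are Americans, so the average combination matches less than one person.

```python
# Sketch: why ZIP code + birthdate + gender narrows you down.
# All counts are rough assumed estimates for illustration only.
ZIP_CODES = 42_000          # approximate number of U.S. ZIP codes
BIRTHDATES = 365 * 100      # one date per day across ~100 years of ages
GENDERS = 2
US_POPULATION = 330_000_000

combinations = ZIP_CODES * BIRTHDATES * GENDERS
people_per_combination = US_POPULATION / combinations

print(f"{combinations:,} possible combinations")            # → 3,066,000,000
print(f"{people_per_combination:.2f} people per combination")  # → 0.11
```

With roughly 3 billion pigeonholes for 330 million people, most occupied pigeonholes hold exactly one person. Real populations cluster (birthdates and ZIP codes are not uniformly distributed), which is why the measured figure is about 87 percent rather than nearly 100.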
The vast majority of companies creating these apps are not malicious. They are trying to provide a service that people find valuable, and there is nothing inherently wrong with profiting from that. Our data helps them provide these services to us, and it helps them make money.
Right now, users decide to share their data based on an analysis of the benefits. That decision should rest on an analysis of the risks, too, and that requires transparent, informative privacy policies. Today's policies are quite the opposite: opaque and misleading. Before granting companies access to such critical parts of our lives, we should know exactly what they will collect and how well it will be protected. Without that knowledge, we are forced to operate entirely on trust.
This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.