This App Tracks You While You Shop

The citizen’s guide to the future.
Jan. 28 2014 8:01 AM

Track Star

This app follows you while you shop—and it needs a clearer privacy policy.

In other words, knowing exactly who you are is important to Shopkick’s honchos, and they will seek that information out.

Once Shopkick has your data, what does it do with it? More important, with whom does it share it? Shopkick’s policy says, “We may also share this Non-Personally Identifiable Information with our Affiliated Partners.” It’s not clear what an “Affiliated Partner” is, though.

Does the privacy policy tell us enough to make an informed decision about whether to use the app and hand over our data? Not really. It is vague and misleading (and also typical) on several important points.


First, it is hard to tell what data is being collected. Even though Shopkick’s privacy policy provides an extensive list of data it collects, a lot is left unsaid. For example, does Shopkick access your microphone and listen in any time it’s open? Is it monitoring what you say about a product when you look at it or try it on? Does it record that audio? Is that ever shared?

I reached out to Shopkick, and a representative told me that while the microphone is on when you have the app open, it does not listen to human voices. She explained that the TV commercial initiative was old and that the microphone is now used only to decode inaudible audio signals that are part of Shopkick’s in-store communication setup. That it doesn’t listen to you talk is good, but the language of the privacy policy seems broad enough to cover audio eavesdropping. This ambiguity about what Shopkick does and what it could do in the future is both troubling and common among privacy policies.

Second, when an app or website shares or sells your data, that data enters the hands of another company whose privacy policies we don't know. Those third parties might be stores that can make us useful offers based on our data. But what if the third party tells insurance companies how much time you spend shopping in your local tobacco shop, liquor store, or marijuana dispensary? Since there are no clear restrictions on the companies that receive our data, we are left to wonder how they use it.

Finally, most privacy policies are extremely misleading about “non-personally identifiable information.” Bits of data that feel anonymous are anything but when taken together. The Electronic Frontier Foundation provides an excellent overview of academic research showing that combinations of such data can uniquely identify most of the U.S. population. For example, the combination of ZIP code, birthdate, and gender—all “non-personal information” that Shopkick and many other services collect—is unique for about 87 percent of U.S. residents. In other words, an app that holds those three fields usually has enough to identify you individually.
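To see why “anonymous” fields combine into an identifier, consider a toy sketch in Python (the records below are invented for illustration, not real Shopkick data): each field alone is shared by many people, but the full combination usually is not.

```python
from collections import Counter

# Hypothetical toy records: (zip_code, birthdate, gender).
# Any single field matches many people; the combination rarely does.
records = [
    ("20740", "1985-03-12", "F"),
    ("20740", "1985-03-12", "M"),
    ("20740", "1971-07-04", "F"),
    ("10001", "1985-03-12", "F"),
    ("10001", "1990-11-30", "M"),
    ("10001", "1990-11-30", "M"),  # the only combination shared by two people
]

counts = Counter(records)
unique = [r for r in records if counts[r] == 1]
print(f"{len(unique)} of {len(records)} records are pinned down "
      f"by ZIP + birthdate + gender alone")
```

In this tiny sample, four of the six people are singled out by the trio of “non-personal” fields; in the real U.S. population, the research cited above puts that figure near 87 percent.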

The vast majority of companies creating these apps are not malicious. They are trying to provide a service that people find valuable, and there is nothing inherently wrong with profiting from that. Our data helps them provide these services to us, and it helps them make money.

Right now, users are deciding to share their data based on an analysis of the benefits. But that decision should come from an analysis of the risks, too. That requires transparent and informative privacy policies, but the state of privacy policies today is quite the opposite. They are opaque and misleading. Before granting companies access to such critical components of our lives, we should know exactly what they will collect and how well it will be protected. Without that knowledge, we are forced to operate entirely on trust.

This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.

Jennifer Golbeck is director of the Human-Computer Interaction Lab and an associate professor at the University of Maryland.