It’s Like They’re Reading My Mind

How next-generation apps will market your brainwaves.

By Grady Johnson and Sean Vitka | March 25, 2013, 5:49 AM


The information promised by these devices could offer new value to developers, advertisers, and users alike: Companies could detect whether you're paying attention to ads, how you feel about them, and whether they are personally relevant to you. Imagine an app that can detect when you're hungry and show you ads for restaurants or select music playlists according to your mood.
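To make that concrete, here is a minimal, purely hypothetical sketch (in Python) of how such an app might map headset readings to content. The read_eeg_features function and its valence/arousal numbers are invented for illustration; no real headset SDK is assumed.

    # A hypothetical mood-aware app. read_eeg_features and its
    # valence/arousal values are invented; no real headset API is assumed.

    def read_eeg_features():
        # In a real app this would come from a headset SDK; here we fake it.
        return {"valence": 0.2, "arousal": 0.8}  # tense and alert

    def pick_playlist(features):
        # Crude quadrant mapping based on the valence/arousal model of emotion.
        if features["valence"] >= 0.5:
            return "upbeat" if features["arousal"] >= 0.5 else "mellow"
        return "calming" if features["arousal"] >= 0.5 else "comforting"

    print(pick_playlist(read_eeg_features()))  # -> "calming"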

But, as with data collected during smartphone use, the consequences of data collected through BCIs reach far beyond mildly unsettling targeted ads. Health insurers could set your deductible based on your recorded stress levels. After all, we already live in a world in which banks determine creditworthiness through data mining and insurance companies adjust premiums using GPS data. With these devices in place, and with a large enough data set, companies will be able to identify risk indicators for things like suicide, depression, or emotional instability: traits that are deeply personal to us as individuals but dangerous to their bottom lines.

These problems aren’t entirely hypothetical. Last August, researchers at the USENIX Security conference demonstrated that early consumer-grade devices can be used to trick wearers into giving up their personal information. By measuring subjects’ responses to certain numbers, words, and dates, the researchers significantly increased their odds of guessing those subjects’ PINs, passwords, and birthdays.
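For the curious, here is a rough sketch of the logic behind that kind of attack, under the assumption that an attacker can flash candidate digits on screen and read back a per-flash response score, such as P300 amplitude, from the headset. Every number below is made up.

    # A rough sketch of the guessing attack described above. We assume the
    # attacker can flash each candidate digit on screen and average a
    # per-flash "response score" (e.g., P300 amplitude) from the headset.
    # All numbers below are invented for illustration.

    responses = {
        0: 0.11, 1: 0.09, 2: 0.10, 3: 0.31,  # 3 stands out: likely a PIN digit
        4: 0.12, 5: 0.08, 6: 0.13, 7: 0.29,  # so does 7
        8: 0.10, 9: 0.12,
    }

    # Rank digits by response strength and guess the strongest first,
    # turning a 1-in-10 blind guess into far better odds.
    ranked = sorted(responses, key=responses.get, reverse=True)
    print(ranked[:4])  # -> [3, 7, 6, 4]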

BCIs raise serious law enforcement concerns as well. One company, Government Works Inc., is developing BCI headsets for lie detection and criminal investigations. By measuring a person's responses to questions and images, the company claims it can determine whether that person has knowledge of certain information or events (leading to conclusions, for instance, about whether that person was at a crime scene). According to one BCI manufacturer, evidence collected from these devices has already been used in criminal trials. The jury is still out on the reliability of these devices, but as psychics, predictive psychology, lie detectors, and unreliable forensics have taught us: Voodoo convicts.

We don’t want to delay or block innovation. We’re excited for the day when we can open doors like a Jedi or play Angry Birds entirely with our minds, or, even better, when double amputees can. But now is the time to ask serious questions about who ultimately controls these devices, who has access to the data stored and collected on them, and how those data are ultimately used. Real-time data that reveal a person's attention level and emotional state are incredibly valuable: to buyers, to prosecutors, and to us data cattle. The question is whether they will simply become the next grain of personal information bought, taken, volunteered, or stolen in the name of more accurate advertising and cheaper services.


And it's evident that today's privacy standards are woefully inadequate. You might assume health privacy laws, such as HIPAA, would offer protection for this category of sensitive data. But you’d be wrong. Those rules apply only to a select group of people and companies: health care providers, health insurers, and those who provide services on their behalf. Thus, most companies' ability to use BCI data or sell it to one another, even real-time data from your brain, is essentially unchecked.

The Department of Justice's legal position is that law enforcement can access, without a warrant, any data a user provides to their cellphone company, even if no person at the service provider would routinely see that information. This includes passive background data, like the locations of the cell towers used to complete a call. Despite losing on a similar issue at the Supreme Court last year, the DOJ continues to argue that it can access months of a user’s location data stored by phone companies, and further that it can compel those companies to provide prospective location information. The deliriously old statute governing these law enforcement tools, the Electronic Communications Privacy Act, was passed in 1986 (though thankfully, legislators like Sen. Patrick Leahy, D-Vt., are trying to update it).

Tomorrow's Yelp may be interested in more than just your location. Your brainwaves might help it deliver more appealing results, and we may volunteer that information without understanding the implications of the data collection. Prosecutors could then use this information as circumstantial evidence: Johnny says he wasn't angry at the victim, but his brainwave forensics say otherwise. Do you expect the jury to believe the robot lied, Johnny? If our laws remain outdated when these issues begin to come up, these new, incredibly intimate data will be guarded just like our current data: not at all.

This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.

Grady Johnson is a program associate and researcher at the Open Technology Institute at the New America Foundation. He focuses on issues of privacy, security, and spectrum policy.

Sean Vitka is the federal policy manager at the Sunlight Foundation. He holds a J.D. from Boston College Law School.
