Future Tense

There’s No Such Thing as Innocuous Personal Data

Why you should keep your heart rate, sleep patterns, and other seemingly boring info to yourself.

Researchers can learn more than you might think from your heart rate or your step count.

Paul Marotta/Getty Images for Fitbit

It’s 2020, and a couple is on a date. As they sip cocktails and banter, each is dying to sneak a peek at the other’s wearable device to answer a very sensitive question.

What’s his or her heart rate variability?

That’s because heart rate variability, the variation in the amount of time between successive heartbeats, can be an indicator of both female and male sexual dysfunction.
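For the curious, heart rate variability is just a statistic computed over the gaps between beats. As a rough sketch of my own (an illustration, not anything taken from the studies discussed in this piece), two standard metrics, SDNN and RMSSD, can be calculated in a few lines of Python from a list of inter-beat intervals measured in milliseconds:

```python
# Toy illustration: two common heart rate variability metrics,
# computed from inter-beat (RR) intervals in milliseconds.
from statistics import stdev
from math import sqrt

def sdnn(rr_ms):
    """Standard deviation of the intervals between beats (SDNN)."""
    return stdev(rr_ms)

def rmssd(rr_ms):
    """Root mean square of successive differences between beats (RMSSD)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical readings: a steady heart and a more variable one.
steady = [800, 810, 795, 805, 800, 798]
variable = [720, 880, 760, 910, 700, 860]

print(sdnn(steady), rmssd(steady))      # small values: low variability
print(sdnn(variable), rmssd(variable))  # large values: high variability
```

The point is how little raw material is needed: a handful of timestamps from a wrist sensor is enough to produce a number that, as the research above suggests, others may read a great deal into.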

When you think about which of your devices and apps contain your most sensitive data, you probably think about your text messages, Gchats, or Reddit account. The fitness tracking device you’re sporting right now may not immediately come to mind. After all, what can people really learn about you from your heart rate or your step count?

More than you might think. In fact, an expanding trove of research links seemingly benign data points to behaviors and health outcomes. Much of this research is still in its infancy, but companies are already beginning to mine some of this data, and there’s growing controversy over just how far they can—and should—go. That’s because, like most innovations, this data feeding frenzy has a potential bright side and a dark side.

Let’s go back to the example of heart rates. In a study conducted in Sweden and published in 2015, researchers found that low resting heart rates correlated with a propensity for violence. It’s unclear whether these findings will hold up to further investigation. But if the connection is confirmed in the future, perhaps it could be cross-indexed, introduced into algorithms, and used, in conjunction with other data, to profile or convict individuals, suggests John Chuang, a professor at Berkeley’s School of Information and the director of its BioSense lab. (Biosensing technology uses digital data to learn about living systems like people.) “It’s something we can’t anticipate—these new classes of data we assume are innocuous that turn out not to be,” says Chuang.

And in the absence of research linking heart rate to particular health or behavioral outcomes, we tend to have our own entrenched social interpretations of what a faster heart rate actually means—that someone is lying, or nervous, or interested. Berkeley researchers have found that even those assumed associations could have complicated implications for apps that allow users to share heart rate information with friends or employers. In one recent study currently undergoing peer review, when participants in a trust game observed that their partners had an elevated heart rate, they were less likely to cooperate with them and more likely to attribute some kind of negative mood to that person. In another study scheduled to be published soon, participants were asked to imagine a scenario: They were about to meet an acquaintance to talk about a legal dispute, and the acquaintance texted that he or she was running late. Alongside the text, that person’s heart rate appeared. If the heart rate was normal, many study participants felt it should have been elevated to show that their acquaintance cared about being late. The authors warn of the “potential danger” of apps that could encourage heart rate sharers to make the wrong associations between their signals and behavior. One app, Cardiogram, is already posing the question: “What’s your heart telling you?”

Suddenly, anyone who knows your heart rate may prejudge—accurately or not—your emotions, mood, and sexual prowess. “This data can be very easily misinterpreted,” says Michelle De Mooy, the acting director of the Privacy and Data Project at the Center for Democracy and Technology. “People tend to think of data as fact, when in fact it’s governed by algorithms that are created by humans who have bias.”

And it’s worrisome that companies, employers, and others could use such imperfect information. Most biosensing data gathered from wearables isn’t protected by the Health Insurance Portability and Accountability Act or regulated by the Federal Trade Commission, a reflection of the fact that the boundaries between medical and nonmedical data are still being defined. “Regulation can sometimes be a good thing, and sometimes more complicating,” says De Mooy. “But in this case, it’s important because of the different ways in which activity trackers are starting to be a part of our lives. Outside of a fun vague activity measure, they are coming into workplaces and wellness programs in lots of different ways.”

Take Aetna’s attempt at improving employee wellness through sleep, à la Arianna Huffington. The idea is to pay employees to get enough sleep, which at first may sound too good to be true. A closer look reveals a couple of troubling dimensions: For one thing, employees enter sleep hours manually into Aetna’s online system, which means they could fudge the numbers. But even if this data were also tracked by wearables (which is true for other parts of Aetna’s wellness program), such plans could pose problems down the line, depending on how a company uses that kind of wellness data. For instance, imagine if your manager began making decisions about which projects to assign you based on whether you were well-rested.

Not all wellness program data can be legally funneled to employers or third parties. It depends on whether the wellness program is inside a company insurance plan—meaning that it would be protected by HIPAA—or outside a company insurance plan and administered by a third-party vendor. If it’s administered by a third party, your data could be passed on to other companies. At that point, the data is protected only by the privacy policies of those third-party vendors, “meaning they can essentially do what they like with it,” De Mooy says.

Most companies that are gathering this information emphasize that they’re doing everything they can to protect users’ data and that they don’t sell it to third-party providers (yet). But when data passes from a device, to a phone, to the cloud through Wi-Fi, even all of the encryption and protective algorithms in the world can’t ensure data security. Many of these programs, like Aetna’s sleep initiative, are optional, but sometimes employees don’t have much of a choice. If they opt out, they often have to pay more for insurance coverage, though companies prefer to frame it as offering a discount to those who participate, as opposed to a penalty for those who don’t.

And even if you choose to opt out, companies may find ways to collect the same data in the future. For example, MIT researchers are now able to detect heart rate and breathing information remotely, with 99 percent accuracy, from a Wi-Fi signal reflected off your body. “In the future, could stores capture heart rate to show how it changes when you see a new gadget inside a store?” imagines Chuang. “These may be things that you as a consumer may not be able to opt out of.”

Yet there’s another side to this future. The way you walk can be as unique as your fingerprint; a couple of studies show that gait can help verify the identity of smartphone users. And gait can also predict whether someone is at risk for dementia. Seemingly useless pieces of data may let experts deduce or predict certain behaviors or conditions now, but the big insights will come in the next few years, when companies and consumers are able to view a tapestry of different individual data points and contrast them with data across the entire population. That’s when, according to a recent report from Berkeley’s Center for Long-Term Cybersecurity, we’ll be able to “gain deep insight into human emotional experiences.”
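To make the gait-as-fingerprint idea concrete, here is a deliberately simplified toy of my own (the feature set, function names, and threshold are assumptions for illustration, not how the cited studies actually work): enroll a few coarse statistics from a walking accelerometer trace, then accept a new trace only if its statistics land close to the stored template. Real systems rely on far richer features and models.

```python
# Toy illustration (not the cited studies' method): verify a smartphone
# user by comparing simple gait statistics against an enrolled template.
from statistics import mean, stdev
from math import dist  # Euclidean distance, Python 3.8+

def gait_features(accel_magnitudes):
    """Reduce a walking accelerometer trace to a few coarse features."""
    return (mean(accel_magnitudes), stdev(accel_magnitudes),
            max(accel_magnitudes) - min(accel_magnitudes))

def same_walker(template, sample, threshold=0.5):
    """Accept the sample if its features are close to the enrolled template."""
    return dist(gait_features(template), gait_features(sample)) < threshold

# Hypothetical traces: acceleration magnitudes (in g) sampled while walking.
enrolled = [1.0, 1.3, 0.8, 1.2, 0.9, 1.4, 0.7, 1.1]
attempt  = [1.0, 1.2, 0.9, 1.3, 0.8, 1.3, 0.8, 1.1]

print(same_walker(enrolled, attempt))  # True: the gaits are similar enough
```

Even this crude version hints at the privacy stakes: the same trace that unlocks your phone is, in other hands, a behavioral signature you shed with every step.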

But it’s the data that you’re creating now that will fuel those insights. Far from meaningless, it’s the foundation of what you (and everyone else) may be able to learn about your future self.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.