There’s a joke attributed to Winston Churchill in which he allegedly asked a young socialite:
Churchill: Madam, would you sleep with me for 5 million pounds?
Socialite: My goodness … well, I suppose.
Churchill: Would you sleep with me for 5 pounds?
Socialite: What type of woman do you think I am?
Churchill: We’ve already established that. Now we are haggling about the price.
So what’s your price?
The personal data you share via the Internet, mobile phone, and by simply living your life has economic value. While most discussions surrounding this data focus on privacy concerns, it’s critical to understand how the paradigm of Internet and mobile business models has established a precedent regarding our digital identities. Put simply, we’re cybersluts. While the valuation of the data representing our identities has never been higher, we share it with virtually anyone at any price.
Right now, we don’t see this problem. Literally. It’s early days for augmented reality, in which devices placed over our eyes will become the lens through which we see the world. And it’s early days for facial recognition, in which our physical identity can trigger multiple digital actions. These technologies are themselves neither good nor evil, but within the next five to seven years, they will allow people, or businesses, to see any available information about our lives floating before their eyes.
We typically think of the worth of our data only in transactional terms. Marketers build complex algorithms that value people’s information based on their social network usage; by those measures, an individual’s data can be worth anywhere from $4 a month to more than $1,200 a year. NYU grad student Federico Zannier even stalked himself online, mining his own data to sell on Kickstarter, where people could pay $5 for a week of his information and circumvent a third-party data broker like Acxiom and its $1.1 billion annual revenue.
But now we’re haggling.
The idea of people controlling and selling their data for personal and economic gain—as Jaron Lanier describes in Who Owns the Future? and Doc Searls elaborates on in The Intention Economy—is gaining traction. But it won’t take hold until we answer a more fundamental question: What are we worth as a whole? It’s time to stop thinking of our data as analogous to a password. In aggregate, it represents our person.
The U.S. Government Accountability Office is a nonpartisan agency that works for Congress to investigate how the government spends taxpayer dollars. In a report from September 2013, it points out a key reason consumers aren’t currently selling their data for personal profit—they can’t control it. The report states:
While current laws protect privacy interests in specific sectors and for specific uses, consumers have little control over how their information is collected, used, and shared with third parties for marketing purposes.
Whatever your preference regarding how to share your data, you should have a legal right to access it. But this line of reasoning still minimizes the holistic value of the data reflecting our lives, as it defines its importance only in terms of marketing and income. As William Hoffman, head of Data Driven Development for the World Economic Forum, said when I spoke to him, “ ‘Do you get paid?’ is only one sliver of the issue.” Given the information that’s going to be generated by devices in the fields of health, logistics, transportation, and quantified self, Hoffman feels a better question to ask is “Can people get value from their data the way it’s used and packaged?”
Right now, wearable devices tracking your health are becoming popular. Reflecting the notion from Socrates that the unexamined life is not worth living, these devices empower people by showing them their personal data in a whole new light.* The invisible is being revealed. Charts and visualizations provide hard evidence of your exercise and sleep patterns. This data—these sine waves, pie charts, and spreadsheets—are a journal representing your unique identity. They’re intimate snapshots documenting the workings of your inner life.
Research shows that seeing this type of data in conjunction with a new health or diet regimen provides a form of accountability for users. By avoiding guesswork about their weight or run times and knowing they’ll see clear visualizations of their progress, they’re encouraged to work harder and improve their results. By leveraging their data for themselves, they’re empowered by the insights generated from their individual experience.
It’s these insights, however, that we share too lightly. We tweet our weight loss or our heart rate change over the course of last week’s runs in the park. Taken on their own, these posts seem innocent enough. But sharing physical data in the public realm means insurance companies and future employers have access to our health history. Timestamps on our exercise data mean people know when we’re away from our homes. Dating sites, which run some of the most advanced matching algorithms of any industry, could factor in this physical data to predict whether your health choices match up to those of potential mates.
All of these insights, generated from our unique lives, will bring economic value to others without our knowledge or consent. And that’s on an ongoing basis—data you share once may be utilized dozens of times for years. Taken out of context, used in ways you don’t approve of, sold to the lowest bidder. Go ahead, share your stats. Let the Inter-pimps have their way.
Our ennui on the issue of data sharing doesn’t remove our culpability, either. Not taking action equals complicity. And our preferences regarding privacy are irrelevant in regard to the Internet of Things, a catchall term for the myriad sensors and chips found in everything from mailboxes to freeway transit passes. The fact that you don’t share personal data on Facebook doesn’t mean Bluetooth tracking devices don’t monitor your moves in a retail environment. The fact that you don’t own a smartphone doesn’t mean your faceprint can’t be registered in an unknown database. Whether or not you actively choose to share your data, algorithms built on the bias of a data broker can still affect your life in the future.
Our personal data is about more than money. If we don’t wrest data control from brokers before emerging technologies like augmented reality kick in, we may not even be able to manage how we’re viewed to the outside world. It’s time to shift the conversation around personal data beyond privacy and pricing so we can fully value our digital lives.
Otherwise, the cost is just too high.
*Update, March 18, 2014: This sentence was revised to better paraphrase Socrates' quote.
This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.