Facebook Wants to Know Why You’re Self-Censoring Your Posts

Dec. 13 2013 11:20 AM

On Second Thought …

Facebook wants to know why you didn’t publish that status update you started writing.

Photo illustration by Slate. Caption: "This rash is getting wor"—it's at this point that you reconsider your status update.

A couple of months ago, a friend of mine asked on Facebook:

Do you think that facebook tracks the stuff that people type and then erase before hitting <enter>? (or the “post” button)

Good question.


We spend a lot of time thinking about what to post on Facebook. Should you argue that political point your high school friend made? Do your friends really want to see yet another photo of your cat (or baby)? Most of us have, at one time or another, started writing something and then, probably wisely, changed our minds.

Unfortunately, the code in your browser that powers Facebook still knows what you typed—even if you decide not to publish it.* It turns out that the things you explicitly choose not to share aren't entirely private.

Facebook calls these unposted thoughts "self-censorship," and a recent paper by two Facebookers offers insight into how the company collects these nonposts. Sauvik Das, a Ph.D. student at Carnegie Mellon and summer software engineer intern at Facebook, and Adam Kramer, a Facebook data scientist, have posted an article online presenting their study of self-censorship behavior, collected from 5 million English-speaking Facebook users. (The paper was also published at the International Conference on Weblogs and Social Media.*) It reveals a lot about how Facebook monitors our unshared thoughts and what it thinks about them.

The study examined aborted status updates, posts on other people's timelines, and comments on others' posts. To collect the text you type, Facebook sends code to your browser. That code automatically analyzes what you type into any text box and reports metadata back to Facebook.
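The mechanism Das and Kramer describe can be sketched roughly as follows. This is an illustrative model, not Facebook's actual code: the five-character threshold reflects the paper's description of what counts as a draft, and every name and field here is invented for illustration.

```javascript
// Illustrative sketch only — not Facebook's implementation. It models the
// described behavior: the browser watches a text box and, if a draft is
// typed but never posted, reports *metadata* (a flag and a length) back,
// never the text itself.
function makeComposer(report) {
  let draft = "";
  let posted = false;
  return {
    type(text) { draft = text; },   // called on each input event
    post() { posted = true; },      // user clicked "Post"
    close() {                       // user abandoned the composer
      const selfCensored = !posted && draft.length > 5;
      report({ selfCensored, draftLength: draft.length }); // metadata only
      draft = "";                   // the typed text never leaves the browser
    },
  };
}
```

Note that the same hook that computes `draft.length` could just as easily send `draft` itself—which is the article's technical point.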

Storing text as you type isn't uncommon on other websites. For example, if you use Gmail, your draft messages are automatically saved as you type them. Even if you close the browser without saving, you can usually find a (nearly) complete copy of the email you were typing in your Drafts folder. Facebook is using essentially the same technology here. The difference is that Google is saving your messages to help you. Facebook users don't expect their unposted thoughts to be collected, nor do they benefit from it.
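The Gmail comparison can be made concrete. A draft autosaver makes the opposite trade-off from metadata-only reporting: it ships the full text back to the server, but does so to let the user recover the draft later. A minimal sketch, with invented names (this is not Google's code):

```javascript
// Minimal Gmail-style autosave sketch (illustrative, not Google's code).
// The *entire draft text* is periodically sent to the server — but here
// the collection exists to help the user recover unsaved work.
function makeAutosaver(save, intervalMs = 2000) {
  let draft = "";
  let dirty = false;
  const timer = setInterval(() => {
    if (dirty) { save(draft); dirty = false; } // periodic full-text save
  }, intervalMs);
  return {
    type(text) { draft = text; dirty = true; },
    stop() {                                   // e.g. the tab is closing
      clearInterval(timer);
      if (dirty) save(draft);                  // final flush preserves the draft
    },
  };
}
```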

It is not clear to the average reader how this data collection is covered by Facebook's privacy policy. In Facebook’s Data Use Policy, under a section called "Information we receive and how it is used," it’s made clear that the company collects information you choose to share or when you "view or otherwise interact with things.” But nothing suggests that it collects content you explicitly don’t share. Typing and deleting text in a box could be considered a type of interaction, but I suspect very few of us would expect that data to be saved. When I reached out to Facebook, a representative told me that the company believes this self-censorship is a type of interaction covered by the policy.

In their article, Das and Kramer claim to send back only information indicating whether you self-censored, not what you typed. The Facebook rep I spoke with agreed that the company isn’t collecting the text of self-censored posts. But it’s certainly technologically possible, and it’s clear that Facebook is interested in the content of your self-censored posts. Das and Kramer’s article closes with the following: "we have arrived at a better understanding of how and where self-censorship manifests on social media; next, we will need to better understand what and why." This implies that Facebook wants to know what you are typing in order to understand it. The same code Facebook uses to check for self-censorship can tell the company what you typed, so the technology to collect the data it wants exists right now.

It is easy to connect this to all the recent news about NSA surveillance. On the surface, it's similar enough. An organization is collecting metadata—that is, everything but the content of a communication—and analyzing it to understand people's behavior. However, there are some important differences. While it may be uncomfortable that the NSA has access to our private communications, the agency is monitoring things we have actually put online. Facebook, on the other hand, is analyzing thoughts that we have intentionally chosen not to share.

This may be closer to the recent revelation that the FBI can turn on a computer's webcam without activating the indicator light to monitor criminals. People surveilled through their computers’ cameras aren’t choosing to share video of themselves, just as people who self-censor on Facebook aren’t choosing to share their thoughts. The difference is that the FBI needs a warrant but Facebook can proceed without permission from anyone.

Why does Facebook care anyway? Das and Kramer argue that self-censorship can be bad because it withholds valuable information. If someone chooses not to post, they claim, "[Facebook] loses value from the lack of content generation." After all, Facebook shows you ads based on what you post. Furthermore, they argue that it’s not fair if someone decides not to post because he doesn't want to spam his hundreds of friends—a few people could be interested in the message. "Consider, for example, the college student who wants to promote a social event for a special interest group, but does not for fear of spamming his other friends—some of who may, in fact, appreciate his efforts,” they write.

This paternalistic view isn’t abstract. Facebook studies this because the more its engineers understand about self-censorship, the more precisely they can fine-tune their system to minimize self-censorship’s prevalence. This goal—designing Facebook to decrease self-censorship—is explicit in the paper.

So Facebook considers your thoughtful discretion about what to post a bad thing, because it withholds value from Facebook and from other users. Facebook monitors those unposted thoughts to better understand them, in order to build a system that minimizes this deliberate behavior. This feels dangerously close to “ALL THAT HAPPENS MUST BE KNOWN,” a motto of the eponymous dystopian Internet company in Dave Eggers’ recent novel The Circle.

Update, Dec. 15, 2013: This article was updated to include the fact that Das and Kramer's paper was published at the International Conference on Weblogs and Social Media in addition to being posted online.

Update, Dec. 16, 2013: This article was updated to clarify that it is the browser code, not Facebook, that reads whatever you type.

Disclosure, Feb. 27, 2014: The author of this piece is serving as a technical witness in a patent lawsuit against Facebook.

This article is part of Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.
