The Surprising New Feature That Proves Facebook Isn’t Pure Evil After All

The Citizen's Guide to the Future
March 25, 2014, 4:15 PM

Facebook's Privacy Dinosaur Wants to Make Sure You're Not Oversharing

[Image: A message from Facebook's friendly blue privacy dinosaur. Screenshot courtesy of Facebook.]

Everyone knows it’s in Facebook’s interest to get us all to overshare, right? The more we post, the looser our privacy settings, the more data the company gleans, and the better it can target us with personalized advertisements. Not only that, but the more we post publicly, the more useful Facebook becomes as a search engine, a news service, a virtual water cooler, and a recommendation service. (To understand why, just imagine how worthless Yelp would be if a majority of its users turned their reviews to “private” or “friends-only.”)

The baseline assumption for many people these days is that Facebook will take all the data it can get and offer privacy options only when it absolutely has to.

And yet a new privacy measure, which Facebook is testing this week on a subset of users, suggests the company might not be quite as shameless or rapacious as many of its users—and the media—tend to assume. It’s called “privacy checkup,” and it takes the form of a pop-up box that appears one time per user, just as you’re about to post something publicly on your timeline.

[Image: Facebook's privacy checkup pop-up message.]

In case you’re unable to believe your eyes, I’ll confirm that this is indeed an instance of Facebook checking in to make sure you’re not accidentally sharing things with a wider audience than you realized. That little blue dinosaur, it turns out, is a friendly privacy dinosaur.

You might suspect this is actually a clumsy attempt on Facebook’s part to get people to change their settings from “friends” to “public.” But it isn’t, at least for now: A Facebook spokesman told me the only people seeing the pop-up box in the current test are those who are already posting publicly on Facebook. From Facebook’s perspective, then, this feature can only lead to one thing: Fewer people sharing publicly.

Why, then, would Facebook do this? I can think of two plausible motivations.

The first is that, believe it or not, Facebook understands more than ever that maintaining its users’ trust—or at least a modicum of it—is crucial to its long-term survival. Maintaining access to users’ data is crucial, too, of course. But I think Facebook recognizes that no one wins when people accidentally share things with people they didn’t mean to share them with.

In Mark Zuckerberg’s ideal, “open and connected” world, people would probably be far more comfortable sharing things publicly than they are today. But that’s not the world we live in, and he knows it’s better for Facebook to keep its shyer users sharing privately than to lose them altogether. And remember: Facebook also just spent a cool $19 billion on a messaging service that is inherently private. Clearly the company is well aware of the value and importance of non-public communications.

The second motivation for Facebook to run a test like this is simple: It’s good PR. Remember, the people seeing these Privacy Checkups are those who are already sharing publicly—which means that the test group probably includes a disproportionate number of celebrities, public figures, and media-and-advertising types. Some of them, no doubt, will write articles like this one—or this one, or this one. And here’s something you don’t see every day: an international privacy group congratulating Facebook on looking out for its users.

The fact is, Facebook has never been pure evil, and it will never be purely good either. It’s a company whose interests are sometimes in tension with those of its users, but only to a certain extent. In the end, Facebook only wins if its users feel safe using it. Zuckerberg and company probably could have spared themselves an awful lot of backlash and mistrust if they had fully embraced that lesson a little earlier on. 

Future Tense is a partnership of Slate, New America, and Arizona State University.

Will Oremus is Slate's senior technology writer.
