Future Tense

Facebook’s Censorship Problem Is What Happens When a Tech Company Controls the News

Facebook deleted this post by a Norwegian newspaper criticizing the social network for censorship.

Aftenposten/Facebook

In the space of a single day, Facebook has managed to:

  • Censor a Pulitzer Prize–winning photograph from the Vietnam War as a violation of its anti-nudity policy.
  • Delete a post by Norway’s largest newspaper criticizing that act of censorship.
  • Delete a post by Norway’s prime minister criticizing the company.
  • Reverse course under public pressure and promise to reinstate the posts it had taken down.

Did I mention that Facebook’s trending news section on Friday also promoted a story claiming that the Sept. 11 attacks were an inside job? And that the company then, as “a temporary step,” removed the entire trending topic relating to the upcoming 9/11 anniversary?

It’s just another day on the world’s most influential news platform, whose founder and CEO continues to insist that it is not a media company.

My colleague Jacob Brogan has more about the dustup over the 9/11 truther story, which is only the latest in a series of embarrassing blunders resulting from Facebook’s misguided attempt to exorcise human judgment from its trending news section.

The photo censorship episode is somewhat different, although it has its roots in the same conundrum. It revolves around the Pulitzer Prize–winning 1972 photo “The Terror of War,” which depicts terrified children fleeing a napalm attack in Vietnam. One of the children was a naked and badly burned 9-year-old girl. The photo has been credited by some with hastening the end of the Vietnam War.

When a Norwegian writer published a series of Facebook posts that included the photograph, Facebook suspended him from the site, according to Aftenposten, the country’s largest print newspaper. The social network had determined that the image violated its anti-nudity policy.

When Aftenposten reported on the censorship in a news article that it shared on its Facebook page along with the photo in question, Facebook deleted the newspaper’s post, too. Aftenposten on Friday responded with a front-page open letter denouncing Facebook’s “abuse of power,” and Norway’s prime minister, Erna Solberg, weighed in with her own Facebook post criticizing the company, again including the photo. Facebook went ahead and deleted that as well.

Pressed by reporters Friday, Facebook defended the decision with the following statement:

While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others. We try to find the right balance between enabling people to express themselves while maintaining a safe and respectful experience for our global community. Our solutions won’t always be perfect, but we will continue to try to improve our policies and the ways in which we apply them.

As the outcry grew, Facebook flip-flopped and promised to reinstate the posts it had taken down. It issued a second statement explaining that choice:

After hearing from our community, we looked again at how our Community Standards were applied in this case. An image of a naked child would normally be presumed to violate our Community Standards, and in some countries might even qualify as child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time. Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed. We will also adjust our review mechanisms to permit sharing of the image going forward. It will take some time to adjust these systems but the photo should be available for sharing in the coming days. We are always looking to improve our policies to make sure they both promote free expression and keep our community safe, and we will be engaging with publishers and other members of our global community on these important questions going forward.

The photo controversy differs from the company’s trending-news snafus in that it centers on Facebook’s signature product, the news feed. In that respect, it’s a bigger deal.

The trending news section is peripheral to the Facebook experience, and its decisions only marginally affect what the company’s massive audience reads. The news feed, in contrast, is a daily or even hourly addiction for a significant portion of the world’s population. Some 66 percent of U.S. Facebook users say they rely on it for news, which equates to 44 percent of all U.S. adults. Accordingly, the news feed has become a critical distribution channel for major news outlets—and, as a result, a reluctant arbiter of what’s most relevant in the news.

Media companies, generally speaking, accept responsibility for the content they publish, including its newsworthiness. But, as I’ve explained, Facebook is loath to do that, because exercising editorial judgment is both controversial and labor-intensive. That’s one big reason why traditional media companies are both relatively poor and widely mistrusted while Facebook is wildly rich and relatively well-liked.

In place of human editors, the company has fallen back on one-size-fits-all policies, such as its prohibition against nudity, which can be enforced via a combination of proprietary software and poorly paid contractors working in warehouses in Manila. That approach has helped to fuel the company’s rapid growth and enormous profits. But as the social network’s dominance of media grows, its drawbacks are becoming increasingly difficult to ignore.

When Mark Zuckerberg says that Facebook isn’t a media company, it’s not so much a descriptive claim as a wishful one. Facebook doesn’t want to be a media company, for the reasons outlined above. That’s understandable: Editorial judgment is perilous territory, and as much as we cry out when Facebook clumsily enforces a blunt policy, the cries would only grow louder if the company were to take a more activist approach to the types of content it permits and promotes.

Yet the company’s persistent claims to neutrality, which were philosophically empty from the outset, are further undermined each time it changes its policies in response to public pressure, or makes an exception for “an iconic image of historical importance,” or offers a rationale such as “the value of permitting sharing outweighs the value of protecting the community by removal.”

That is so naked a value judgment that Facebook’s PR team couldn’t even find a word other than “value” in which to cloak it. In essence, with its second statement on Friday reversing the ban on the “Terror of War” photo, Facebook just admitted to all of the following:

  • It does make value judgments about the content people are allowed to post in the news feed.
  • It is willing to make such judgments on a case-by-case basis.
  • In doing so, it is prepared to consider claims of newsworthiness and historical importance, weighing such factors as source, cultural context, and feedback from its readers.

In other words: Facebook is a media company, when it’s forced to be. It’s just a media company that’s determined to accept as little responsibility as possible for what it publishes—and which alternately abdicates and reclaims its prerogative to exercise editorial judgment according to what it deems expedient at any given time. Remind me again why the public is so quick to trust technology companies?