Future Tense

Facebook’s Latest Move to Fight Fake News Might Finally Be the Right One

Facebook may have finally hit on a promising way to fight its “fake news” problem.

The company announced Thursday that it is launching a feature called Related Articles, which it has been testing since April. Now, when certain controversial or hotly debated stories show up in your news feed, a set of headlines from other publishers on the same topic will appear below them.

In its April blog post explaining the test, Facebook presented Related Articles as a way to give users “easier access to additional perspectives and information, including articles by third-party fact checkers.” It gave as an example an article about “a new medical advancement,” suggesting that the related stories would help readers evaluate whether the piece their friend shared was accurate or misleading in its presentation of the findings.

[GIF: How the Related Articles feature looks in the news feed on your phone. Courtesy of Facebook.]

Now, it seems, Facebook is comfortable pitching the feature more explicitly as a tool to counteract the spread of misinformation. In an update to that April blog post Thursday, it wrote:

Since starting this test, we’ve heard that Related Articles helps give people more perspectives and additional information, and helps them determine whether the news they are reading is misleading or false. So we’re rolling this out more broadly.

Now, we will start using updated machine learning to detect more potential hoaxes to send to third-party fact checkers. If an article has been reviewed by fact checkers, we may show the fact checking stories below the original post. In addition to seeing which stories are disputed by third-party fact checkers, people want more context to make informed decisions about what they read and share. We will continue testing updates to Related Articles and other ongoing News Feed efforts to show less false news on Facebook and provide people context if they see false news.

When Facebook says “false news,” it’s referring at least in part to what became popularly known as “fake news” during the 2016 U.S. presidential election. Much of that “fake news”—an ill-defined category that seemed to include everything from deliberate hoaxes to mainstream news stories that some perceived as biased or misleading—revolved around politics and catered to the partisan viewpoints of one group or another. That made it a particularly thorny problem for Facebook, which risked being tarred as politically slanted if it flagged or suppressed posts based on the judgment of its own editors or software engineers.

Facebook was so reluctant to wade into the murky waters of editorial judgment that it first denied “fake news” was a real problem. When that backfired, the company began looking for ways to tackle it in earnest. One of its first major initiatives was a partnership with third-party fact-checkers, in which Facebook would flag potentially false or misleading posts as “disputed” in users’ feeds. The company relied in part on its own users to report those posts, which it could then pass on to the fact-checking organizations for careful vetting.

That approach, cautious as it was, still left Facebook open to charges of bias and even censorship (although that's a misapplication of the term) from those who took issue with the fact-checkers' conclusions. The process is also labor-intensive, meaning that only a small fraction of misleading stories is likely to be flagged in a timely manner. And even if Facebook overcame that problem, psychologists doubted the efficacy of the approach.

Related Articles won't singlehandedly solve the fake news problem, either. But there is at least some academic research suggesting that it could make a real difference in readers' perceptions. Just as importantly, from Facebook's standpoint, it should insulate the company from cries of censorship, since surrounding a story with related articles doesn't necessarily imply any editorial judgment about its credibility. A reader's understanding of just about any story could benefit from additional context, so there's little danger of "false positives," as there is when you flag an article as disputed.

This still leaves the deeper problem of the biases embedded in the very structure of the news feed. But it’s a sensible measure nonetheless, and one that suggests Facebook is capable of applying its employees’ bright minds to a societal problem broader than increasing users’ engagement or monetizing their data.