

Notes on the culture of the Internet.
May 11 2015 12:26 PM

All the News That’s Fit to Like

Facebook isn’t just making us less partisan. It’s making us less politically engaged. 

Are you seeing the world through Facebook-colored lenses?

Photo illustration by Slate. Photo by Thomas Hodel/Reuters.

When Facebook entered the news business in 2006, it set out to cover its own users. Facebook had launched as a static collection of profiles, but now, every time a user uploaded a new photo or changed her favorite quote, the development surfaced in a rolling stream of updates that Facebook called the “News Feed.” Every status update was a “news story”; the algorithm that chose which stories to boost was called “the publisher.” The publisher, Facebook told its users at the time, was interested in stories like “Mark adds Britney Spears to his Favorites” and “your crush is single again.” As David Kirkpatrick reported in his 2010 book The Facebook Effect, Mark Zuckerberg articulated the News Feed’s guiding principle to staff like so: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” The New York Times would cover the African conflict. The News Feed would show you the squirrel.

Now Facebook is poised to begin publishing New York Times stories directly to its own site. Last year the Pew Research Center deemed Facebook the second-most popular source for political and government news among American Internet users, just behind local TV. Facebook has officially entered the news-news business. What kinds of stories does its publisher value now?

A new study published online in Science last week sheds some light. Three researchers, all Facebook employees, culled data from 10 million Facebook users, 7 million news articles shared on the site, and the users’ combined 3.8 billion “potential exposures” to that content in order to find out how “ideologically diverse news and opinion” spreads (or doesn’t) among liberal and conservative users. They found that the News Feed algorithm—which has long been accused of shielding users from politically oppositional content—decreased the visibility of ideologically “cross-cutting” news by 8 percent for liberals and 5 percent for conservatives. From there, liberals were 6 percent less likely to click on a story from a conservative source (like Fox News), while conservatives were 17 percent less likely to click over to a left-leaning site (like the Huffington Post). The Facebook researchers concluded that “[i]ndividual choice has a larger role in limiting exposure to ideologically cross cutting content” than Facebook’s engineers do. In other words, users are “exposed to more cross-cutting discourse in social media” than previously believed.


The study is a clever bit of misdirection. I don’t doubt its results—getting your news from Facebook isn’t as ideologically isolating as, say, watching Fox News or MSNBC. But its title, “Exposure to ideologically diverse news and opinion on Facebook,” makes Facebook sound like a pulsing marketplace of political opinion and news. Meanwhile, I’m scrolling down my News Feed and finding videos of Tina Fey faux-stripping and an orangutan cuddling an armful of tiger cubs. Facebook may help nudge liberals a little to the left and conservatives a little to the right, but its greatest influence over Americans is toward political disengagement.

The “liberals” and “conservatives” tracked in the Facebook study actually represent a slim slice of the site’s users. The study included only people who proudly complete the “political views” section of their profile, and just 4 percent of adult Americans on Facebook fit the bill. That’s a curious group to focus on, because the generation that’s most active on Facebook is also the least likely to identify with a political orientation. Last year’s Pew report on the beliefs and behaviors of millennials—81 percent of whom are on Facebook—found that a full 50 percent of millennials consider themselves politically independent. Their political “disaffiliation” rivals or exceeds that of any group Pew has ever studied in its 25-year existence. Instead, millennials “are building their own networks,” Pew concluded—not “through political parties, organized religion or marriage” but “through social media.” Facebook isn’t just facilitating communication between members of different political parties. It is replacing political parties.

Millennials, the New America Foundation found last year, are less likely to vote or pledge allegiance to a party, but they do engage in some “civic uses of social media.” (Slate has a publishing partnership with the New America Foundation.) Here’s one sad data point supporting that conclusion: Forty-four percent of millennials have “liked” a piece of political material on social media. Meanwhile, a study published last year in New Media & Society found that being active on Facebook does not encourage teenagers to become more politically engaged. It doesn’t inspire them to join protests or sign petitions or affix buttons or even post political thoughts on the Internet. But it does inspire them to spend more time entertaining themselves—chatting with friends, downloading songs, shopping online, and engaging in other “consumerist-oriented” activities. This is a convenient outcome for Facebook: It’s easy to see why the company would prefer young people to treat Facebook as a shopping mall rather than a soapbox. These users are easier to monetize and less likely to offend.

After Eli Pariser, Upworthy CEO and the author of The Filter Bubble: What the Internet Is Hiding From You, parsed the Facebook study, he admitted that the algorithm’s contribution to the partisan divide was “smaller than I’d have guessed.” But he also noted that the study elided a more basic question: How does Facebook’s algorithm manipulate the spread of news in general? One of the more revelatory data points embedded in the study concerns the proportion of “hard content” (stories about stuff like campaigns, war, health care, and abortion) and “soft content” (links about sports, entertainment, food, gadgets, and fashion) shared on the site. After assessing millions of links, the researchers declared over 90 percent of them soft. Really soft: Examples include a Blind Melon video, a Cyber Monday sale, and a link to a collection of inspirational photos paired with motivational quotes. And the study assessed only the content that links away from Facebook. Consider all the engagement announcements and pet photos that dominate your News Feed, and Facebook’s journalistic priorities appear even fluffier.

In theory, Facebook presents an unprecedented opportunity for political cross-pollination. In a 2012 study on polarization and social media, Stanford researchers cited studies dating back to 1967 showing that “people do not encounter attitude-challenging information in large part due to their social milieu, habits, and lack of perceived benefits for seeking out such information.” But on Facebook, users connect with friends, past friends, extended family members, co-workers, neighbors, and strangers, many of whom are likely to hold political beliefs that differ from their own. Just don’t expect them to talk about it on Facebook. Last year Pew studied how Americans discussed Edward Snowden’s leaking of NSA documents and found that while 86 percent of Americans would share their opinion among friends over dinner, only 42 percent of social media users were willing to post about it online. Americans were “more willing to share their views if they thought their audience agreed with them.” And the silence followed Facebook users even after they logged off: Facebookers were 50 percent less likely to discuss Snowden in person than nonusers were.

So instead of sparking political debates, Facebook users convene over the soft stuff. The Facebook researchers found that while pieces of hard news tend to circulate in ideological silos, soft content percolates across the aisle. I may scroll past one liberal friend’s links expressing unconditional allegiance to Hillary Clinton, but I’ll stop and like her video of a Great Dane puppy throwing a temper tantrum. This makes Facebook a powerful force for human connection, but a poor destination for political engagement. The Facebook Effect’s Kirkpatrick put a rosier spin on the situation: Now that we include Facebook on our list of legitimate news sources, he said, journalists can boast that young people are reading more news than ever before. The one catch: “They’re just reading about their friends.”