The End of the Echo Chamber

A study of 250 million Facebook users reveals the Web isn’t as polarized as we thought.

Today, Facebook is publishing a study that disproves some hoary conventional wisdom about the Web. According to this new research, the online echo chamber doesn’t exist.

Illustration by Alex Eben Meyer.

This is of particular interest to me. In 2008, I wrote True Enough, a book that argued that digital technology is splitting society into discrete, ideologically like-minded tribes that read, watch, or listen only to news that confirms their own beliefs. I’m not the only one who’s worried about this. Eli Pariser, the former executive director of MoveOn.org, argued in his recent book The Filter Bubble that Web personalization algorithms like Facebook’s News Feed force us to consume a dangerously narrow range of news. The echo chamber was also central to Cass Sunstein’s thesis, in his book Republic.com, that the Web may be incompatible with democracy itself. If we’re all just echoing our friends’ ideas about the world, is society doomed to become ever more polarized and solipsistic?

It turns out we’re not doomed. The new Facebook study is one of the largest and most rigorous investigations into how people receive and react to news. It was led by Eytan Bakshy, who began the work in 2010 when he was finishing his Ph.D. in information studies at the University of Michigan. He is now a researcher on Facebook’s data team, which conducts academic-type studies into how users behave on the teeming network.

Bakshy’s study involves a simple experiment. Normally, when one of your friends shares a link on Facebook, the site uses an algorithm known as EdgeRank to determine whether or not the link is displayed in your feed. In Bakshy’s experiment, conducted over seven weeks in the late summer of 2010, a small fraction of such shared links were randomly censored—that is, if a friend shared a link that EdgeRank determined you should see, it was sometimes not displayed in your feed. Randomly blocking links allowed Bakshy to create two different populations on Facebook. In one group, someone would see a link posted by a friend and decide either to share or to ignore it. People in the second group would not receive the link—but if they’d encountered it somewhere beyond Facebook, they might share that same link of their own accord.
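
To make the design concrete, here is a minimal sketch in Python of how randomly withholding feed-eligible links splits users into exposed and unexposed groups. The function names, the 5 percent withholding rate, and the toy data are all invented for illustration; the actual EdgeRank pipeline and the study’s sampling are far more elaborate.

```python
import random

def assign_exposure(eligible_displays, withhold_prob=0.05, seed=0):
    """Randomly withhold some (user, link) displays that the feed
    algorithm had already approved.

    eligible_displays: list of (user_id, url) pairs the feed would
    normally show. Returns the "feed" group (link shown) and the
    "no-feed" control group (link withheld).
    """
    rng = random.Random(seed)
    shown, withheld = [], []
    for user_id, url in eligible_displays:
        if rng.random() < withhold_prob:
            withheld.append((user_id, url))  # control: story never appears
        else:
            shown.append((user_id, url))     # treatment: story appears
    return shown, withheld

# Toy usage: three users, two feed-worthy links apiece.
displays = [(u, l) for u in ("alice", "bob", "carol") for l in ("url1", "url2")]
feed_group, no_feed_group = assign_exposure(displays)
```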

By comparing the two groups, Bakshy could answer some important questions about how we navigate news online. Are people more likely to share information because their friends pass it along? And if we are more likely to share stories we see others post, what kinds of friends get us to reshare more often—close friends, or people we don’t interact with very often? Finally, the experiment allowed Bakshy to see how “novel information”—that is, information that you wouldn’t have shared if you hadn’t seen it on Facebook—travels through the network. This is important to our understanding of echo chambers. If an algorithm like EdgeRank favors information that you’d have seen anyway, it makes Facebook an echo chamber of your own beliefs. But if EdgeRank pushes novel information through the network, Facebook becomes a beneficial source of news rather than just a reflection of your own small world.

That’s exactly what Bakshy found. His paper is heavy on math and network theory, but here’s a short summary of his results. First, he found that the closer you are with a friend on Facebook—the more times you comment on one another’s posts, the more times you appear in photos together, etc.—the greater your likelihood of sharing that person’s links. At first blush, that sounds like a confirmation of the echo chamber: We’re more likely to echo our closest friends.

But here’s Bakshy’s most crucial finding: Although we’re more likely to share information from our close friends, we still share stuff from our weak ties—and the links from those weak ties are the most novel links on the network. The links from your weak ties, that is, are the ones most likely to point to information that you would not have shared if you hadn’t seen it on Facebook. The links from your close ties, meanwhile, are more likely to contain information you would have seen elsewhere even if a friend hadn’t posted it. These weak ties “are indispensable” to your network, Bakshy says. “They have access to different websites that you’re not necessarily visiting.”

The fact that weak ties introduce us to novel information wouldn’t matter if we only had a few weak ties on Facebook. But it turns out that most of our relationships on Facebook are pretty weak, according to Bakshy’s study. Even under the most lax definition of a “strong tie”—someone from whom you’ve received a single message or comment—most people still have far more weak ties than strong ones. And this means that, considered in aggregate, our weak-tie friends—with their access to novel information—are collectively the most influential people in our networks. Even though we’re more likely to share any one thing posted by a close friend, we have so many more mere acquaintances posting stuff that our close friends are all but drowned out.
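
A quick back-of-the-envelope calculation shows how that aggregate effect works. The tie counts and probabilities below are invented for illustration, not taken from Bakshy’s paper, but they capture the logic: a much lower per-link sharing rate can still dominate when weak ties vastly outnumber strong ones.

```python
# Illustrative numbers only; not Bakshy's actual estimates.
strong_ties, weak_ties = 10, 500                 # networks skew heavily weak
p_share_strong, p_share_weak = 1 / 50, 1 / 300   # per-link reshare probabilities

# Expected reshares if each tie posts one link:
print(strong_ties * p_share_strong)  # 0.2 reshares from strong ties
print(weak_ties * p_share_weak)      # ~1.67 reshares from weak ties
```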

In this way, Bakshy’s findings complicate the echo chamber theory. If most of the people we encounter online are weak ties rather than close friends, and if they’re all feeding us links that we wouldn’t have seen elsewhere, this suggests that Facebook (and the Web generally) isn’t simply confirming our view of the world. Social networks—even if they’re dominated by personalization algorithms like EdgeRank—could be breaking you out of your filter bubble rather than reinforcing it.

Bakshy’s work shares some features with previous communications studies on networks, and it confirms some long-held ideas in sociology. (For instance, the idea that weak ties can be important was first floated in a seminal 1973 study by Mark Granovetter.) It also confirms a few other recent studies questioning the echo chamber, including the economists Matthew Gentzkow and Jesse Shapiro’s look at online news segregation.

Facebook CEO Mark Zuckerberg. Photo by Justin Sullivan/Getty Images.

But there are two reasons why Bakshy’s research should be considered a landmark. First, the study is experimental, not merely observational. Bakshy wasn’t just watching how people react to news shared by their friends on Facebook. Instead, he was able to actively game the News Feed to create two different worlds: one in which people get a certain piece of news, and another in which statistically identical people do not. In this way, his study is like a clinical trial: There’s a treatment group that’s subjected to a certain stimulus and a control group that is not, and Bakshy calculated the differences between the two. This allows him to draw a causal relationship between seeing a link and acting on it: If users who were shown a link went on to share it more often than statistically identical users from whom it was withheld, the difference can be attributed to the Facebook feed itself.
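
In code, that comparison amounts to a difference in sharing rates between the two groups. This is a simplified reading of the design, with made-up numbers; the paper’s actual analysis is considerably more involved.

```python
def feed_effect(shares_shown, n_shown, shares_withheld, n_withheld):
    """Estimated causal effect of feed exposure on resharing: the share
    rate among users shown a link minus the rate among statistically
    identical users from whom it was withheld."""
    return shares_shown / n_shown - shares_withheld / n_withheld

# Toy numbers: exposure raises the probability of resharing.
print(feed_effect(shares_shown=190, n_shown=1000,
                  shares_withheld=2, n_withheld=1000))  # 0.188
```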

The other crucial thing about this study is that it is almost unthinkably enormous. At the time of the experiment, there were 500 million active users on Facebook. Bakshy’s experiment included 253 million of them and more than 75 million shared URLs, meaning that in total, the study observed nearly 1.2 billion instances in which someone was or was not presented with a certain link. This scale is unheard of in academic sociological studies, which usually involve hundreds or, at most, thousands of people communicating in ways that are far less trackable.

At the same time, there’s an obvious problem with Bakshy’s study: It could only occur with the express consent of Facebook, and in the end it produced a result that is clearly very positive for the social network. The fact that Facebook’s P.R. team contacted me about the study and allowed me to interview Bakshy suggests the company is very pleased with the result. If Bakshy’s experiment had come to the opposite conclusion—that, say, the News Feed does seem to echo our own ideas—I suspect the company wouldn’t be publicizing it at all. (Bakshy told me that he has “a good amount of freedom” at the company to research whatever he wants to look into about the social network, and that no one tells him what to investigate and what to leave alone. The study is being submitted to peer-reviewed academic journals.)

Also, so as not to completely tank the ongoing sales of my brilliant book, I’d argue that Bakshy’s study doesn’t absolve the modern media of other charges that it’s distorting our politics. For one thing, while it shows that our weak ties give us access to stories that we wouldn’t otherwise have seen, it doesn’t address whether those stories differ ideologically from our own general worldview. If you’re a liberal but you don’t have time to follow political news very closely, then your weak ties may just be showing you lefty blog links that you agree with—even though, by the study’s definition, those links would have qualified as novel information. (Bakshy’s study covered all links, not just links to news stories; he is currently working on a follow-up that is more narrowly focused on political content.)

What’s more, even if social networks aren’t pushing us toward news that confirms our beliefs, there’s still the question of how we interpret that news. Even if we’re all being exposed to a diverse range of stories, we can still decide whose spin we want—and then we go to the Drudge Report or the Huffington Post to get our own views confirmed.

Still, I have to say I’m gratified by Bakshy’s study. The echo chamber is one of many ideas about the Web that we’ve come to accept in the absence of any firm evidence. The troves of data that companies like Facebook are now collecting will help add some empirical backing to our understanding of how we behave online. If some long-held beliefs get overturned in the process, then all the better.