The Big Idea

Bubble Trouble

Is Web personalization turning us into solipsistic twits?

Jacob Weisberg

The first conversation I ever had about the Internet was in 1993 with Robert Wright, who was then a colleague at the New Republic. This “Net” thing was going to be a big deal, I remember Bob telling me, but it could create a few problems. One was that it was going to empower crazies, since geographically diffuse nut jobs of all sorts would be able to find each other online. Another was that it could hurt democratic culture by encouraging narrow-minded folk to burrow deeper into their holes. Wright spelled out those concerns in an article that stands as a model of prescience and a delightful time capsule. (“People who ‘post’ on the Net’s many different bulletin boards—its ‘newsgroups’—know that their words can be seen from just about any chunk of inhabited turf on this planet.”)

Eighteen years later, our lingo has evolved, but the worries haven’t changed much. Wright’s first concern, about digital technology empowering terrorists and fanatics, has clearly been borne out. His second, about the Internet fostering mental rabbit warrens, remains an open issue. In his new book, The Filter Bubble, Eli Pariser, the former director of the liberal activist group MoveOn.org, argues that an informational dystopia is finally arriving. Thanks to advances in personalization, we are all getting more of what we like and agree with, and less of what challenges our beliefs. Pariser sees these tools undermining civic discourse. “The filter bubble pushes us in the opposite direction,” he writes. “It creates the impression that our narrow self-interest is all that exists.” The loss of an informational commons, he frets, is making us closed-minded, less intellectually adventurous, and more vulnerable to propaganda and manipulation. Pariser’s qualms echo those expressed by Nicholas Negroponte and Cass Sunstein, who have warned about the Web turning into everybody’s narcissistic “Daily Me” feed.

The dark side to personalization has special relevance to those of us working at the intersection of journalism and technology. While the Web has provided consumers with a means to individualize their commerce and entertainment choices, it hasn’t, until recently, done so with news per se. But investment is now flowing into just this kind of personalization filter. The Washington Post Co., which owns Slate, recently launched Trove, “a personalized news and information engine,” with which I’ve been peripherally involved. The New York Times Co. has News.me, a subscription-based “personalized news service.” Flipboard and Zite, which create personalized “magazines” for tablets based on your Facebook and Twitter feeds, are new Silicon Valley darlings.

Extrapolating from all this activity, and from expanding efforts to customize search and social media experiences online, it’s now possible to imagine a world in which every person creates his own mental fortress and apprehends the outside world through digital arrow-slits. But is this long-standing theoretical fear becoming an actual problem in our society? Pariser’s favorite factoid, with which he starts his book, is that Google now personalizes search results according to 57 different signals, even if you’re not logged in through a Gmail account. You’d think someone worried about the hazards of personalization would pay more attention to Facebook.

Pariser believes that Google’s 57 varieties include, or amount to, ideological frames. Last year, he says, he asked two women friends who share liberal political views to search the term “BP.” One woman saw investment information about the company. The other saw news about the oil spill. Amazingly, this single anecdote is all he offers by way of support for the central claim of his book.

Doubting the accuracy of Pariser’s assertion, I asked a few of my Twitter followers and Facebook friends to search four terms that seemed likely to show ideological fragmentation: “John Boehner,” “Barney Frank,” “Ryan Plan,” and “Obamacare.” My five volunteers were:

  • Tom, my Republican cousin-in-law who works on Wall Street
  • Jake, who says he’s an independent and works as an insurance consultant in Dubuque, Iowa
  • Steven, a moderate Democrat and small business owner in Royal Oak, Mich.
  • Pat, a former Slate developer, who is liberal and lives in Chicago 
  • Fred, an old college roommate, who works in transportation and says he’s a “left-of-democrat quasi-socialist”

There were only minor discrepancies in the screenshots they sent back for these queries. The insurance consultant from Dubuque got Wikipedia entries for the two congressmen ahead of their own official websites, while all the others got the official sites first. But none of the minor variations aligned in any apparent way with anyone’s political views. For Boehner, for instance, all of the testers—and I—got the same hostile site as the fifth result.
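(A side note for the technically minded: this kind of eyeball comparison can be made precise. Below is a minimal Python sketch that scores the overlap between two testers’ result lists. The tester labels and URLs are invented placeholders, not the actual screenshots my volunteers sent.)

```python
from itertools import combinations

def jaccard(a, b):
    """Overlap between two result sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical top-5 results for one query from two testers. Real data
# would come from the screenshots the volunteers sent back.
results = {
    "tester_a": [
        "johnboehner.house.gov",
        "en.wikipedia.org/wiki/John_Boehner",
        "speaker.gov",
        "twitter.com/johnboehner",
        "hostile-site.example.com",
    ],
    "tester_b": [
        "en.wikipedia.org/wiki/John_Boehner",  # Wikipedia ranked first here
        "johnboehner.house.gov",
        "speaker.gov",
        "twitter.com/johnboehner",
        "hostile-site.example.com",
    ],
}

for (name_a, list_a), (name_b, list_b) in combinations(results.items(), 2):
    print(f"{name_a} vs {name_b}: overlap = {jaccard(list_a, list_b):.2f}")
```

Because Jaccard overlap ignores order, these two lists score a perfect 1.00 even though Wikipedia outranks the official site for one tester, which is exactly the kind of minor, apolitical discrepancy my volunteers turned up.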

Google’s response to this, when I asked for comment, was a statement about the need to balance personal relevance and diversity. “We actually have algorithms in place designed specifically to limit personalization and promote variety in the results page,” a spokesman emailed me. Independent analysts aren’t seeing a problem, either. Jonathan Zittrain, a professor of law and computer science at Harvard, who studies Web censorship, agrees that Google isn’t doing what Pariser says it is. “In my experience, the effects of search personalization have been light,” he told me. It is true, of course, that if you consistently click on search results from any news source—whether Fox News or the New York Times—that source will rise in your rankings, just as “liking” something or clicking on items in your Facebook news feed increases the likelihood of more of the same turning up. But in the past 15 years, fears that we’re all feeding at the trough of a “Daily Me” haven’t gotten much closer to reality.
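(How does that click feedback work? The toy Python sketch below is my own illustration, not Google’s or Facebook’s actual algorithm, and the boost factor is an arbitrary assumption. It simply shows how repeatedly clicking one source can nudge that source up your rankings.)

```python
from collections import defaultdict

class ClickBiasedRanker:
    """Toy re-ranker: each click on a source slightly boosts that source
    in future rankings. An illustration only, not any real engine's code."""

    def __init__(self, boost=0.1):
        self.boost = boost                # hypothetical per-click boost
        self.clicks = defaultdict(int)    # source -> accumulated clicks

    def record_click(self, source):
        self.clicks[source] += 1

    def rank(self, results):
        # results: list of (source, base_relevance) pairs. A source's
        # effective score grows with the clicks it has accumulated.
        return sorted(
            results,
            key=lambda r: r[1] * (1 + self.boost * self.clicks[r[0]]),
            reverse=True,
        )

ranker = ClickBiasedRanker()
query_results = [("nytimes.com", 0.90), ("foxnews.com", 0.85)]

print(ranker.rank(query_results))    # base relevance wins: nytimes.com first
for _ in range(3):                   # a reader who always clicks Fox News...
    ranker.record_click("foxnews.com")
print(ranker.rank(query_results))    # ...now sees foxnews.com first
```

Note the feedback loop: every click makes the same source more prominent next time, which is the mechanism Pariser worries about, and the kind of drift Google says its diversity-promoting algorithms are designed to counterbalance.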

Why haven’t the geniuses of the Web figured out how to personalize news? The answer is that it’s a very tough problem. When it comes to movies or songs, algorithms can compare data about my preferences against a large corpus of material that stays mostly constant from day to day. But news changes constantly. To predict what news I want, an algorithm has to gauge my interest in events that haven’t happened yet. Human editors are still way better than machines at this task, and I’m not betting on Watson to defeat Jill Abramson anytime soon. Better algorithmic personalization may come to supplement human curation, but I seriously doubt it will supplant it.
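(In engineering terms, this is the cold-start problem. The sketch below, under my own assumptions and with made-up tags, shows why: a recommender can score a new film against a user’s stable history, but a breaking story has no history for anyone, so the same machinery draws a blank.)

```python
def recommend(candidate_tags, liked_items):
    """Score a candidate by average tag overlap with items the user
    already liked. A toy stand-in for content-based recommendation."""
    if not liked_items:
        return 0.0
    overlaps = [
        len(candidate_tags & tags) / len(candidate_tags | tags)
        for tags in liked_items
    ]
    return sum(overlaps) / len(overlaps)

# Movies: a stable catalog, so the user's history maps onto new candidates.
liked_films = [{"thriller", "heist"}, {"thriller", "noir"}]
print(recommend({"thriller", "heist", "noir"}, liked_films))      # ~0.67: usable signal

# News: yesterday's clicks say little about a story that didn't exist
# yesterday, so the candidate's tags barely intersect any history.
liked_stories = [{"congress", "budget"}, {"congress", "healthcare"}]
print(recommend({"tsunami", "japan", "nuclear"}, liked_stories))  # 0.0: no signal
```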

Pariser is also dead wrong, it seems to me, in assuming that personalization narrows our perspectives rather than broadening them. Through most of history, bubbles have been imposed involuntarily. Not so long ago, most Americans got their news primarily through three like-minded networks and local newspapers that reflected a narrow consensus. With something approaching the infinite choices on the Web, no one has to be limited in this way. Why assume that when people have more options, they will choose to live in an echo chamber? A couple of studies have shown, for instance, that conservative and liberal bloggers link to each other to a surprising degree. If you want to get all your news from Glenn Beck and right-wing talk radio, you can do that, too. But my own experience is that personalization, where it works effectively, means more diversity of sources and views. Thanks to Twitter, I learn about the revolutions in the Middle East via Arab activists and writers, not just from American foreign correspondents.

If our society is experiencing a “filter bubble” at the moment, it’s probably a financial rather than intellectual one, as too much investment is directed at tools to manipulate content, and not enough at publishers who create it. But if you’re losing sleep about Google giving you a skewed, blinkered perspective on the world, there’s a very simple solution. You can shut the customization feature off.