
Facebook’s “Trending” Is the Worst Place on the Internet


The citizen’s guide to the future.
Oct. 22 2015 2:30 PM
FROM SLATE, NEW AMERICA, AND ASU


It knows me too well.

What Facebook’s “Trending” thinks I’m interested in.

Photo illustration by Lisa Larson-Walker. Photos by Getty Images.

On Wednesday morning, I logged onto Facebook and felt my eyes gravitate toward the right side of the screen, to the little square of real estate where Facebook lists the topics that are “trending” on the network at that very moment. At 10:29 a.m. EDT, the top three topics listed on my Facebook page were: “Channing Tatum: Actor Posts Photo of Him Dressed as Winnie-the-Pooh”; “Miley Cyrus: Singer Wears Star of David Outfit for Actor James Franco’s Bar Mitzvah Charity Event”; and “Kelly Rowland: Singer Shares Photos on Instagram of Her Son, Titan, From Trip to Pumpkin Patch.”

Facebook began experimenting with surfacing trending topics back in 2013, and the list has since become a fixture on the platform’s home page. Facebook now calls the feature “Trending,” with a capital T, and serves the list in the style of a journalistic news feed. Each viral dispatch Facebook deems Trending is distilled into an objective-sounding headline (“Ellen Pompeo: Actress Says on Social Media That Daniel Craig ‘Needs a Reality Check’ ”) and accompanied by a scientific-looking piece of digital flair—an ascending arrow symbol rendered in Facebook blue. But unlike the story feeds found on legacy news sites like CNN.com or NYTimes.com, Facebook’s stories are algorithmically calibrated to appeal to each individual user’s interests. The platform says the system “is personalized based on a number of factors, including Pages you’ve liked, your location and what’s trending across Facebook,” but it won’t volunteer every ingredient in its secret sauce. The user doesn’t know everything that the algorithm knows about her.


This much is clear: The algorithm knows that I am a bad person. Every time I log onto Facebook, the site serves me a selection of news items about celebrities behaving mildly provocatively (Kelsey Grammer wears an anti-abortion T-shirt), obscure reality television stars celebrating dubious milestones (Farrah Abraham gets her third boob job), and gruesome fates befalling real Americans (Maryland cop bites a guy in the balls). Facebook has sifted through my Web history and determined that I will consume the most worthless, voyeuristic, life-wasting online content available. Facebook says it shifts its algorithm based on user feedback, but every time I tell the site to stop showing me a celebrity PR stunt, it finds some other awful thing it knows will catch my eye. Yesterday, I told Facebook to hide a story about an unretouched photo Zendaya recently posted to Instagram. (“Why did you hide it?” Facebook asked me; I perused a list of preset responses and chose: “I don’t care about this.”) When Zendaya vanished, an eighth-grade student in Arkansas who died after passing out in gym class appeared in her place.

Facebook’s Trending algorithm is also sophisticated enough to determine that my colleague Dan Kois is slightly less worthless than I am. Here are the stories Facebook served each of us at about 11:30 EDT Thursday morning:

Screenshot: The Trending topics Facebook served the author and Dan Kois.

Facebook is not wrong about me. I am a true crime obsessive, and I’ve never met a janky pop-culture nostalgia slideshow I didn’t click through until the bitter end. But something about Trending’s quasi-journalistic sheen makes me feel more ashamed of my Internet vices than I ever was before. Most of the time, when I waste my life clicking mindlessly through the Web, it feels like the culture of the Internet is responsible for pulling me down dark alleyways filled with garbage content. But Trending’s personal touch implicitly blames me for clicking on the bait. It’s one thing to mindlessly scroll onto Channing Tatum’s Instagram feed and be confronted with a photograph of him dressed like a cartoon character. It’s quite another for the Internet to present the photograph to you as if it’s a thoughtful gift: We saw this and thought of you. And when an actually relevant news story appears on the Trending list—yesterday afternoon, an item about Corey Jones, a black man shot by a Florida cop after his car broke down, hit the No. 2 spot on my feed—the flash of meaning only heightens the mindlessness that surrounds it. By early Thursday morning, Jones had fallen off my feed, but Channing Tatum dressed as a cartoon bear was still No. 1.

The deeper you dive into the Trending page, the more distressing the situation seems. Users who click on a trending topic are whisked to a new Facebook page populated with posts from strangers, companies, and news outlets engaging with the keyword of the moment. Sometimes, the algorithm struggles to find any posts relevant to the subject at hand: When I clicked on a recent headline marking the 20th anniversary of ’90s teen flick Now and Then, I was shuttled to a page populated with news about the Beatles’ “Revolution” becoming available on VEVO and Victoria’s Secret model Candice Swanepoel celebrating her 27th birthday—probably because the movie title is too banal for the algorithm to reliably detect.

Even more unsettling is when you click a Trending link and are confronted with a seemingly endless feed of virtually indistinguishable news reports on the same nonevent. Take the tale of the corgi puppy barking at a mini pumpkin, which has been drifting in and out of my Trending list for days. Elect to read more about the story, and you’ll find the same video reproduced over and over again, paired with banal sentiments like “This may very well be the cutest video you’ll watch all morning” and “I guess not everyone likes pumpkins!” As Facebook aggregates “trending” news stories and news outlets compete to publish stories that conform to Facebook’s “trending” topic, the viral content begins to feast on itself. One Connecticut CBS affiliate saw the corgi video trending online, aired it on television in a segment called “Trending Now,” then posted footage of the segment to Facebook. From there, the news story about the trending story populated Facebook’s Trending stream. Sometimes, the feedback loop circulates for so long that stories keep Trending after they’ve been sapped of all their relevance; this morning, some users were still being encouraged to celebrate Back to the Future Day, which was yesterday.

In some respects, Facebook’s Trending feature says more about the platform’s interests than its users’ passions. Yesterday’s top stories about Tatum, Cyrus, and Rowland were each posted directly to Instagram by the subjects themselves. Facebook owns Instagram. And when Facebook isn’t pointing users to another of its properties, it’s encouraging them to never leave at all. Just after the Democratic debate concluded the evening of Oct. 13, Facebook alerted me to three Trending “stories.” The first story was about a YouTube personality who “pranked” his girlfriend by pretending to blow up their 3-year-old son in a horrific explosion. The second story was about an Oregon guy who found a dead mouse in his Subway sandwich. The third was about a YouTube video of a rat fighting a pigeon originally uploaded over a year ago. None of these stories required the Facebook user to actually exit the platform or process any text—the media showing off the dead mouse, the combative rat, and the terrible father were all hosted on Facebook itself. Facebook isn’t incentivized to show users the type of content they want to see the most. It’s incentivized to show users the types of content that will keep them on Facebook.

Recently, Facebook acknowledged that its algorithms allow each user only so much control over the content that appears on her home page. The algorithm that populates a user’s central News Feed with posts from her friends was recently recalibrated to override the user’s stated preferences. In some cases, Facebook said, the system knows the user better than she knows herself. “Hiding something is usually a strong indication that someone didn’t want to see a particular post,” Facebook engineers informed users in July. But for those who elect to “hide almost every post in their News Feed, even after they’ve liked or commented on posts,” the algorithm will no longer “take ‘hide’ into account as strongly as before,” they wrote. So if a user likes a post about a celebrity spawn visiting a pumpkin patch, comments on a post about a celebrity spawn visiting a pumpkin patch, and then realizes, “Wow, I hate that I just did that,” she can instruct Facebook to show her less life-wasting content in the future, but the Facebook algorithm will not necessarily comply. All that’s left is the illusion that she’s making an effort to Internet better.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.