Does Reddit Have a Transparency Problem?

Its free-for-all format leaves the door open for moderators to game a hugely influential system.

Depending on how you look at it, Reddit is either a miracle or a waste pile. The site is one of the most popular on the Internet, attracting nearly 200 million unique users every month from around the world to its 8,000 topic-based communities (or “subreddits”). Alexa ranks Reddit as the 16th most popular site in the United States—ahead of Instagram, CNN, and Netflix—and the 47th worldwide. Owned by Condé Nast parent Advance Publications, Reddit has become the hub for news aggregation, discussion, and sharing between strangers, as well as a core promotional tool for the media. And Reddit has hosted a lot of great, loose discussion, from an endless stream of Ask Me Anything, or AMA, interviews to story-sharing AskReddit threads such as “What is the creepiest thing a child has ever said to you?” to everything you ever wanted to know about bitcoin. When President Obama did the biggest AMA of all time in 2012, he said it was “an example of how technology and the Internet can empower the sorts of conversations that strengthen our democracy over the long run.”

Ironically, though, Reddit itself is less a democracy than a fiefdom system of autocrats. The people who run subreddits are not accountable to the public, so you have incidents like the Fappening (in which links to stolen nude photos of celebrities were eagerly disseminated), the nasty reign of jailbait porn troll violentacrez, and the stalking of an innocent Brown student and his family after the r/FindBostonBombers subreddit misidentified him as a Boston Marathon bombing suspect. Even setting aside such crowdsourced scandals, Reddit faces more subtle problems of accountability. Reddit mostly goes unpoliced, preferring to react to scandals instead of preventing them, and that lack of transparency leaves the door open for corrupt or simply incompetent moderators to game an enormously influential system.

In its embrace of community moderation standards, Reddit resembles Wikipedia, but Wikipedia aims to speak in a single voice with a “neutral point of view” backed by “reliable sources”; users and editors publicly debate how well an article follows these ideals (and others) on its “talk” page. It’s not perfect, but Wikipedia deserves credit for working out a coherent set of best practices. Since Reddit aims not at any particular type of content but simply at discussion, anything goes. Reddit’s thousands of subreddit forums each police themselves under the auspices of self-appointed volunteer moderators and only the barest set of global rules: no spam, no doxing, no child pornography. “Every Man Is Responsible For His Own Soul,” Reddit CEO Yishan Wong wrote in a blog post explaining why Reddit had been reluctant to ban the r/TheFappening subreddit outright. (It then banned it anyway, since Reddit also reserves the right to do whatever it wants.)

Reddit’s problem is that beyond its basic rules, its operations are entirely opaque. Reddit embraces a core principle of the Wild West: finders keepers. Whoever starts a subreddit becomes its almighty moderator and appoints any other mods; within a subreddit, a moderator can arbitrarily remove comments and ban users from that subreddit—including “hellbanning” or “shadowbanning” users, which means they don’t even realize they’ve been banned because their comments continue to show up just for them. There are no disclosure requirements, and Reddit’s FAQ politely tells would-be reformers to go jump in a lake: “Moderators are free to run their subreddits however they so choose. … So consider making a new subreddit and shaping it the way you’d like rather than performing a sit-in and/or witch hunt.” If the mods of the massive subreddit r/technology have a No Auerbach policy and ban me and any mention of me, I don’t have much recourse other than to start r/TechnologyWithAuerbach and “compete” against them, hoping enough people share my outrage at their No Auerbach policy.
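
Reddit’s actual code isn’t public, so the following is only a minimal sketch of the shadowban idea described above; the Comment class and visible_comments function are hypothetical names invented for illustration.

```python
# Minimal sketch of the shadowban idea: a shadowbanned author's comments
# are hidden from everyone except that author, so the ban is invisible to
# its target. Reddit's real implementation is not public; these names are
# hypothetical.
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str

def visible_comments(comments, shadowbanned, viewer):
    """Return the comments a particular viewer actually sees."""
    return [c for c in comments
            if c.author not in shadowbanned or c.author == viewer]

thread = [Comment("alice", "Great link!"), Comment("bob", "Here's my take.")]
banned = {"bob"}

print([c.text for c in visible_comments(thread, banned, viewer="alice")])
# ['Great link!']  -- bob's comment silently disappears for everyone else
print([c.text for c in visible_comments(thread, banned, viewer="bob")])
# ['Great link!', "Here's my take."]  -- bob still sees his own comment
```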

Earlier this year, Reddit demoted the previously front-page r/technology after the Daily Dot reported that posts were automatically removed if their headlines contained certain words, including NSA, Bitcoin, net neutrality, Tesla, and Snowden. Reddit only took action after user sleuths charted the total absence of those words and after the Daily Dot published the results. The 3-million-user-strong r/politics had an uproar last year after moderators banned links from dozens of news sites deemed to represent “bad journalism,” including respectable sources like Mother Jones and Reason. Instead of going off and forming r/politics2 in protest, members raised a major stink and got the moderators to reconsider. The ban list is considerably shorter now, though it still includes sites like Salon and Boing Boing. As to the original motivations of the r/politics mods (quality control? political bias? a pro-Slate agenda?), that’s still an unanswered question. The Daily Dot’s chronicles of moderator power drama read like something out of The Sopranos or Boardwalk Empire.
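
The Daily Dot piece doesn’t spell out the sleuths’ exact method, but the kind of audit involved is simple enough to sketch: pull a subreddit’s recent headlines and count how often the suspect words appear. A rough, hypothetical version follows; the endpoint and field names reflect Reddit’s public JSON listing format, but the script itself is an illustration, not the sleuths’ actual tooling.

```python
# Rough sketch of the kind of audit described above: fetch a subreddit's
# recent headlines and count how often the suspect keywords appear. A
# persistent count of zero for words that dominate tech news elsewhere is
# the anomaly the user sleuths charted. Illustrative only.
import json
import urllib.request

KEYWORDS = ["nsa", "bitcoin", "net neutrality", "tesla", "snowden"]

def fetch_titles(subreddit, limit=100):
    # Reddit exposes listings as JSON by appending .json to the listing URL.
    url = f"https://www.reddit.com/r/{subreddit}/new.json?limit={limit}"
    req = urllib.request.Request(url, headers={"User-Agent": "keyword-audit/0.1"})
    with urllib.request.urlopen(req) as resp:
        listing = json.load(resp)
    return [post["data"]["title"].lower() for post in listing["data"]["children"]]

def keyword_counts(titles):
    return {kw: sum(kw in title for title in titles) for kw in KEYWORDS}

print(keyword_counts(fetch_titles("technology")))
```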

But what if users don’t know what’s going on? Last year, Reddit employee Marta Gossage announced that several moderators of hugely popular Reddit porn communities like r/nsfw were globally banned for allegedly accepting bribes from a spammer. Here’s the thing, though: Reddit only caught the dirty mods because they’d negotiated the bribes via private Reddit messages. Having proved that it can catch mods dumb enough to take payoffs over its own messaging system, Reddit has not provided much reassurance that smarter mods are squeaky clean. Over several years, Ian Miles Cheong became a moderator of several huge forums, including r/politics and r/AskReddit, then got banned not only for posting large amounts of promotional spam but also for having taken a job with the marketing agency of news site GlobalPost as a “social media consultant.” Again, not a Sherlock-level case: Plenty of moderators work in industries related to the subreddits they moderate for free, whether it’s stocks, marketing, porn, or Web hosting. Their communities will call them out if they’re obviously shilling, but is community policing sufficient? If I were a moderator on r/philosophy and Arthur Schopenhauer quietly paid me to play up his philosophy and play down Hegel’s, and I were careful about it, would anyone notice? If I banned a couple of vociferous Hegel fans while letting the Schopenhauer fans run wild, could people tell I was biased rather than just exercising good judgment?

It’s not even clear to Reddit that some of these behaviors are bad. What r/technology did in silently filtering headlines on certain topics wasn’t even against Reddit’s policies, even though it amounted to wide-scale suppression on one of the most popular subreddits. Reddit demoted r/technology not for squelching speech or for bias, but merely for lazy moderation (and for making Reddit look bad). And when users get banned from a subreddit, they have little recourse—and that’s only after they’ve figured out they’re banned.

Reddit’s success points to its value as a recreational community site that can generally stay out of trouble. Yet the clean mods—the vast majority, no doubt—still operate under a cloud of doubt (and user grief) because the system is so opaque. And since the above cases hardly exhaust what a smart mod could get away with, it seems only a matter of time before Reddit falls under much more suspicion.

Many moderators do their somewhat thankless jobs out of sheer love, but Reddit’s lack of transparency frequently makes their motivations difficult to divine. The deal Reddit offers mods is that in exchange for the drudgery of administering and policing a subreddit, they get unchecked power over it. That may seem like a fair or even generous trade, but it is guaranteed that some will try to abuse that power. (If you’re not convinced, read Federalist No. 51.)

There are a number of potential reforms available, which could be applied to all subreddits or just to the more popular or influence-prone ones. Moderator bans and link removals could be formally logged and publicized, either in aggregate or in detail, to better prevent even the appearance of conflicts of interest. Subreddits could be required to set explicit policies for content and moderation rather than making ad hoc decisions. And the company itself could take a greater role in leading a Wikipedia-like establishment of further standards. Anarchy could still be permitted in some forums, in much the same way that adult content is permitted only in certain forums. But as the above examples show, the largest subreddits, with millions of members, are becoming too central to public discourse to leave in the hands of capricious autocrats. Reddit’s reactive attitude toward its scandals offers little confidence that it’s ferreting out trouble. Reddit clearly does not want to mess with what seems to be working well, but if the stories above are, as I suspect, just the tip of the iceberg, Reddit will save itself a lot of trouble by embracing transparency now.

Update, Oct. 13, 2014: After this article was published, it was submitted to several subreddits. It was removed from r/politics (because it was deemed off-topic) and from r/news (no reason given). The case of r/technology is more curious: A reader told me over the weekend that links to my article had disappeared, and I was unable to find an r/technology link to my article on the afternoon of Oct. 12. Today, the link to my article is present, but Reddit is inconsistent about the article’s posting time and upvotes (e.g., 24 upvotes and posted the morning of Oct. 10, vs. 16 upvotes and posted the evening of Oct. 12). This pattern suggests either a bug or administrator intervention. The article has been heavily discussed on r/undelete, which tracks Reddit deletions, and on r/conspiracy.