Jurisprudence

Get Stolen Naked Photos Off the Web

Congress needs to change the law to force Reddit and 4Chan to do the right thing.

Current interpretations of the law have given a free pass to the sites that enable hackers and trolls seeking to post stolen images.

Photo by icafreitas/Thinkstock

Here’s a contrast that makes no sense to me. Every day, movie and TV producers succeed in getting videos that have been posted without their consent taken down from major websites. Sure, you can still find pirated stuff if you look hard enough. But the big sites take down content once they know it’s been posted in violation of copyright. Because if they don’t, they’ll be sued—and no one will care if they defend the publication of stolen materials, in the name of free speech or otherwise.

Yet in the days since Jennifer Lawrence and other celebrities discovered that their nude images were stolen, and then posted or linked to without their consent on sites like Reddit and 4Chan, the stars can’t get the images taken down. They’ve made it clear they didn’t agree to these photos getting out. They’ve said the publication of the photos is a violation of their privacy. They’re victims, allegedly, of illegal hacking. Yet knowing all that, Reddit and 4Chan can continue to host the photos, or link after link to them, with apparent impunity.

This is crazy. Why should it be easy to take down Guardians of the Galaxy and impossible to delete stolen nude photos? If you think this is all about the hallowed First Amendment and the right to free speech, you’re wrong. If Reddit and 4Chan were forced to defend themselves on First Amendment grounds, JLaw would win. But they may not be forced to defend themselves at all, because Section 230 of the 1996 Communications Decency Act, or at least the courts’ sweeping reading of it, arguably allows them to publish these photos legally.

So, here’s the good news: Congress could go ahead, any time, and change Section 230 to give victims of involuntary porn the tools they currently lack to go after the sites that are profiting from their misery. Or the Supreme Court—which has never spoken directly on the matter—could reject the lower courts’ broad interpretations of the law. Either way, it’s past time to recognize that Section 230 has turned into a free pass for the sites that enable the hackers and the trolls. And as I’ve written before, fighting back—especially against porn that’s posted without consent—will not break the Internet.

To be clear, what I’m talking about is different from criminally prosecuting hackers. That’s perfectly justified, too, and as Amanda Hess deftly explains, a dozen states now have criminal statutes that can be used to prosecute hackers and trolls who post involuntary porn. These laws began passing after a man named Christopher Chaney hacked into the accounts of two women he knew, plus Scarlett Johansson and Mila Kunis. (Plenty of ordinary women suffer from involuntary or revenge porn, but legislators tend to pay attention to famous victims. This is deplorable, or useful, or both.) I’m glad that Chaney was sentenced to prison—let’s spread that news far and wide in hopes of deterring other creeps like him. But as this latest episode of celebrity exploitation makes clear, the threat of criminal prosecution is not enough. As long as sites like Reddit and 4Chan can support the hackers by distributing or circulating links to their ill-gotten gains—and rake in the traffic—the hackers will keep hacking, the photos will keep circulating, and the victims will find themselves with little power to stop any of it.

Here’s the current state of the law: Section 230, as the courts have read it, means that Internet service providers and many websites that allow users to self-post aren’t usually liable for their users’ content, even if it’s defamatory or privacy-invading. They don’t have to patrol for porn photos, and they don’t even have to take them down when the subject complains. It’s the second part that is maddening. Think about it: If these were photos of you, or your daughter or sister or wife, wouldn’t you want to sue the pants off a website that refused to take them down?

Instead, we are putting up with a rule that protects Reddit and 4Chan at the victims’ expense. I see two reasons for that. The first is the shaming of the subjects of nude photos that is still all too common. It’s not shameful to have photos like these or to privately share them. It’s shameful to steal them. But that’s a social lesson we’re frustratingly slow to learn.

The second rationale behind protecting websites when they host involuntary porn sounds more high-minded. It’s called the heckler’s veto, and it’s the fear that if any complaint is enough to delete involuntary porn, the Web will wither from censorship. You can hear this in a quote in Wednesday’s New York Times from Jillian York, a director at the Electronic Frontier Foundation: “While a rule against hate speech might prevent rape threats, it could also stifle political speech.”

There are other, blurry-line cases in which offensive content is worth protecting. But posting nude photos without consent has nothing to do with political speech, or any other kind of speech that’s of value. As law professors Danielle Citron and Neil Richards argue in a forthcoming piece, “The public has no legitimate interest in seeing someone’s nude images without that person’s consent—celebrities included.” Where involuntary porn is concerned, privacy rights easily outweigh the public’s right to know and to see.

And yet, civil libertarians like York reflexively argue that the best approach is to encourage sites like Reddit, YouTube, Facebook, and Twitter to self-police. Right, because they’ve shown themselves to be so responsible. This latest episode is just one more example in a long-running pattern: The sites won’t clean themselves up until they have a financial and legal incentive to do so. The Reddit forum that is hosting link upon link to images from the celeb hack is now scrambling to delete links to photos taken when subjects, like Olympic gymnast McKayla Maroney, were underage. That’s because Reddit doesn’t want to run afoul of child pornography laws. It’s not that Reddit is too shady to care about the law. It’s that there is no clear legal risk in continuing to host involuntary porn of adults.  

Here’s more on the legal ins and outs, from Vox, including the important point that Google should also be on the hook here. If the company wanted to bury the search results for involuntary porn, it could, and it has done that with other kinds of content, like mug shots. But for now, Google gets to pick and choose which content to highlight and which to demote. The bottom line is this: When websites know they’ll be socked with lawsuits that they’ll lose, they will take these photos down.

This spring, one of my students at Yale Law School, Sopen Shah, wrote a paper about Section 230. She pointed out that when Congress passed the Communications Decency Act back in the 1990s, it had reason to worry about “stifling the Internet’s potential growth.” Nearly 20 years later, Shah argues, we are long past that. She’s right. It’s time to extend to victims of involuntary porn the kind of protection that copyright holders already enjoy. These images are clear invasions of privacy, and there’s no hard judgment call about whether publishing them has value (as there might be for, say, a celebrity’s stolen emails). Once a site like 4Chan is on notice that it is hosting nude or sexual images that a star like Jennifer Lawrence—or a person who is not famous at all—says she didn’t consent to distributing, the law should give that site every reason to take the photos or video down. This will not put free speech or the free Internet at risk. It will just give solace to people who clearly deserve it. And it will push viewers of porn to look at naked pictures of people who have consented to appear in them. Which are the opposite of hard to find.

Correction, Sept. 4, 2014: In several sentences, this piece originally misstated that Reddit “hosted” images from the celebrity photo hack. In fact, Reddit users posted links to the photographs, which were hosted on other sites, like Imgur.