Future Tense

Facebook’s Tone-Deaf Plan to Tackle Revenge Porn by Having Victims Upload Nude Photos

A Facebook employee holds a laptop with a “like” sticker on it during an event at Facebook headquarters. (Justin Sullivan/Getty Images)

Update, May 22, 2018: Facebook is still running a variation of this pilot program, as it explains here.

In March, a private Facebook group of Marines with nearly 30,000 members was outed for hosting hundreds, potentially thousands, of explicit photos of female Marines and veteran service members, shared without their consent. Only men were invited to join the group, called Marines United, and members urged one another to share more photos of women, egging each other on with lewd comments. One Marine in the group suggested under a photo of a woman that the person who took the picture should “take her out back and pound her out,” according to a report from the Center for Investigative Reporting.

Now Facebook wants to make it harder for men to post nude photos of women without their consent, a practice often referred to as revenge porn, since the men who do this are often ex-partners attempting to damage a woman’s reputation or to impress their peers.

Facebook’s latest answer is to ask women to upload naked pictures of themselves to Facebook via Messenger. Facebook’s artificial intelligence software would then analyze each image and assign it a unique digital fingerprint, or hash, the Australian Broadcasting Corporation reported last week.

“They’re not storing the image. They’re storing the link and using artificial intelligence and other photo-matching technologies,” Australian eSafety Commissioner Julie Inman Grant told the Australian Broadcasting Corporation. “So if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded.”
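
Facebook hasn’t published the technical details, but the “hash value” Inman Grant describes usually refers to a perceptual hash: a short fingerprint that stays stable under minor edits such as resizing or recompression. Here is a minimal sketch of how that kind of matching could work, using the open-source Python ImageHash library; the function names, storage, and distance threshold are illustrative assumptions, not Facebook’s actual system.

```python
# A minimal sketch of hash-based photo matching, not Facebook's actual
# (unpublished) implementation. A perceptual hash survives small edits
# such as resizing or recompression, which a cryptographic hash would not.
import imagehash
from PIL import Image

# Hypothetical store of fingerprints derived from reported images.
# Only these short hashes are kept, never the photos themselves.
reported_hashes: list[imagehash.ImageHash] = []

def register_reported_image(path: str) -> None:
    """Fingerprint a reported image and keep only the hash."""
    reported_hashes.append(imagehash.phash(Image.open(path)))

def should_block_upload(path: str, max_distance: int = 5) -> bool:
    """Block an upload whose hash is within max_distance bits (Hamming
    distance) of any reported hash; the threshold is a tunable guess."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - h <= max_distance for h in reported_hashes)
```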

Facebook hasn’t written a public blog post announcing this new pilot program. And it’s not clear whether it works with only a single photo at a time, meaning you’d have to upload every photo you are afraid will be leaked in order for it to work. Nor is it clear whether you could upload images after someone has started to share them online to stop them from spreading further. Slate reached out to the company for clarification, and we will update once we hear back. (Update, Nov. 9, 2017: Facebook has now released a blog post describing the pilot program, and Alex Stamos, Facebook’s chief security officer, has addressed some concerns about the project on Twitter.)

Facebook is partnering with the Australian eSafety Commissioner, a federal agency for online safety education, to test this new approach. If someone contacts eSafety with a complaint, Inman Grant said, the agency may recommend they try Facebook’s new nude photo–blocking algorithm to prevent future nonconsensual sharing of nudes on the social network. Australia is one of four countries Facebook is working with on this pilot program; the other three have not been named publicly.

Facebook’s goal here is laudable. But there are some clear problems. For one thing, if someone gained access to a victim’s Facebook account (say, because she was still logged in or her password was stored in the browser), could the uploaded image be retrieved? Inman Grant says Facebook won’t store the images, but few things online are ever truly, permanently deleted.

More fundamentally, though, asking women who have been victims to upload naked photos of themselves is a rather tone-deaf approach, one that’s not particularly trauma-informed. When a naked photo of a person is circulated without her consent, it can be ruinous emotionally and professionally. Requesting that women relive that trauma and trust Facebook, of all companies, to hold that photo in safekeeping is a big ask.

Facebook, after all, is one of the primary places where these images are shared without consent. The company has been sued multiple times by women for hosting revenge porn, including a case allowed to move forward last year involving a 14-year-old girl in Belfast, Northern Ireland, who says a naked picture of her was posted on a “shame page” on Facebook. In the United States, 4 percent of internet users (about 10.4 million people) have been threatened with the nonconsensual posting of explicit images of themselves online or have had it happen, according to a 2016 study by Data & Society. For women under the age of 30, that figure reaches 10 percent.

The new pilot in Australia builds on a set of global initiatives Facebook rolled out earlier this year, which gave users an option to report a picture as a “nude photo of me” and launched a system meant to block further sharing of photos that have been reported and removed. Facebook, to its credit, is working with experts in online civil rights and domestic violence to build these tools.

But if Facebook wants to build a software tool that combats revenge porn without accidentally flagging photos that are in the public domain or of historic significance (as when it wrongly blocked the iconic photo of a naked girl fleeing a napalm attack during the Vietnam War), the tool will probably have to be more complex than this. Perhaps once an algorithm recognizes that an uploaded photo shows a partially clothed body, it should run facial recognition software on the image. If the system detects that the subject may be another Facebook user, it should flag the image for review by a trained professional at Facebook, as sketched below. That might mean imposing a lag on any photos that appear to include nudity, a restriction users would simply have to get used to.
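
To make that proposal concrete, here is a rough sketch of the flow in Python. Every function and field below is a hypothetical placeholder, not anything Facebook has described; a real system would plug in trained classifiers and a human-review queue.

```python
# A rough sketch of the review flow proposed above; all names here are
# hypothetical placeholders, not real Facebook or library APIs.
from dataclasses import dataclass

@dataclass
class UploadDecision:
    publish_immediately: bool
    needs_human_review: bool
    reason: str

def detect_partial_nudity(image_bytes: bytes) -> bool:
    # Placeholder for a nudity classifier (an assumption, not a real model).
    return False

def matches_known_user(image_bytes: bytes) -> bool:
    # Placeholder for facial recognition against user profiles (assumption).
    return False

def review_upload(image_bytes: bytes) -> UploadDecision:
    """Decide whether an upload publishes instantly or waits for review."""
    if not detect_partial_nudity(image_bytes):
        return UploadDecision(True, False, "no nudity detected")
    if matches_known_user(image_bytes):
        # Possibly another Facebook user: escalate to a trained reviewer.
        return UploadDecision(False, True, "possible nonconsensual image")
    # Otherwise hold the photo briefly so context (e.g., a historic image)
    # can be checked before it goes live.
    return UploadDecision(False, True, "nudity detected; delayed for review")
```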

Whatever the solution is, it shouldn’t ask women who are afraid of abuse to make themselves feel even more vulnerable. In theory, Facebook’s experiment here could work. But after dealing with the fallout of learning you’re a victim of revenge porn, the last thing you probably want to do is upload a naked photo of yourself to Facebook.