Despite the ongoing problem of online harassment, most of us go online these days with the reasonable expectation that we can look at baby pictures on Facebook without being surprised by a gruesome photo of a beheading, or search for music videos on YouTube without running into a videotaped rape or animal abuse. It would be nice if this were all because everyone online was behaving themselves, but as Adrian Chen found in a recent piece for Wired, there is an army of laborers who spend their days cleaning up the Internet so the rest of us can use it in relative peace.
"As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem," Chen writes. "Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet’s panoply of jerks, racists, creeps, criminals, and bullies." The solution is to hire workers to spend all day combing over sites that depend on user-generated content to take that stuff down.
The army of comment moderators is huge, "well over 100,000," Hemanshu Nigam, the head of online security firm SSP Blue, told Chen. Most of them work overseas, though many American companies do have squads in the United States to handle stuff that needs a little more cultural context. Chen visited offices in the Philippines, where people are paid a few hundred bucks a month to sift through the garbage people post online.
Sitting at a computer all day seems, at least, to be a better deal than working in a factory for similar pay, but as Chen discovered, the psychological toll of this work is immense. A lot of the job involves taking down pornography, of course, but a lot of it is taking down hateful, sadistic, and terrifying stuff. (There's also a lot of overlap.) "The worst was the gore: brutal street fights, animal torture, suicide bombings, decapitations, and horrific traffic accidents," Chen writes of one American who took the job thinking it would be easy but had to quit because it began to get to him. Most people just aren't capable of looking at torture, rape, and animal abuse all day and shrugging it off. A psychologist who works in the Philippines likened the moderators' problems to PTSD. One woman in particular is "especially haunted" by a video she took down, about half an hour long, that appeared to show a man raping a teenage girl whom she described as "blindfolded, handcuffed, screaming and crying." Seeing that just once would be hard to get over; having to confront that kind of material daily is hard to fathom.
Back in the 1990s, the rise of the Internet created a porn panic, as parents and politicians worried that the new technology would make it all too easy for kids to see naked people having sex. The Communications Decency Act, which strictly regulated the distribution of "indecent" materials online, was passed in 1996 in response to this fear. The law was a drastic overreach, and much of it was soon overturned in the courts.
Since then, the worst fears about porn have come to pass, and anyone can see hardcore porn any time of day from anywhere. But it turns out that ubiquitous porn was the least of our problems. Far more upsetting is the way that the Internet allows people to share "the infinite variety of human depravity," as Chen puts it. The gore, the sexual abuse, the animal torture, the hate speech, the harassment? It all makes watching a video of a consensual sex encounter seem downright wholesome.
Of course, even the consensual porn gets to the moderators after a while, because workers "feel desensitized" watching porn all day, and some suffer sexual dysfunction. But, for good and bad, they still aren't desensitized to the rest of it. "They begin to suspect the worst of people they meet in real life, wondering what secrets their hard drives might hold," Chen writes of the moderators. Some women are so afraid that they refuse to hire babysitters anymore.
Chen brings us inside a world, and introduces us to a job, that most of us didn't know existed. He does not, however, offer any solutions, probably because the problem is massive and defies easy fixes. One thing is certain: Setting aside a group of people to absorb all the psychological damage for the rest of us is not good enough. The Internet has been mainstream for a couple of decades now. It's time to stop treating it like a fantasy landscape, accept that it's an extension of real life, and start imposing on the web some of the controls we use in real life, such as tying a person's online behavior to his or her real identity and having law enforcement intercede when things get out of hand. If we can't bear to do that just yet, then at least let's pay the people tasked with cleaning the place up for the rest of us a lot more money.