“Please Do Not Downvote Anyone Who’s Asked for Help”

How Reddit is changing suicide intervention.  

There is a Reddit forum called SuicideWatch that feels a little bit like a digital version of those telephone crisis hotlines—if all the volunteers took off early for the day, leaving the depressed to start answering each other’s calls. On SuicideWatch, an anonymous poster logs in and airs his feelings, like: “I just want to fucking die.” Other anonymous users scroll through the messages, upvote some to the top of the thread, downvote others into obscurity, and weigh in with advice in the comments: “Our bodies aren’t so simple, they are a mess of chemicals and electricity that wreak havoc on our minds. … This shit gets smoothed out with time.” If all goes well, the poster and the commenter weave the thread into a little strand of human connection. “I really hope you’re right,” the poster replied in this case. “I’ve been this way for 7 years (I’m 19 now) and it’s getting harder and harder to snap out of it.”

Five years ago, a software developer named Laura stumbled onto SuicideWatch. She’d experienced suicidal thoughts as a young woman, lost two people close to her to suicide, and had volunteered for more than 15 years on a suicide hotline. So her initial reaction to an online forum dedicated to ad-hoc, largely untrained suicide interventions was: “Oh my God, I have to get this shut down.”

Now, she serves as one of SuicideWatch’s half-dozen-or-so moderators. She checks in on the subreddit two or three times a day—to reply to posters with encouraging words, remove unhelpful comments, field private messages, link curious users with mental health resources, or warn the community about some new troll that’s stalking the page. “Reddit is a terrible place to try to do suicide intervention,” Laura, who asked me not to use her last name because of her role at the hotline, tells me. “The subreddit doesn’t exist because anyone thinks it’s a good idea to do suicide intervention here. It exists because there are people who only feel comfortable here. And they need to talk to somebody.”

The first crisis hotline was born in 1953, when an English vicar named Chad Varah launched the Samaritans, a phone-based counseling service designed to meet people wherever they happened to be when they needed help. A suicidal person in 1953 might not have been capable of reaching a doctor’s office or a confession booth at 3 a.m., but he could usually reach his telephone. Now, it often feels easier to shoot off an email or a text than to pick up the phone and make a call. But while traditional hotlines are starting to integrate digital elements into their services, the bulk of the work is still performed over a telephone line. When you Google “i want to die” in the U.S., the search engine spits back the phone number for the National Suicide Prevention Lifeline (it’s 1-800-273-TALK). The lifeline’s Twitter handle, established in 2009, is @800273TALK. Every couple of days, it tweets a platitude like “Your life matters” to its 30,000 followers, along with a reminder of the number to call if you need help.

It’s not that the suicide hotline doesn’t want to go digital. John Draper, a psychologist and the lifeline’s project director, says that the lifeline began dipping its toes into online chats and text-based services about five years ago, after people in distress began flooding the site with emails. Since then, progress has been steady, but slow: The lifeline is a network made up of 165 local call centers, and of the 134 centers that responded to a recent survey, just 36 reported having a text-based service available, and 45 reported hosting an online chat. Many crisis centers are still struggling to fund their basic phone lines, so they don’t have the cash to expand into spiffy new online services. When I tried chatting from a New York ZIP code last week, I filled out a form, then got plunked into a chat room where I was instructed to wait for a counselor to arrive; when I tried chatting from a ZIP code in rural Kansas, I was just told: “The Chat is currently unavailable. Please try back later.”

So, as with any other in-demand consumer service, tech disruptors are beginning to stake a claim on suicide intervention. In the New Yorker last month, Alice Gregory profiled the Crisis Text Line, the first crisis service to conduct conversations exclusively via text message. The service, launched by the people behind the youth activist campaign organization DoSomething.org (tagline: “makes the world suck less”), is designed to attract teenage digital natives who may be “too scared to make a phone call” but feel comfortable enough to text about their troubles “between classes, while waiting in line for the bus, or before soccer practice.” Nancy Lublin, CTL’s CEO, sees the crisis line as a tech company in the vein of Airbnb or Lyft: think Uber, but for suicide prevention.

Curiously, these emerging digital services have largely failed to connect with the Internet’s most hardcore demo: young men. The Crisis Text Line estimates that 70 percent of its mostly teenage texters are female, based on the first names and pseudonyms they use in conversation. Draper says that women only slightly outnumber men in calls to the lifeline’s traditional phone number, but on its online chats, “the majority of people are teenagers, and they’re mostly girls.” Meanwhile, men in the U.S. are four times as likely as women to die by suicide, and they’re less likely to ask for help before it’s too late.

* * *

A couple of years ago, a mini-controversy erupted on Wizardchan, an online forum for reclusive male virgins. The site’s name is borrowed from a viral joke—the punch line is that when a man reaches age 30 without having sex, he’ll acquire magic powers—but the premise has amassed a base of sincerely dedicated users who go to the forum to comment on boards dedicated to hobbies, random thoughts, anime, and depression. After seeing that some posters in the depression board were discussing suicide plans and self-harm, an administrator pinned a crisis hotline number to the board, encouraging users to pick up the phone if they were truly at risk.

Wizards were offended at the suggestion. This was a bunch of guys who had built a community around their own outsider status, and now some authority figure was stepping in to tell them that they had problems that needed to be fixed in the most conventional way possible: by calling a 1-800 number. The number was eventually removed from the board, and when one user recently suggested that Wizardchan bring it back, another explained, “I think most people here are over normie advice, talking to some random guy who doesn’t understand where you’re coming from on the phone isn’t going to help them.” Another user said that hotline workers are just “not trained to handle the problems we face as [wizards]. It’s kind of sad how society treats us.”

This is how traditional suicide hotlines work: When you have no one you think you can talk to, you key the right combination of numbers into your phone and connect with a sympathetic ear. Online forums that facilitate conversation around depression and suicide work the opposite way: You’ve already found people who seem to understand you. So why not talk about your mental health? Soon after Reddit launched in 2005, early power users started noticing that people were roaming around the site, leaving comments about being suicidal and depressed. So they launched dedicated subreddits to give users the space to share their feelings and connect with one another. Now, nearly 35,000 people are subscribed to SuicideWatch. A related subreddit, or sub, called StopSelfHarm has 4,500 members; MakeMeFeelBetter has 15,000; the depression sub, which shares some moderators with SuicideWatch, has more than 100,000 subscribers. SuicideWatch provides users a link to hotline numbers in a sidebar, but it doesn’t overemphasize the resource. “The percentage of people who are posting here because it has never occurred to them to call a hotline is basically zero,” Laura says.

Reddit’s user base is 65 percent male, and Laura suspects that the posters who flock to SuicideWatch are demographically similar to those of the site as a whole. On the forum, users open up about their wasted sex lives and their lost jobs, their divorces and their unrequited loves, their issues with their bodies and their dads. One SuicideWatch poster expressed guilt over committing a rape; another, distress over being falsely accused. In the wide-open Internet, many of these posts would be placed into politicized camps and denounced as either misogynistic or pathetic, but on SuicideWatch, they’re embraced. After talking about suicide, users click over to a pit bull subreddit to share their expertise, or to a pro wrestling sub to argue over the previous night’s match. 

The Web presences of suicide prevention organizations typically adopt a blandly feminine aesthetic. Tumblr Suicide Watch, a service run by anonymous volunteers who keep an eye on Tumblr users who express interest in suicide or self-harm, is illustrated with a swarm of purple butterflies. The National Suicide Prevention Lifeline encourages friends of depressed people to send out supportive e-cards, like one featuring a photograph of two people forming a heart with their hands. The lifeline also maintains a Tumblr called You Matter, which churns out generic inspirational phrases (“be gentle with yourself, you’re doing the best you can”) printed over pinnable floral imagery or alongside cutesy cartoon birds. Meanwhile, the dark corners of the Web are littered with pro-suicide sites—featuring jokes, games, violent GIFs, and cruel memes that mock victims—that have a decidedly masculine sheen. In a 2012 paper published in New Media & Society, Swedish academic Michael Westerlund argued that pro-suicide websites, with their repetition of “technical, chemical and anatomic details,” their expression of “clearly individualistic ideals,” their “descriptions of morbid violence towards the body,” and their “absence of emotions,” bear the hallmarks of Western masculinity.

Reaching men who have been conditioned to relate to violent GIFs more than purple butterflies has always been a challenge for suicide prevention experts. In 2012, the Colorado Department of Public Health and Environment launched the website ManTherapy.org, a kicky crisis portal hosted by a fictional, mustachioed therapist named Dr. Rich Mahogany, a character created in the mold of two Rons—Swanson (of Parks and Rec) and Burgundy (of Anchorman). This is a mainstream campaign, created in collaboration with a Colorado suicide prevention foundation and a marketing firm and focused on middle-aged men; it probably wouldn’t appeal to a typical wizard. But the social science research that informs the project illuminates a few reasons Web-based communities might be particularly appealing to men in crisis.

The researchers behind Man Therapy note that men experience a particularized stigma in talking about their feelings, so they tend to worry more about whether their problems are “normal” before they’re willing to discuss them openly. The more they “believe other men experience the same problem,” the more likely they are to reach out for assistance. Men in crisis also tend to bristle at the seemingly one-way relationship between depressed people and mental health professionals, and the perception that the “expert” worker maintains “a position of power” over the person seeking help. “For some men,” the researchers found, “receiving help is acceptable only if they can return the favor later on.” Depressed and suicidal men may also be seeking “a chance to assess and ‘fix themselves.’ ” As one male interview subject told the researchers: “Show me how to stitch up my own wound like Rambo.”

* * *

Through SuicideWatch, I met a 28-year-old guy from Ohio who has tried and failed to meaningfully connect with volunteers for traditional crisis hotlines. “I’ll say a few words and invariably hang up,” he told me. “Sometimes I don’t even say anything. I just feel like the other person on the line doesn’t really care, as silly as that may sound, and I don’t want to unload a bunch of crap onto them that is ultimately pointless.” To him, there is also a transactional nature to the exchange that is off-putting: “I often feel like they exist so that people can say ‘Call the suicide hotline’ rather than deal with the person,” he told me. “A lot of the time when I talk to people with professional qualifications, I feel like all I’m getting is a script and the person isn’t interested or invested in me as a person. At least I know the person who is doing this out of the goodness of their heart is genuinely invested in me as a person. Does that make sense?”

It’s typical for depressed people to feel that no one understands them and that no one can help. It’s not just wizards and redditors who feel alone, and suicide hotline volunteers are trained to connect with people of all stripes. But for people who are reluctant to discuss their emotions out loud—whether to a hotline volunteer, a therapist, or a friend—the immediacy of an online forum can lower the barrier to entry.

And on SuicideWatch, users will find more traditional help than they perhaps initially expected. The democratic sheen of Reddit is part of its appeal—upvote the good, downvote the bad—but SuicideWatch’s moderation policy is different. “The problem we have on an online public forum is that most of the members of the general public have the best intentions in the world, but they pretty much do everything wrong, all the time,” Laura says. So SuicideWatch has settled on a moderation regime that’s much stricter than in most other corners of Reddit. People who write in asking for help can say almost whatever they want. But users who reply to requests have to avoid a laundry list of missteps: anything that reeks of “tough love”; guilt-trippy comments like “suicide is selfish”; pro-suicide sentiments or explicit discussion of suicide methods; “religious proselytizing”; advocating for specific therapies or drugs; posing too aggressively as a “role model” for other depressed people to follow; spamming the sub with generic uplifting comments; or, of course, encouraging others to commit suicide. If you hover your cursor over the downvote arrow, a little conscience bubble pops up to plead: “Please do not downvote anyone who’s asked for help.”

When I asked the National Suicide Prevention Lifeline’s Draper what he thinks about amateur communities that perform suicide intervention, he expressed interest in the idea, with a few caveats. “There used to be less of a divide between what we now call ‘experts’ and people who are just trying to help others,” Draper told me. The hotlines themselves constitute an alternative to traditional mental health care; instead of reaching out to a doctor or therapist, you call a trained volunteer. “As experts, we need to be doing a much better job of equipping people with what we call ‘psychological first aid’—the skills to say and do the right things to keep a person in crisis feeling supported, safe, and in the most extreme scenarios, alive.” An online forum can be a good resource, Draper says, as long as it employs moderators with expertise “who can be a part of those conversations—not necessarily leading them, but modeling for people how to respond.” Moderators like Laura—who has deep experience as a crisis hotline volunteer, but also works in IT and was a Reddit early adopter in 2005—serve as a kind of bridge between traditional suicide prevention and fringe Internet culture. “I don’t always disclose my hotline affiliation on the sub,” she says. “Sometimes, I’m just a person.”

In many ways, these online boards are providing a similar service to traditional hotlines. Draper says that only about 25 percent of people who call the lifeline do so to report suicidal thoughts; the rest are phoning to “express emotional distress” or “just find someone to talk to.” Still, the Internet offers up a more diverse array of conversation partners who can serve people of every subgroup. Dawn Brown, the director of the National Alliance on Mental Illness’ HelpLine—a service that connects people with mental health resources—says her organization “places a high value on peer support as one piece of the recovery puzzle.” Kate Mallow, a member of the HelpLine staff, says that online communities “can help you find a peer that you can relate to—and you can’t find that on a suicide hotline.”

SuicideWatch feels unique not just because of its gender mix (women contribute, too, but they don’t appear to dominate) but also because of the balance it strikes between the professional and the amateur. On Wizardchan, anonymous, depressing thoughts—“i’m scared to get a job because i don’t want a co-worker to take a photo of me, or a photo with me in the background, and post it online”—just pile up one after the other, with few constructive responses breaking up the feed. And on Twitter and Facebook—where people in crisis sometimes go to post about feeling down, useless, or even suicidal—messages can just hang there. When a 30-year-old man logged onto Facebook and left a suicide note as a status update in 2009, hours before he killed himself, one of his friends replied:  “Are you dying? or just staying [in] brooklyn? I hope it’s the latter.”

Since social media emerged as a popular platform for airing depressed and suicidal thoughts, tech companies and suicide prevention experts have sought out technological solutions. Samaritans Radar—an app designed to crawl Twitter for potentially suicidal speech, sponsored by the organization that created the original crisis hotline—launched and quickly collapsed last year. Some users complained that the app was mostly catching facetious references to suicide and that jokesters shouldn’t have to deal with concern trolling. Others said that the app would end up shining an uncomfortable spotlight on the people who were sincerely discussing mental health on social media: For people who use social media to reach out when lonely or depressed, getting caught by a robot can end up feeling more alienating than supportive.

Facebook just rolled out a new feature that lies somewhere between suicide bot and concerned friend. If a friend posts something that seems to reference self-harm or severe depression, you can flag the post, and Facebook will shoot off an e-card to the troubled party that reads: “Hi _____, a friend thinks you might be going through something difficult and asked us to look at your recent post.” Facebook then gives the user the option to “Reach out to a friend or helpline worker,” or to “Get tips and support” to “work through this.” The message comes illustrated with a bubbly white question mark inside a Facebook-blue heart, as if to say: “We may not understand you, but we love you.”

* * *

Harry is an 18-year-old British art student who stumbled on SuicideWatch just a couple of weeks ago. For a few years, he’s been using a Dictaphone to record his struggles with depression and self-harm; the device makes the experience of talking to himself “feel a lot closer to counseling” when he can’t actually see a professional face to face, he told me. When he found SuicideWatch, he posted one of his recordings there and quickly heard back from other people who shared similar feelings. “I had regularly felt like I was the only person with my difficulties,” he told me. “Even sharing these experiences with counselors, I felt alienated.” In turn, other Reddit users can feel less alone by listening to Harry’s tape.

Laura gets a lot of messages saying, “I’ve only ever lurked here, but it really helps me.” Some people “end up giving help, and that helps them,” she says. “If you’re standing on the edge, the person who can reach out a hand to you is not standing that far away.” In 2010, University of Alberta psychologists released a study of a community message board for adolescents considering suicide. The board was monitored by adult volunteers but authored primarily by teenagers in crisis. The researchers found that many posters “who began as help-seekers,” writing about their suicidal thoughts, soon “grew into new roles as help-providers,” offering emotional support and guidance to other teenagers. That dynamic, the researchers suggested, could reproduce one central benefit of traditional support groups: Participants who both offer and accept support end up reporting greater well-being than those who just give or take.

There are, however, complications with the depressed leading the depressed. Laura told me that one reason some SuicideWatch posters reject suicide hotlines is that they worry the encounter will result in an involuntary intervention—but perhaps in some instances that intervention is essential. Posting suicidal thoughts on Facebook or Twitter is likely to spark concern not just from strangers, but also from family members and friends, and that might be a good thing. Meanwhile, on a totally anonymous forum, “a user might notice your message and say the right thing, and that may feel very special,” Draper says. “Then again, maybe no one will.” When a post on Reddit gets no comments, the site leaves an ominous automatic message: “there doesn’t seem to be anything here.”

In other cases, these communities risk providing almost too much support. While identifying with a group of like-minded people can be the first step toward recovery, identifying too closely with depression can hinder progress. Draper says that when suicidal people flock to online communities to feel more normal, they’re “creating a risk, because finding a lot of people who feel similar to you is different from saying, ‘I need help.’ ” There haven’t been many studies of suicide-related Internet use, but a review of the existing literature, published last month in the Australian & New Zealand Journal of Psychiatry, found that “informal online suicide communities” often “maintain suicidal feelings rather than lead to recovery.” Even as one studied forum “offered sympathy, acceptance, and encouragement to continue living” to users, it also “maintained, rather than transformed, users’ suicidality.” For a 2012 paper published in the journal Social Psychiatry and Psychiatric Epidemiology, researchers compared messages left by trained volunteers and laypeople on an online support group called the Israeli Association of Emotional First Aid, and found that while laypeople were adept at offering “empathetic and encouraging” responses, volunteers were trained to pair that emotional support with more sophisticated responses designed to spur growth and encourage change in suicidal users.

And just as experts have raised concerns about digital suicide notes proliferating on the Internet, discussions of suicide plans—whether or not they’re actually carried out—can constitute dangerous material. “The research is very clear: That is harmful information. In fact, it can aggravate and spur other people to model the behavior,” Draper says. “This type of media can create contagion in clusters and communities, and so that should be removed and avoided.” The SuicideWatch sub technically prohibits all discussion of specific suicide methods, but deleting a cry for help is easier said than done. “We try really, really hard to not take down anything that comes from anyone who genuinely reached out to get help,” Laura says. Posters sometimes describe their plans in some detail, occasionally while they’re in the process of carrying them out; it’s not unusual for a poster to describe how they plan to attempt suicide, then return to detail how the attempt panned out.

Meanwhile, the public nature of forums can attract others who don’t mean so well. In the early aughts, a Minnesota nurse named William Francis Melchert-Dinkel posed as a twentysomething depressed woman on an early Usenet group dedicated to discussing suicide, then encouraged people in crisis to kill themselves; at least two people did. “Some days when the trolls and haters descend, I wonder if we should even try to keep doing what we’re doing, but that’s not a new thing,” Laura says. “I think anybody who gets involved with suicide intervention should ask themselves at least daily whether they’re doing the right thing and for the right reasons. … So far, it seems to make more sense to keep going than to stop.”

After reviewing the literature, the authors of the Australian & New Zealand Journal of Psychiatry paper concluded that while online suicide communities are “unable to replace professional services in terms of promoting recovery among people who are suicidal,” they “do not necessarily pose a risk to participants.” In fact, they “can offer valuable support by providing users with a place to share problems and feel accepted and understood.”

An online message board is sometimes the most accessible resource to a person in crisis, but it doesn’t have to be the only one. Liam, an 18-year-old from Vancouver, British Columbia, started posting on the Depression subreddit after being diagnosed with depression. While his doctor prescribed him medication, Reddit offered him “the comfort of knowing that I wasn’t alone in my struggles,” he told me. Liam recently posted what he hopes will be his final note on the board: “I am both sad and happy to say that I no longer need this sub,” he wrote. “My life is going great, for the first time ever. I feel alive.”