Medical Examiner

The Danger of Reading the Comments

People trust supposedly credible online sources even for health decisions.

“This guy says he’s a doctor, so maybe he’s right.”

Photo by Monkey Business Images/Thinkstock.

The best account on Twitter, without question, has just over 40,000 followers and doesn’t tweet anymore. It’s called Don’t Read Comments (@avoidcomments), and in its heyday it periodically tweeted things like “Nobody on their deathbed ever said, ‘I wish I had spent more time reading internet comments’ ” and “If you spent more time applying yourself and less time reading comments, you could afford HBO and wouldn’t have to pirate Game of Thrones.” Those were sage words. With the obvious exception of Slate’s sensible and insightful commenters, most people who write Internet comments are bonehead nimrod thumb-suckers whose collective achievement is to make a chillingly compelling case against the First Amendment.

But despite the yeoman's work of hero tweeters like Don't Read Comments, people do read comments and often take them seriously. A new paper in the Journal of Advertising suggests this reality could have public health implications.

The study, called “Reexamining Health Messages in the Digital Age: A Fresh Look at Source Credibility Effects,” is much more interesting than its title might suggest. Authors Ioannis Kareklas, Darrel Muehling, and T.J. Weber looked at how online commenters influenced people’s responses to vaccination PSAs. Their conclusion? Commenters perceived as credible “are instrumental in influencing consumers’ responses.”

To come to this determination, they performed two experiments. In the first, they showed participants one of two made-up public service announcements: One made the case for vaccination and was presented as coming from the Centers for Disease Control and Prevention; the other argued against vaccination and was attributed to the anti-vax group National Vaccine Information Center. The PSAs were followed by comments that either agreed or disagreed with the PSAs' content. Participants weren't told anything about who the commenters were. After looking at the PSAs and comments, participants filled out questionnaires measuring how likely they were to vaccinate themselves and their family members, as well as their opinions on vaccination. Based on the data, the researchers concluded that both the PSAs and the comments influenced people's attitudes toward vaccination. Let me restate that: Some people read anonymous Internet comments and, as a result, changed how they felt about vaccines. Yes.

In the second study, the researchers told participants who the commenters supposedly were: One was a student of English literature, one was a health care lobbyist, and one was a medical doctor. The researchers identified which commenter the participants found most credible (the doctor, unsurprisingly) and then found that participants considered the doctor's comments to be even more credible than the PSAs themselves.

“[W]hen both the sponsor of the PSA and the relevant expertise of the online commenters were identified, the impact of these comments on participants’ attitudes and behavioral intentions was greater than the impact of the PSA and its associated credibility,” they wrote.

In other words, no matter how credible the PSA’s sponsor was, study participants gave more credence to the credible Internet commenters.

Now, all of that doesn't necessarily mean you have to give up on modern civilization; in the best-case scenario, well-informed doctors can amplify important public health messages. But when a doctor (or someone claiming to be a doctor) contradicts science-based PSAs, it's bad news. The study provides some valuable insight into why the anti-vax movement has been so lethally tenacious and successful. As the paper points out, researchers have long known that people take word-of-mouth communications, both electronic and in person, more seriously than they do advertisements. In fact, Popular Science magazine recently stopped allowing comments, citing a University of Wisconsin study showing that such comments can lead naive readers to think that settled science is still up for debate.

Jonah Berger, a marketing professor at the University of Pennsylvania's Wharton School and the author of Contagious: Why Things Catch On, said the new research could have significant implications for public health messaging.

“We’ve known for a long time that people trust word-of-mouth more than they trust advertisements,” said Berger. “What I think is new about this paper is it extends those ideas to public service announcements.”

Just as people often give more credence to Yelp reviews than to restaurant ads, they may put more trust in Internet commenters they perceive to be credible than in public service announcements (or, in the case of the National Vaccine Information Center, public disservice announcements).

So what’s a public service announcer to do when competing with cyberspace’s hordes? For starters, said Berger, don’t ban comments. People are going to find ways to opine, and trying to keep them from doing so is a Sisyphean task. Plus, announcers who ban comments are missing an opportunity, as good comments can boost a PSA’s impact. So instead of preventing responses, groups like the CDC should look for ways to encourage their supporters to post positive responses. In their conclusion, the paper’s authors write that announcers should make sure that “supportive comments are abundant and easily accessible” and should try to foster “credible online exchanges” in comment sections.

For better or for worse, people believe what they read on the Internet. That’s given anti-vaxxers a strategic advantage, as they can hijack legitimate discussions about vaccines with pseudoscience and conspiracy theories. But it also means that public health advocates can use anti-vaxxers’ tactics against them.