Should Google Try To Stop the Spread of Anti-Vaccine Activism?

Jan. 23, 2012, 7:43 AM

Warning: This Site Contains Conspiracy Theories

Does Google have a responsibility to help stop the spread of 9/11 denialism, anti-vaccine activism, and other fringe beliefs? 


Thus, attempts to influence communities that embrace pseudoscience or conspiracy theories by having independent experts or, worse, government workers join them—the much-debated antidote of “cognitive infiltration” proposed by Cass Sunstein (who now heads the Office of Information and Regulatory Affairs in the White House)—won't work. Besides, as the study in the journal Vaccine shows, blogs and forums associated with the anti-vaccination movement are aggressive censors, swiftly deleting any comments that tout the benefits of vaccination.

What to do then? Well, perhaps it's time to accept that many of these communities aren't going to lose core members no matter how much science or evidence is poured on them. Instead, resources should go into thwarting their growth by targeting their potential—rather than existing—members.

Today, anyone who searches for "is global warming real" or "risks of vaccination" or "who caused 9/11?" on Google or Bing is just a few clicks away from joining one such community. Given that censorship of search engines is neither an appealing nor a particularly viable option, what can be done to make users aware that the advice they are likely to encounter may not be backed by science?


The options aren't many. One is to train our browsers to flag information that may be suspicious or disputed. Thus, every time a claim like "vaccination leads to autism" appears in our browser, that sentence would be marked in red—perhaps also accompanied by a pop-up window advising us to check a more authoritative source. The trick here is to come up with a database of disputed claims that itself corresponds to the latest consensus in modern science—a challenging goal that projects like “Dispute Finder” are tackling head-on.
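To make the flagging mechanism concrete, here is a minimal sketch of how a Dispute Finder-style browser script might work. Everything in it is a hypothetical illustration: the claims database, the regular expression, and the reference URL all stand in for a curated source that would track scientific consensus.

```typescript
// Minimal sketch of a Dispute Finder-style content script.
// The claims "database" below is a hypothetical placeholder; a real
// system would sync against a curated, consensus-tracking source.

interface DisputedClaim {
  pattern: RegExp;   // text pattern to flag
  note: string;      // message shown to the reader
  sourceUrl: string; // pointer to a more authoritative source
}

const DISPUTED_CLAIMS: DisputedClaim[] = [
  {
    pattern: /vaccin\w*\s+(causes?|leads?\s+to)\s+autism/i,
    note: "This claim is disputed by the scientific consensus.",
    sourceUrl: "https://www.who.int/", // placeholder reference
  },
];

// Walk the page's text nodes and wrap the first occurrence of each
// disputed claim in a highlighted <mark> element.
function flagDisputedClaims(root: Node): void {
  const walker = document.createTreeWalker(root, NodeFilter.SHOW_TEXT);
  const textNodes: Text[] = [];
  for (let n = walker.nextNode(); n; n = walker.nextNode()) {
    textNodes.push(n as Text);
  }
  for (const node of textNodes) {
    for (const claim of DISPUTED_CLAIMS) {
      const match = node.data.match(claim.pattern);
      if (!match || match.index === undefined) continue;
      const flagged = node.splitText(match.index); // isolate the claim...
      flagged.splitText(match[0].length);          // ...from trailing text
      const mark = document.createElement("mark");
      mark.style.backgroundColor = "#fdd"; // mark the claim in red
      mark.title = `${claim.note} See: ${claim.sourceUrl}`;
      flagged.replaceWith(mark);
      mark.appendChild(flagged);
    }
  }
}

flagDisputedClaims(document.body);
```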

The second—and not necessarily mutually exclusive—option is to nudge search engines to take more responsibility for their index and exercise heavier curatorial control in presenting search results for issues like "global warming" or "vaccination." Google already has a list of search queries that send most traffic to sites that trade in pseudoscience and conspiracy theories; why not treat them differently from normal queries? Thus, whenever users are presented with search results that are likely to send them to sites run by pseudoscientists or conspiracy theorists, Google could simply display a prominent red banner asking users to exercise caution and to consult a previously compiled list of authoritative resources before making up their minds.
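A rough sketch of this query-side check, assuming such a list of risky queries exists, might look like the following. The topics, trigger phrases, and resource URLs are invented for illustration; the point is the shape of the logic, not the actual entries.

```typescript
// Hypothetical sketch of the query-side check: match the incoming query
// against a precompiled list of queries known to funnel traffic to
// pseudoscience sites, and attach a caution banner when one matches.
// All topics, triggers, and URLs below are illustrative placeholders.

interface CautionTopic {
  name: string;
  triggers: string[];             // normalized query phrases
  authoritativeSources: string[]; // previously vetted resources
}

const CAUTION_TOPICS: CautionTopic[] = [
  {
    name: "vaccination",
    triggers: ["risks of vaccination", "vaccines cause autism"],
    authoritativeSources: ["https://www.who.int/", "https://www.cdc.gov/"],
  },
  {
    name: "global warming",
    triggers: ["is global warming real", "climate change hoax"],
    authoritativeSources: ["https://www.ipcc.ch/"],
  },
];

// Lowercase the query and strip punctuation so triggers match loosely.
function normalize(query: string): string {
  return query.toLowerCase().replace(/[^\w\s]/g, "").trim();
}

// Returns banner text to show above the results, or null for normal queries.
function cautionBannerFor(query: string): string | null {
  const q = normalize(query);
  const topic = CAUTION_TOPICS.find((t) =>
    t.triggers.some((trigger) => q.includes(trigger))
  );
  if (!topic) return null;
  return (
    `Caution: results for this query often lead to disputed claims about ` +
    `${topic.name}. Consider these sources first: ` +
    topic.authoritativeSources.join(", ")
  );
}

// Example: cautionBannerFor("Is global warming real?") returns a banner;
// cautionBannerFor("weather in Boston") returns null.
```

Note that in this sketch the banner sits alongside the ranked results rather than reordering or filtering them, which is what keeps the intervention on the nonintrusive side of the line.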

In more than a dozen countries, Google already does something similar for users who search for terms like "ways to die" or "suicidal thoughts," placing a prominent red note urging them to call the National Suicide Prevention Lifeline. It may seem paternalistic, but this is the kind of nonintrusive paternalism that might save lives without interfering with the search results. Of course, such a move might trigger conspiracy theories of its own—e.g., is Google shilling for Big Pharma or for Al Gore?—but this is a risk worth taking if it can help thwart the growth of fringe movements.

Unfortunately, Google's recent embrace of social search, whereby links shared by our friends on Google's own social network suddenly gain prominence in our search results, moves the company in the opposite direction. It's not unreasonable to think that those who deny global warming or the benefits of vaccination are online friends with other denialists. As a result, finding information that contradicts one's views would become even harder. This is one more reason for Google to atone for its sins and ensure that subjects dominated by pseudoscience and conspiracy theories receive a socially responsible, curated treatment.

This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.

Evgeny Morozov is a contributing editor at the New Republic and the author of To Save Everything, Click Here: The Folly of Technological Solutionism.