YouTube redirects those who search for terrorist keywords to anti-terrorist content.

YouTube Starts Redirecting People who Search for Certain Keywords to Anti-Terrorist Videos

Future Tense
The Citizen's Guide to the Future
July 21, 2017, 3:34 PM

An advertiser boycott may have driven YouTube’s latest move

On Thursday, YouTube announced a new effort to push back against terrorist recruitment efforts on the site. As the company explained in a blog post, “[W]hen people search for certain keywords on YouTube, we will display a playlist of videos debunking violent extremist recruiting narratives.” Arising out of partnerships with nongovernmental organizations, this new feature is part of a larger project called the Redirect Method, an effort specifically targeted at those vulnerable to ISIS’s messaging.

It’s also part of a larger YouTube strategy, one that Kent Walker, general counsel of Google (YouTube’s corporate parent), laid out last month in a blog post. That announcement came partly in response to an advertiser boycott earlier in the year, one driven by companies frustrated to find their own clips running in front of terrorist videos. In response, as Variety reported at the time, Google claimed that it would “be taking new steps to improve controls advertisers have over where their ads appear on YouTube.”

But as Walker explained in his June post, the company was “pledging to take four additional steps” as it worked to actively combat extremism on its platform: It was stepping up technological identification of terrorist videos, increasing human flagging of such content, more aggressively restricting some videos that don’t directly violate the terms of service, and “expand[ing] its role in counter-radicalisation efforts.” This newly announced redirection strategy seems to be a product of that fourth and final prong.

In framing both the problem and its approach to it, Google is careful to avoid rhetoric that would suggest it intends to engage in censorship. That’s less of a concern in Europe, where courts have found that free speech laws do not protect extremist videos. But tech companies walk a finer line in the United States, “where free speech rules are broader,” as the Verge observes in a post on related efforts to rein in terrorist content.

As it grapples with this potential concern, YouTube appears to be stressing that it stands in opposition to those who would silence others. Note, for example, how Walker opens his blog post with the sentence, “Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all.” If terrorists oppose “open societies,” then any attempt to combat them should be in the service of defending openness, a conceit that grows fuzzy if technology companies are seen to be silencing some of their users.

In this sense, YouTube’s embrace of the redirect method looks like a smart strategy. The company is, as it makes clear, actively removing content that violates its terms of service. But it also gives the impression of a company more focused on drowning out ugly voices than on actively eliminating them. Here, there’s a small but potentially important detail in its announcement: As it moves ahead, YouTube hopes to collaborate “with expert NGOs on developing new video content designed to counter violent extremist messaging at different parts of the radicalization funnel.” Significantly, redirection has the potential to reach those who come looking for terrorist videos, whether or not those videos are present on the site.

All that said, it remains to be seen how effective the redirect method will be.

As the Verge reports, “An earlier pilot of the Redirect Method led to 320,000 individuals viewing ‘over half a million minutes of the 116 videos we selected to refute ISIS’s recruiting themes.’ ” While that’s promising, it may run aground against the ways that terrorists get around YouTube’s existing content restrictions. In a long article on the topic, Motherboard writes, “[I]n order to prevent users from flagging explicit or inflammatory extremist videos, terrorist media groups and disseminators like The Upload Knights and AQ’s As-Sahab Media Foundation often label YouTube videos as ‘unlisted,’ meaning that the videos cannot be searched—only accessed if you are given the link.” If potential recruits are finding extremist material by other means, search redirects may not make that much of a difference.

Future Tense is a partnership of Slate, New America, and Arizona State University.