Future Tense

What Was Russia Up To?

Here’s what we know about how Russia tried to use Facebook, Google, and Twitter to sway the 2016 election.

Almost 130,000 people on Facebook liked a meme featuring Donald Trump dressed as Santa Claus in the Oval Office. Posted in December of last year, the image carried text asking Facebook users to “Remember about this stupid PC idea to forbid people from saying ‘Merry Christmas’, instead forcing them saying ‘happy holidays’.” The post, which came from the Facebook page of Secured Borders, continued, “Thankfully this PC absurd soon will be over.” The grammar is all off, likely because the person who posted it isn’t a native English speaker. Secured Borders, according to the New York Times, is believed to be the work of a Russia-linked group that used Facebook to broadcast anti-immigrant, pro-Trump commentary in the lead-up to the election and after Donald Trump won.

Facebook, Google, and Twitter have all admitted that Russian government-linked groups used their services to give Trump a boost in the 2016 presidential election. All three of the implicated companies have been invited to testify at a public hearing before the Senate Intelligence Committee on Nov. 1. Expect to see executives grilled by senators looking to learn exactly how Kremlin-linked content proliferated on their platforms, what their companies were aware of, and why they didn’t do more to stop it.

In the meantime, thanks in large part to leaks, we’re getting a better idea of what Russia’s interference actually looked like. But it can be difficult to keep the stories—which seem to come out every day—straight. Here’s what we know so far.

Facebook

Facebook is by far the social network that has gotten the most attention for Russian activity in the lead-up to the election. The company itself has revealed that roughly 470 Kremlin-backed groups and accounts spent some $100,000 on 3,000 political ads on Facebook, which reached roughly 10 million people. Facebook says that 50 percent of those ads cost the advertiser less than $3, and that for 99 percent of them, Facebook was paid less than $1,000.

But according to new research from Jonathan Albright, research director of the Tow Center for Digital Journalism at Columbia University, Russian trolls may have reached far more than 10 million Americans in the run-up to the election. Facebook’s figure counted only people who saw paid ads. But Russian-backed pages also published free posts and then paid to boost them, which isn’t technically considered an ad by the Federal Election Commission’s standards. Albright found that posts from just six of the Russian-backed accounts had been shared about 340 million times. Those posts came from pages that masqueraded as American political interest groups: Blacktivists, United Muslims of America, Being Patriotic, Heart of Texas, Secured Borders, and LGBT United, the last of which fronted as a gay rights activist page.

The actual posts from the Russian-linked accounts showed an acute understanding of the far edges of America’s deeply polarized political landscape, a divide they clearly sought to widen and exploit. Some posts supported the Black Lives Matter movement, while others suggested racial justice activists were the cause of political unrest in the country. Others were aimed at veterans and people in the military. Some supported Jill Stein and Bernie Sanders. One of the Russian-bought ads, the Washington Post reported, included photos of a black woman pulling the trigger of an unloaded gun, which could have been intended to sow fear among white Facebook users.

These Russian-backed pages and posts were often rife with grammatical errors, like an anti-gun control post reported by the New York Times featuring a young woman holding a rifle that read: “Why do I have a gun? Because it’s easier for my family to get me out of jail than out of cemetery.” Apparently, dropping the article “a” before a noun is a common mistake for Russian speakers. Russian operators also frequently created fake accounts pretending to be Americans, complete with borrowed profile pictures, photos of kids, and fake names, and used them to post news articles that reflected Russia’s political interests.

There were also event pages. The Daily Beast reported that Russian Facebook accounts attempted to organize more than a dozen pro-Trump rallies in Florida during election season; Trump ultimately won the swing state by 1.2 percentage points. Dozens of real people in the U.S. actually showed up to these Russian-organized events.

Twitter

Prior to the election, a team from Oxford University’s Project on Computational Propaganda found that pro-Trump bots (that is, automated accounts) tweeted with debate-related hashtags seven times more than pro-Clinton bots during the third presidential debate. At that time, Russia was not implicated.

But in a closed meeting between Twitter and the Senate Intelligence Committee on Sept. 28, the company shared that 22 of the 470 Russian propagandist Facebook accounts had corresponding accounts on Twitter. Another 179 Twitter accounts appeared linked or related to the Russians, according to a blog post the company published after its Senate briefing. Twitter also shared ad-buy information for three accounts associated with the Russian government–backed news outlet RT, which apparently spent $274,100 on U.S. ads in 2016. According to Twitter, most of the RT ads were “directed at followers of mainstream media and primarily promoted RT Tweets regarding news stories.” But really, there’s nothing surprising about a news outlet, Kremlin-supported or otherwise, spending money to advertise on Twitter.

Sen. Mark Warner of Virginia, a top Democrat on the committee, was unimpressed. In a press conference following the testimony, Warner called the information Twitter shared “inadequate” and “deeply disappointing,” since the company basically just cribbed from information Facebook had already provided. While Twitter only shared information on 201 accounts linked to Russian propagandists, researchers and academics have found hundreds more, including a sampling of 600 accounts monitored by the Alliance for Securing Democracy, a project of the German Marshall Fund that tracks efforts to undermine democratic governments.

Russian bots on Twitter often fired off multiple tweets a minute, a rate that at one point caused #HillaryDown, an anti-Clinton hashtag, to trend, security researchers from the firm FireEye told the New York Times. Russians also used Twitter to target Americans in key electoral swing states, such as Michigan, Pennsylvania, and Florida, according to another study by the team at Oxford.

Google

Google has been the quietest of the three companies primarily implicated in the Russian spread of misinformation. But on Monday, the New York Times reported that the company found nearly $5,000 worth of ads believed to be connected to the Russian government, plus another $53,000 worth of ads on political issues that were bought from Russian internet addresses or physical addresses, or paid for with rubles. There’s no clear Russian government connection to that second set, according to the Times. One of the accounts reportedly spent $36,000 on Google ads that focused on whether President Obama should resign. Another set of ads promoted the film You’ve Been Trumped, a documentary critical of the now-president’s effort to build a golf course in Scotland on an environmentally protected site.

Russian news outlets also gained popularity on YouTube. There, Bloomberg reported Oct. 3, Russia Today and RT America, combined, ranked as the second most popular channel in YouTube’s news section. Videos Americans posted on YouTube were sometimes recycled on Russian-linked Facebook accounts, according to the New York Times.

Facebook and Twitter have confirmed that they both plan to attend the Nov. 1 hearing on Capitol Hill. Google has not. Yet Google is the largest online ad company in the world, period. If Russian-linked propagandists were going to try to reach as many Americans as possible in an effort to elect Trump and sway the U.S. democratic process, it’s hard to imagine a situation where Google and YouTube weren’t used regularly.

And the rub with all of this is that these platforms worked exactly as they were intended to work—allowing marketers to microtarget users based on their previous internet activities and flood people’s streams with engaging content. This system has led Facebook and Google to become two of the most powerful companies on the planet, and disrupting the flow of that cash won’t be easy for Congress.

Read more in Slate about Russia’s 2016 election meddling.

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.