Future Tense

Facebook Was Letting Down Users Years Before Cambridge Analytica

And not even a prescient crackdown by federal regulators stopped it.

Facebook CEO and founder Mark Zuckerberg delivers the commencement address at Harvard's 366th commencement exercises on May 25, 2017, in Cambridge, Massachusetts. Photo illustration by Slate. Photo by Paul Marotta/Getty Images.

It sounds like the stuff of spy novels. A secretive company backed by an eccentric billionaire taps into sensitive data gathered by a University of Cambridge researcher. The company then works to help elect an ultranationalist presidential candidate who admires Russian President Vladimir Putin. Oh, and that Cambridge researcher, Aleksandr Kogan, worked briefly for St. Petersburg State University. And his research was designed to develop ways to psychologically profile and manipulate voters.

Before we go too deep down the rabbit hole, let's be clear: The data Cambridge Analytica gathered to try to target more than 50 million Facebook users in the United States was not stolen from Facebook or extracted through some security flaw or "data breach." The real story is far less dramatic but much more important. It's such an old story that the Federal Trade Commission investigated it and punished Facebook back in 2011.

It's such a deep story that social media researchers have been warning about such exploitative practices since at least 2010, and many of us complained when the Obama campaign in 2012 used the same kinds of data that Cambridge Analytica coveted. The campaign targeted voters and potential supporters using software that ran outside of Facebook. It was a problem then. It's a problem now.

But back in 2012, the Obama story was still one of hope, and his campaign's tech-savvy ways were the subject of gee-whiz admiration. So academic critics' concerns fell on deaf ears. Just as important, Facebook's reputation was at its peak in 2012. The platform's usage kept growing globally, as did the glowing, if misleading, accounts of its potential to improve the world after the 2011 revolution in Egypt.

Between about 2010 and 2015, Facebook was a data-exporting machine. It gave data—not just the profiles of users who agreed to take one of those annoying quizzes that proliferated across the platform, but also records of those users' Facebook friends—to developers who built cute and clever functions onto Facebook. These included games like Mafia Wars, Words with Friends, and FarmVille. If you played one, you unwittingly permitted the export of data about you and your friends to other companies.

In those days, Facebook wanted to maximize both its user base and the time people spent on the site. Thanks in large part to Words with Friends, Facebook succeeded. It now has more than 214 million users in the United States and more than 2.2 billion worldwide.

Until 2015, it was Facebook policy and practice to let application developers tap into sensitive user data, as long as users consented to let those applications use their data. But Facebook never clearly informed users that their friends' data might also flow out of Facebook, or that subsequent parties, like Cambridge Analytica, might get hold of the data and use it however they wished.

The Federal Trade Commission saw this as a problem. In 2011, the agency announced a settlement with Facebook after an investigation revealed that the company had deceived users about how their personal data was being shared and used.

Among other violations of user trust, the commission found that Facebook had promised users that third-party apps like FarmVille would have access only to the information they needed to operate. In fact, the apps could access nearly all of a user's personal data—data the apps didn't need. And while Facebook had long told users they could restrict sharing to limited audiences like "Friends Only," selecting "Friends Only" did not prevent third-party applications from vacuuming up records of their interactions with friends.

The FTC’s conclusions were damning. They should have alarmed Americans—and Congress—that this once-huggable company had lied to them and exploited them.

Through a consent decree with the commission, Facebook was barred from making misrepresentations about the privacy or security of consumers' personal information. It was required to obtain consumers' affirmative express consent before overriding their privacy preferences. And it was required to prevent anyone from accessing a user's material more than 30 days after the user deleted his or her account.

Most importantly, Facebook had to proactively police its application partners and its own products to put user privacy first.

The consent decree put the burden on Facebook to police third parties like Kogan, the Obama campaign, and the makers of FarmVille. Facebook was responsible for making sure fourth parties, like Cambridge Analytica, did not get and use people’s information. We now know how well Facebook lived up to that responsibility.

Facebook shut down this "friends" data-sharing practice in 2015, long after it got in trouble for misleading users but before the 2016 election went into high gear. Not coincidentally, that was when Facebook began embedding consultants inside major campaigns around the world.

For 2016, Facebook would do the voter targeting itself. Now Facebook is the hot new political consultant because it controls all the valuable data about voter preferences and behavior. No one needs Cambridge Analytica or the Obama 2012 app if Facebook will do all the targeting work and do it better.

This is the main reason we should stay steady at the rim of the Cambridge Analytica rabbit hole. Cambridge Analytica sells snake oil. Its "psychometric" voter-targeting systems don't work, and no campaign has embraced them as effective. Cambridge Analytica CEO Alexander Nix even admitted that the Trump campaign did not deploy psychometric profiling. Why would it? It had Facebook to do the dirty work for it. Cambridge Analytica tries to come off as a band of data wizards, but its people are simple street magicians hoping to fool another mark and cash another check.

So now, to hear Facebook officials complain that they were tricked or victimized by Cambridge Analytica is rich. It was Facebook’s responsibility—by law—to prevent application developers from doing just what Kogan and Cambridge Analytica did. Facebook failed us, and not for the first time.

While we focus on Cambridge Analytica's psychometric snake oil and on its ties to Russia and to Trump, we are missing the real story: This massive data exporting was Facebook policy and practice from 2010 to 2015. The problem with Facebook is Facebook.