
Facebook’s Broken Promises

The company’s failure to block discriminatory housing ads illustrates why it can’t be trusted to police itself.

Facebook said it had new tools to block housing discrimination on its advertising platform. It wasn’t enough.


Thank goodness for ProPublica. A year ago, the investigative journalism nonprofit showed how Facebook’s advertising tools could be used to exclude black people and other racial groups from the audience for housing-related ads. This kind of racial exclusion in real estate, common in the Jim Crow era, was made illegal in 1968 under the federal Fair Housing Act.

Facebook initially defended itself, claiming that targeting people by “Ethnic Affinity” was not the same as racial discrimination. In February, it reversed course, announcing changes to its advertising policies and new enforcement tools to prevent illegal discrimination on its ad platform. (It also changed the name of its “ethnic affinity” targeting category to the even-more-euphemistic “multicultural affinity.”)

This next part is important, and easy to overlook: Apparently satisfied by Facebook’s pledge to address the problem, the U.S. Department of Housing and Urban Development told ProPublica this week that it had closed its inquiry into housing discrimination on the company’s platform.

Perhaps you already know where this is going. ProPublica this week tested Facebook’s ad platform again, using its targeting tools to exclude African-Americans and several other protected demographics from the audience for promoted posts about rental opportunities. These groups included “mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina, and Spanish speakers,” as well as “people interested in Islam, Sunni Islam, and Shia Islam.” 

These are exactly the types of ads Facebook had pledged to clamp down on. They’re all illegal. And Facebook approved every single one. This, from a company that has insisted to government authorities that it is capable of “self-regulation.”

“This was a failure in our enforcement and we’re disappointed that we fell short of our commitments,” Facebook said in a statement that is beginning to sound rather familiar. The company added, “The rental housing ads purchased by ProPublica should have but did not trigger the extra review and certifications we put in place due to a technical failure.”

That “technical failure,” it seems, was on the part of the machine-learning software that Facebook put in place to try to automatically detect housing ads, so that it could disable potentially discriminatory targeting options. No one familiar with the state of A.I. should be surprised by this. Reliably interpreting the subject matter of text or images is really hard for even the world’s best machine-learning programs. It seems like Facebook threw some code together, called the problem solved, and didn’t bother to see how well it was actually working.
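To see why an automated detector can fail quietly, consider a deliberately simplified sketch. This is my own toy example, not Facebook’s actual system: a bag-of-words text classifier, built with scikit-learn on a handful of made-up ads, that can still miss housing copy phrased outside its tiny training vocabulary. Any real system would be far larger, but the failure mode is the same: without systematic auditing, nobody notices what slips through.

```python
# Hypothetical illustration only: a toy housing-ad detector, not Facebook's system.
# Shows how a simple text classifier trained on a few labeled ads can still
# miss obviously housing-related copy that is worded differently.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set: 1 = housing ad, 0 = not a housing ad.
train_texts = [
    "2BR apartment for rent, hardwood floors, close to downtown",   # housing
    "House for sale, 3 bed 2 bath, open house this Sunday",         # housing
    "Spacious studio available, utilities included, pets welcome",  # housing
    "New sneakers 40% off this weekend only",                       # not housing
    "Join our gym today, first month free",                         # not housing
    "Best pizza in town, order online for delivery",                # not housing
]
train_labels = [1, 1, 1, 0, 0, 0]

# Standard bag-of-words pipeline: TF-IDF features into logistic regression.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(train_texts, train_labels)

# Ads phrased outside the training vocabulary may score low and go unflagged,
# which is exactly the kind of gap only deliberate testing would reveal.
test_ads = [
    "Cozy two-bedroom walkup, move in December, no broker fee",
    "Luxury lofts now leasing, schedule a private tour",
]
for ad in test_ads:
    prob = classifier.predict_proba([ad])[0][1]
    print(f"{prob:.2f} probability of being a housing ad: {ad!r}")
```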

When I asked Facebook how accurate its system is at detecting housing ads, the company declined to comment. A spokesperson said only that Facebook is continually testing and improving such tools. But if that were the case, the company should have known before now how poorly those tools were working. Instead, it once again took ProPublica’s investigative team to reveal a problem that Facebook itself should have discovered, disclosed, and addressed long ago.

So why didn’t it? Why, even after being publicly called out and subjected to a HUD inquiry and pledging in no uncertain terms to fix the problem, did Facebook fail to fix it? Or, if it couldn’t fix it, why didn’t it come forward and admit that the problem was proving tough to crack, rather than allowing the world to think it had been solved?

I don’t think the answer is that Facebook wants to allow discriminatory housing ads on its platform, any more than it wants people selling guns or meddling in foreign elections on its platform. It’s just that fixing these problems requires time, resources, and, yes, manpower—all of which not only cut into Facebook’s profits but run counter to its entire culture and philosophy.

Facebook built its $500 billion business on the premise that all it had to do was build the platform and the software, and its users would do all the hard parts: create content, rate and respond to others’ content, flag what’s inappropriate, create and place ads, and so on. Now society is asking—and expecting—the company to develop expertise and enforcement capacity in things like housing law, election law, fact-checking, and journalistic merit that don’t lend themselves to automation. And the company is acting like it has all of this under control, when it clearly doesn’t, because it fears the consequences if it were to admit the truth.

In short, it fears government intervention in its business, because that would constrain its actions in ways that deeply threaten its “hacker” culture, runaway growth, and revenue. And so its lobbyists push instead for “self-regulation,” in which the government and public agree to trust Facebook to develop and enforce its own policies on matters of public interest.

But Facebook’s recent history makes it increasingly clear that this sort of trust is misplaced. Its tendency is first to deny or downplay a problem, as when CEO Mark Zuckerberg insisted that fake news had “no impact” on the U.S. election, or when the company initially disclosed a mere $100,000 in Russian political spending. Only when compelled by external pressure does the company admit that the problem is larger and more serious and make sincere-sounding pledges to address it. Yet even then, it seems, the company lacks mechanisms to make sure its fixes are actually working—and doesn’t tell anyone when they aren’t.

The question of how to regulate a company like Facebook is a thorny one. But it’s evident by now that the company is incapable of policing itself. It’s no different, in that respect, from most other big corporations—and the sooner we recognize that, the sooner we can get to work on figuring out exactly what sorts of changes we as a society should demand of Facebook, and how to make sure we get them.