Moneybox

Facebook Is Defending Its “Ethnic Affinity” Targeting in Housing Ads

Facebook’s ad targeting tools let you exclude people based on their “ethnic affinity”

David Ramos/Getty Images

ProPublica on Friday highlighted a feature of Facebook’s advertising service that is not only unsettling but possibly illegal. Anyone who buys an ad on the social platform can then refine the audience that sees it, picking from an array of targeting options. Among these is the option to exclude people based on their “ethnic affinity.” So, for instance, you could make sure the ad is not seen by people whose ethnic affinity is black or Hispanic.

This feature isn’t new: It generated some debate earlier this year when Universal Studios revealed, during a South by Southwest marketing panel, that it had developed different Straight Outta Compton trailers for black and white audiences. Facebook has defended the tool as “an opportunity to serve highly relevant ad content to affinity-based audiences.”

As ProPublica points out, however, the feature may run afoul of federal law when it’s applied to ads for housing. The Fair Housing Act of 1968 prohibits, among other things, any housing advertisement that indicates race-based preference or discrimination. Thanks to the Civil Rights Act of 1964, it’s also illegal to discriminate in ads related to employment.

When ProPublica showed Facebook’s demographic exclusion options to civil rights lawyer John Relman, he reportedly gasped and called them “horrifying,” “massively illegal,” and “about as blatant a violation of the federal Fair Housing Act as one can find.”

Facebook’s defense of this tool—as articulated in a public post by its head of U.S. multicultural sales, Christian Martinez—is twofold.

First, Martinez says that the company’s policies prohibit discriminatory advertisements and that, if it learns of any such ads, it will take “aggressive enforcement action” against them. That might seem a little disingenuous, given that the company’s own tools seem to facilitate, if not encourage, such discrimination. But Martinez says there are legitimate uses for what’s known in the ad industry as “exclusion targeting.” For instance, he says, it could be used to prevent members of a certain affinity group from seeing the general-interest version of an ad for which a more culturally specific version exists. Another Facebook spokesman suggested to ProPublica that it could be used for A/B testing.

Second, Facebook insists that excluding people based on ethnic affinity does not equate to discriminating based on race. Facebook does not ask users their race, so it doesn’t know for sure what ethnic group they belong to. Nor does it really know individuals’ ethnic affinities: Instead, it infers them from behavioral signals, such as the pages they’ve liked that are associated with certain demographic groups. That means Facebook’s “African-American” ethnic affinity group could include people of other races.

Neither of these defenses seems likely to hold up to close scrutiny.

Any benefits advertisers or consumers might derive from ads that exclude certain “ethnic affinity” groups would seem to be outweighed by their potential societal harm, at least in realms such as housing and employment. And Facebook’s suggestion that it polices its ads to make sure they’re not being “misused” is undermined by the fact that ProPublica’s reporters were able to purchase a housing ad that excluded minority affinity groups—and Facebook approved it within 15 minutes.

Even if ethnic affinity is not identical to race, it’s obviously a proxy for it. Martinez partly dismissed the problem by saying targeted ads are relevant only to people who “choose to affiliate” with ethnic communities. But as Fusion points out, while you can see the ethnic affinity Facebook has assigned you by visiting your Ad Preferences page, you can’t change it. All you can do is opt out of such targeting altogether.

I doubt Facebook is actually trying to facilitate discriminatory ads, for housing or anything else. Rather, this seems like just the sort of tool that would be built by naïve, well-meaning Silicon Valley engineers who lack a deep appreciation for either the law or American history. As Shane Ferro put it:

One software engineer’s response only reinforced her point:

To be fair, Skvorc, the engineer in question, does not work at Facebook. But while the defenses from the company’s own spokespeople are more politically correct, they aren’t much more convincing. The right response from Facebook to ProPublica’s article would probably be something more like: “You’re right, that’s bad. We didn’t mean to build a tool that could be used that way. We’ll fix it ASAP.” Let’s hope we get that version soon.

Meanwhile, here’s one more good question for those at Facebook who see no problem with this tool: