Facebook’s ad delivery could be inherently discriminatory, researchers say


A new study says that Facebook’s ad delivery algorithm discriminates based on race and gender, even when advertisers are trying to reach a broad audience. The research backs up a similar claim that the US Department of Housing and Urban Development made last week when it sued Facebook for breaking housing discrimination laws. It also expands the scope of an already potentially damning body of research about online advertising and bias, adding new fuel to the push for regulation.

Numerous reports have looked at how advertisers can target ads to exclude certain groups, but this study examines how ads are delivered once they're out of advertisers' hands. Even if an ad is targeted broadly, Facebook will serve it to the audiences most likely to click on it, generalizing from information in users' profiles and their previous behavior. The system builds correlations to find this ideal audience: if techno fans are particularly likely to click on a specific ad for headphones, that ad might be served to other techno fans more often in the future, even if musical taste wasn't an explicit targeting parameter.
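To make the mechanism the researchers describe concrete, here is a minimal sketch of relevance-based delivery in Python. It is not Facebook's actual system; the class, the smoothing, and the notion of a discrete user "segment" are all simplifying assumptions for illustration.

```python
# Illustrative sketch only -- not Facebook's real delivery system.
# The platform tracks per-segment click rates for each ad, then
# favors whichever segment has clicked most so far.

from collections import defaultdict

class RelevanceDelivery:
    def __init__(self):
        # Impressions and clicks observed per (ad, user segment) pair.
        self.impressions = defaultdict(int)
        self.clicks = defaultdict(int)

    def record(self, ad_id, segment, clicked):
        self.impressions[(ad_id, segment)] += 1
        if clicked:
            self.clicks[(ad_id, segment)] += 1

    def score(self, ad_id, segment):
        # Estimated click rate, smoothed so unseen segments start equal.
        n = self.impressions[(ad_id, segment)]
        return (self.clicks[(ad_id, segment)] + 1) / (n + 2)

    def pick_audience(self, ad_id, segments):
        # The ad drifts toward the segment that clicked most -- even if
        # the advertiser never targeted by segment at all.
        return max(segments, key=lambda s: self.score(ad_id, s))
```

The key property is the feedback loop in `pick_audience`: a small early skew in clicks compounds into a large skew in who sees the ad, without any explicit demographic targeting.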

“Previously unknown mechanisms” that control who sees an ad

The paper (which has not yet been peer-reviewed) is a collaboration between Northeastern University, the University of Southern California, and the nonprofit organization Upturn. Its authors tested whether job listings or housing ads with certain keywords or images would be automatically delivered more often to certain groups, exposing what they call "previously unknown mechanisms" that could violate anti-discrimination rules. The researchers spent over $8,500 on ads that they say reached millions of people, linking to actual job-hunting or real estate sites, among other categories. They ran the same campaigns with different ad copy or photos, or with different budgets, checking the demographic breakdowns Facebook provided for each campaign.

Some simple changes turned up dramatic splits. Housing ads with a photograph of a white family, for instance, were apparently served to more white users than the same ad with a black family. (Facebook doesn’t offer analytics directly based on race, so the researchers aimed ads at locations with different racial breakdowns as a proxy.) An ad for lumber industry jobs was shown to an audience that was 90 percent male, while ads for supermarket cashiers reached an 85 percent female audience. And unlike the ads in a well-known ProPublica exposé, these weren’t specifically aimed at men or women. The only difference was in the text and photos.
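As a rough illustration of that proxy method, the estimate below weights each targeted region's known demographic share by the impressions the ad received there. The region names and all the numbers are invented for the example; the researchers' actual regions and census figures differ.

```python
# Hypothetical sketch of the location-as-proxy analysis described above.

# Share of each targeted region's population that is white
# (assumed known, e.g. from census data -- values made up here).
white_share = {"region_a": 0.95, "region_b": 0.05}

# Impressions Facebook reported for the ad in each region (made up).
impressions = {"region_a": 12_000, "region_b": 4_000}

total = sum(impressions.values())
# Impression-weighted estimate of the audience's racial makeup.
est_white = sum(impressions[r] * white_share[r] for r in impressions) / total
print(f"Estimated white share of audience: {est_white:.0%}")
```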

Spending rates also seemingly affected who saw the ad. Facebook ads are placed through a bidding process, so a campaign backed by more money may end up reaching more "valuable" users. In this case, the same ad run on a very cheap campaign reached an audience that was 55 percent male, while the audience for a high-budget campaign was over 55 percent female.
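A toy model makes that dynamic concrete: if competing advertisers bid more for one demographic, a low-budget campaign gets priced out of that group, and its audience skews the other way. Everything here, from the bid ranges to the assumption about which group competitors value more, is hypothetical.

```python
# Toy auction model -- an assumption-laden sketch, not Facebook's
# real auction. Each impression goes to the highest bidder.

import random

def simulate(our_bid, n_users=10_000):
    # Count who actually sees our ad at a given per-impression bid.
    audience = {"female": 0, "male": 0}
    for _ in range(n_users):
        gender = random.choice(["female", "male"])
        # Assumption for illustration: competitors bid more aggressively
        # for female users, making those impressions pricier.
        if gender == "female":
            competing_bid = random.uniform(0.5, 3.0)
        else:
            competing_bid = random.uniform(0.1, 1.0)
        if our_bid > competing_bid:  # we win this impression
            audience[gender] += 1
    return audience

print("cheap campaign:      ", simulate(our_bid=0.8))
print("high-budget campaign:", simulate(our_bid=3.5))
```

Under these made-up numbers, the cheap campaign wins mostly the less-contested male impressions, while the high-budget campaign can afford both groups, which is directionally what the researchers observed.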

HUD's recent lawsuit claimed that by serving ads based on "relevance," Facebook is likely reinforcing social inequalities: if most home buyers in an area are white, for instance, Facebook might only show ads to white users. It was presented as an untested theory, but this research offers significant support for the idea.

“We can’t say exactly how these calculations are done.”

The researchers stress that they still don’t really know why Facebook’s algorithm is making any of these decisions. “We were able to say with confidence from this study that the content of the ad itself matters a lot to the kinds of people that see it. But we can’t say exactly how those calculations are done,” says Aaron Rieke of Upturn.

Reached for comment, Facebook stressed that it was trying to eradicate bias. “We stand against discrimination in any form. We’ve announced important changes to our ad targeting tools and know that this is only a first step. We’ve been looking at our ad delivery system and have engaged industry leaders, academics, and civil rights experts on this very topic — and we’re exploring more changes,” said spokesperson Joe Osborne.

Osborne said that Facebook was actively studying its algorithms, and he noted that Facebook had supported a US House of Representatives resolution on ethical AI development. He also pointed to Facebook’s earlier ad-targeting changes, which include removing categories that ad buyers could use to discriminate as well as building a tool for users to check all housing ads in its system, regardless of what they see in their news feeds.

This study suggests that changing ad-targeting options might not make these listings meaningfully neutral, and Rieke says that a separate ad database wouldn’t go far enough. “It’s certainly a good thing that eventually people will be able to go search all the housing advertisements,” he says. “Even so, I think it matters who Facebook chooses to really push the opportunities in front of.”

Other ad networks might face the same issues

Facebook has argued that Section 230 of the Communications Decency Act shields it from liability for advertising content. But one of the researchers’ major arguments is that Facebook is single-handedly defining these audiences, and advertisers may have little say over how it’s done. “We didn’t say ‘masculine lumberjacks wanted,’” says Rieke. “We took pains to be very clear and neutral in the language of our test advertisements, and we saw these results nonetheless. This is not an issue where advertisers just need to be more careful about the content of their ads.”

So how could Facebook build a system that would hold up under legal scrutiny? It could suspend targeted advertising on posts for jobs or housing, or it could change its targeting system to actively counter bias. It could also shunt these listings to a separate system, like the housing ad database Facebook has promised to build.

For now, we don’t know if this paper will affect HUD’s lawsuit against Facebook; the agency declined to comment, citing restrictions on talking about an active legal dispute. But if the case goes to trial, HUD might seek internal data that would back up the paper’s conclusions.

If a court rules that Facebook’s ad placement algorithm is discriminatory, advertising networks across the web might have to change their practices. The researchers say Facebook’s “walled garden” made it particularly suited for this experiment, but it’s plausible that Google or any other ad platform could display the same biases. “We did not yet measure other advertisers,” says co-author Piotr Sapiezynski. “But we do suspect platforms that try to reach whatever they define as ‘relevant’ audiences might run into this situation.”