Facebook signs agreement saying it won’t let housing advertisers exclude users by race


The agreement with Washington state forces Facebook to make nationwide changes

Illustration by James Bareham / The Verge

Facebook has signed a new, legally binding agreement with the state of Washington under which it will remove advertisers’ ability to exclude races, religions, sexual orientations, and other protected classes in certain ad-targeting sectors. The announcement was made today by Washington Attorney General Bob Ferguson, who spearheaded a 20-month investigation into the social network, which first came under fire for the practice in 2016 and again last year.

“Facebook’s advertising platform allowed unlawful discrimination on the basis of race, sexual orientation, disability and religion,” said Ferguson in a statement. “That’s wrong, illegal, and unfair.” The agreement Facebook has signed gives it 90 days to comply with the required changes. The Washington state attorney general’s office says the changes will be implemented nationwide, while the agreement will be legally binding in the state of Washington.

Facebook has agreed to stop letting some advertisers exclude users based on race

As per the terms of the deal, Ferguson is closing his investigation into whether Facebook’s ad-targeting tools violated Washington’s Consumer Protection Act and its Law Against Discrimination, so long as Facebook complies. Facebook says it removed its “multicultural affinity” category from its exclusion tool back in April, but that this agreement involves taking further steps to protect users from discriminatory practices.

“We appreciate Attorney General Ferguson’s attention to this important matter and are pleased to have reached an agreement with his office,” writes Will Castleberry, Facebook’s vice president of state and local policy, in a statement given to The Verge. “We’ve worked closely with them to address the issues they’ve raised. Discriminatory advertising has no place on our platform, and we’ll continue to improve our ad products so they’re relevant, effective, and safe for everyone.”

Facebook’s ad-targeting tools were first highlighted in an investigative report from nonprofit ProPublica nearly two years ago. The report found that it was relatively easy to exclude certain races, languages, religious affiliations, and other categories from ads for housing, credit, employment, and insurance, despite this being a clear violation of the federal Fair Housing Act. (For other types of advertising, such as ads for films, food, and clothing, this type of targeting is allowed and quite widespread, including in television advertising.)

Facebook initially attempted to dodge claims of wrongdoing by claiming its platform does not categorize users by these characteristics, but rather, it lets users self-identify with “affinities,” which is the company’s euphemism of choice for race, ethnicity, and other identifiers. Despite that, Facebook pledged to police its ad platform more proactively and to use a mix of automated software and human moderation to weed out discriminatory advertising.

A year later, a second ProPublica investigation found that Facebook had done little to actively clean up its ad platform or enforce its new targeting rules around housing and other federally protected categories. The investigation found that it was still easy to get ads approved within minutes that excluded groups like African-Americans, Jews, and Spanish speakers for housing and employment.

The only visible change Facebook seemed to make was to change the phrasing of the targeting tool from “ethnic affinity” to “multicultural affinity,” and to reclassify it as a behavior instead of a demographic. In response to the second report, Facebook said it would temporarily disable advertisers’ ability to use the targeting tools in question. Facebook has since removed the ability to target users for ads based on its multicultural affinity groups across the board. It now includes a warning about discriminatory practices when using the exclusion targeting tool on its ad platform alongside a link to its guidelines.

“Throughout these reports, Ferguson’s office continued to investigate Facebook’s targeting options because changes made to its advertising platform were only temporary and limited in scope,” reads a press release from the Washington State AG’s office. “Investigators for the Attorney General’s Office discovered that advertisers could still exclude people based on several other protected classes, such as sexual orientation, religious affiliation and veteran status. Investigators also found that the problem extended beyond advertisements for housing, employment and credit to those for public accommodations and insurance.”

Now, after having already removed multicultural affinities from its exclusion tool, it sounds like Facebook will remove a number of other identifiers — including veteran and military status, disability status, national origin, and sexual orientation. The changes will impact advertising for employment, housing, credit, and insurance, as per the Washington AG office’s guidelines.

Facebook’s ad platform still allows advertisers to exclude users by age

“According to the assurance of discontinuance, Facebook will fix its advertising platform to remove the unlawful targeting options within 90 days,” reads the press release. “The social network service also will pay the Washington State Attorney General’s Office $90,000 in costs and fees.” Facebook is still facing a lawsuit from civil rights groups alleging the social network enabled housing discrimination.

While Facebook has pledged to make these changes, there are still other aspects of its ad platform that critics take issue with. Peter Romer-Friedman, a lawyer with Outten & Golden LLP and the lead attorney on two age and racial discrimination lawsuits filed against Facebook in the Northern District of California, says the agreement today does nothing to address age discrimination or gender discrimination on Facebook.

Romer-Friedman noted that it’s still easy to discriminate against older users when targeting employment ads. He also said that advertisers on Facebook can still engage in racial discrimination through what’s known as redlining, which, in the context of Facebook, involves excluding certain races and ethnic groups from ads by targeting those ads to distinct ZIP codes.

“Facebook is not doing much more than it already had pledged to do last fall,” he said in an interview. “I think it’s a step in the right direction that the Washington AG got Facebook to commit to do these things and make it legally binding, but this is a very small step and a lot more work has to be done to make Facebook truly non-discriminatory.”

Update 7/24, 5:05PM ET: Added additional information and interview details.