Last week, the US Department of Housing and Urban Development sued Facebook for allegedly making housing discrimination easy. It claimed that Facebook violated the Fair Housing Act by letting ad buyers target audiences that included or excluded certain races, religions, or genders. The move took many people by surprise, apparently including Facebook. Soon, it could test how long-standing rules against housing discrimination intersect with the sometimes controversial laws covering web platforms and how they apply to the vast, often little-understood advertising networks that help power the internet.
Facebook has skirted these issues before. The company had just reached a settlement in a related lawsuit, agreeing to eliminate several targeting categories. It was also working with HUD to address housing discrimination concerns until talks reportedly broke down over a request for user data. Now, the company will be fighting charges that could result in substantial financial penalties and meaningful changes to the way its ad system operates. If HUD establishes that Facebook violated the Fair Housing Act, it could set the stage for lawsuits against Google and Twitter, both of which are reportedly under scrutiny as well.
This lawsuit is unusual for a variety of reasons, including the fact that HUD often leaves lawsuits up to local agencies, says Bleakley Platt & Schmidt attorney James Glatthaar, who has dealt extensively with housing discrimination cases. (Disclosure: a Vox Media video team member’s father also works at Bleakley Platt & Schmidt.)
Glatthaar says HUD may have taken on the investigation because Facebook is an unusual case. “Something like this is a national practice, whereas most housing discrimination is fairly local,” he says. “The federal government is one of the few entities that can handle something of this scope.” The agency might also want to specifically address a gray area in the law, make a public statement about housing discrimination, or simply make an example of Facebook.
HUD’s suit isn’t the first case involving an online service. Rental service Roommates.com and classified site Craigslist both went to court in 2008 to fight charges that they’d enabled racist or otherwise exclusionary listings. Both companies claimed protection under Section 230 of the Communications Decency Act, which shields web platforms from liability for user posts. Craigslist’s defense succeeded, but Roommates.com was held partially liable because it had offered a survey that included discriminatory questions.
When civil rights groups sued Facebook over housing discrimination last year, Facebook used Section 230 as a defense, saying it had only provided a set of general advertising tools. That argument wasn’t tested in court, however, so we don’t know how courts will treat ad-targeting options like Facebook’s now-defunct “ethnic affinity” selector.
If Facebook defends itself with Section 230 again, this could make for a major test of the rule, which has grown politically controversial in recent years. Over the past year, two court cases — one against Grindr, the other against Yelp — have affirmed Section 230’s safe harbor protections, and a ruling here could help establish just how far they reach. Also, support for the rule among lawmakers is weakening, and a case involving a powerful company like Facebook could spark further debate over it.
HUD is also making some additional claims that could complicate Facebook’s defense. In addition to calling out tools that let advertisers select audience categories, it’s condemning the invisible process Facebook uses to serve ads. “[Facebook’s] ad delivery system prevents advertisers who want to reach a broad audience of users from doing so,” it says, because it’s likely to steer away from “users whom the system determines are unlikely to engage with the ad, even if the advertiser explicitly wants to reach those users.”
HUD doesn’t have to establish that these targeting algorithms are designed to avoid showing ads to certain protected classes. It just has to demonstrate that the system effectively makes housing less accessible to these people — a concept known as disparate impact. “If there is an algorithm that just happens to discriminate against racial minorities or gender minorities or whatever, I think it would still be problematic,” says Glatthaar’s colleague Adam Rodriguez. He compares the move to a zoning restriction whose text and intent is race-neutral but that directly results in fewer black residents, which would likely still be considered discriminatory.
In its defense last year, Facebook claimed that there were no concrete cases of advertisers denying access to real housing options. Groups like the National Fair Housing Alliance and ProPublica had purchased fake ads to prove discrimination was possible, and Facebook later acknowledged that these groups raised “valid concerns” about its practices. But it argued that their evidence only showed “the possibility that some unidentified third parties may use those tools to place real discriminatory ads.”
Rodriguez says Facebook may have felt confident in refusing HUD’s request for data because it believed the agency’s legal case was weak. But if the case goes to trial, the stakes are still high. HUD is requesting unspecified financial damages, plus civil penalties that can reach $50,000 per violation, and it’s not clear how many violations a court might say Facebook accrued, given its massive size. Also, as the trial progresses, Facebook might have to let HUD comb through emails or other documents, potentially producing hard evidence of people being denied access to real housing listings — or just giving HUD (or other critics) access to potentially unflattering information.
For now, Facebook has said it will “continue working with civil rights experts” on improving its advertising practices. It may keep looking for a settlement with HUD, sending this case down the same path as its last housing lawsuit and leaving the big legal questions up in the air. “I don’t think we can predict how this one is going to turn out,” says Glatthaar. “There’s no obvious answer.”