Facebook can be sued over allegations that its advertising algorithm is discriminatory, a California state court of appeals ruled last week. The decision stems from a class action lawsuit filed against Facebook in 2020, which accused the company of not showing insurance ads to women and older people in violation of civil rights laws.
The case centers around Samantha Liapes, a 48-year-old woman who turned to Facebook to find an insurance provider. The lawsuit alleges that Facebook’s ad delivery system didn’t show Liapes ads for insurance due to her age and gender.
In a September 21st ruling, the appeals court reversed a previous decision that said Section 230 (which protects online platforms from legal liability if users post illegal content) shields Facebook from accountability. The appeals court concluded that the case “adequately” alleges that Facebook “knew insurance advertisers intentionally targeted its ads based on users’ age and gender” in violation of the Unruh Civil Rights Act.
It also found significant similarities between Facebook’s ad platform and Roommates.com, a service that lost Section 230 protection because its drop-down menus required users to select options that enabled discrimination. “There is little difference with Facebook’s ad tools” and their targeting capabilities, the court concluded. “Facebook does not merely proliferate and disseminate content as a publisher ... it creates, shapes, or develops content” with the tools.
Facebook’s ad algorithm has faced scrutiny for years, with a federal lawsuit filed in 2018 accusing the company of enabling housing discrimination and subsequent studies backing up those claims. Facebook settled with the US government in 2022 and launched a new ad distribution system earlier this year to address housing discrimination.