Anti-social media lawsuits are coming for Roblox and Discord

The suit alleges social platforms are addictive and have failed to keep adults away from children

The Discord logo.
Illustration by Alex Castro / The Verge

Roblox and Discord are among the platforms named in a new lawsuit over alleged harm to children and teens. The suit, which also targets Meta’s Facebook platform and Snap’s Snapchat, alleges that the companies’ services “contain unique product features which are intended to and do encourage addiction, and unlawful content and use of said products, to the detriment of their minor users.”

Filed in California state court, the suit is one of many brought against large social media companies. But comparatively few of these have covered Discord and Roblox, both of which are popular with young users. (Over half of US children were on Roblox as of 2020.) It comes shortly after California Governor Gavin Newsom signed a law requiring sites to change how they treat users under 18 and follows a UK coroner directly blaming social media for a teenager’s suicide, albeit not in a way that carries clear legal consequences.

The Social Media Victims Law Center filed the suit on behalf of a 13-year-old girl identified as S.U., who began using Roblox around age nine. S.U. was allegedly contacted on Roblox by an 18-year-old user who encouraged her to join him on Discord, Instagram, and Snapchat. The suit claims the communication led to a “harmful and problematic dependence” on electronic devices that damaged her mental health, while the 18-year-old encouraged S.U. to drink, send explicit photos, and engage in other harmful behavior. In 2020, S.U. allegedly attempted suicide.

The claims against each platform are different, but some are drawn from familiar sources, including leaked details about Meta’s internal research on how Facebook and Instagram affect teenagers’ self-esteem, as well as numerous reports that underage users can access harmful content. For Discord and Roblox specifically, the complaint singles out the platforms’ alleged failure to stop adults from messaging children without supervision.

“But for Roblox’s marketing to children, representations of safety, and failure to warn of harms known to Roblox and arising from its direct message products and capabilities ... S.U. would not have been exposed to defendant Roblox’s inherently dangerous and defective features,” says the case. “But for Discord’s defective and/or inherently misleading safety features and, independently, its failure to conduct reasonable verification of age, identity, and parental consent, S.U. would not have been exposed to defendant Discord’s inherently dangerous and defective features.”

Like most cases against social networks, the suit seeks to hold the services responsible for defective product design — and in the process, circumvent Section 230 of the Communications Decency Act, which shields sites and apps from liability for user-generated content and communications. An Oregon judge allowed a similar case against Omegle to proceed in July, finding that the service could have done more to prevent adults and minors from contacting each other.

This case and others raise questions about the balance between protecting children and preserving privacy online. The suit takes aim at Discord and Roblox, for instance, for not verifying users’ ages and identities. But doing so with sufficient rigor could require effectively ending online anonymity on major platforms, an issue that has dogged attempts to make porn sites verify users’ ages in the UK. For now, this suit’s future will likely depend on a series of other US legal decisions — including an upcoming Supreme Court case over recommendation algorithms.