To protect teens, YouTube’s limiting some video recommendations

YouTube’s new product updates come a week after Meta was sued for contributing to a youth mental health crisis.

Illustration by Alex Castro / The Verge

Starting November 2nd, YouTube will impose restrictions on how often teens receive repeated video recommendations related to sensitive topics like body image, the company announced on Thursday.

YouTube says the new safeguards are the result of its partnership with the Youth and Families Advisory Committee, which consists of psychologists, researchers, and other experts in child development, children’s media, and digital learning. For years, the committee has advised YouTube on the potentially harmful mental health effects repeated exposure to certain content online can have on teenagers.

“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages—and those messages can impact how some teens see themselves,” Allison Briscoe-Smith — a clinician, researcher, and Youth and Families Advisory Committee member — explains in a press release. “Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and size up how they want to show up in the world.”

YouTube worked with the advisory committee to identify categories of videos that could potentially pose a problem if viewed repetitively. Now, teen viewers will no longer receive repeated video recommendations for content that “compares physical features and idealizes some types over others, idealizes specific fitness levels or body weights, or displays social aggression in the form of non-contact fights and intimidation.”

YouTube also announced other product updates related to teen well-being, including more frequent and noticeable “take a break” and bedtime reminders. YouTube has also turned its crisis resource panel, which connects users searching for queries like “eating disorders” with live support from crisis service partners, into a full-page experience. The panels will now feature more visually prominent resources for third-party crisis hotlines and will redirect some search queries toward suggestions for topics like “self-compassion” or “grounding exercises” instead.

In addition, YouTube says it’s working with the World Health Organization (WHO) and Common Sense Networks to develop educational resources for parents and teens. The resources will include guidance on how to create videos online safely and with empathy, as well as how to respond to comments and more.

By rolling out these updates now, YouTube may be trying to protect itself after dozens of states sued fellow social network Meta last week for contributing to a youth mental health crisis. In the complaint, the states accuse Meta of knowingly rolling out features that promote harmful behaviors, including failing to remove content related to disordered eating and bullying.

Meta isn’t the only social network to face legal trouble this year, either. In June, a Maryland school district sued Meta, as well as Google, Snap, and TikTok owner ByteDance, for allegedly contributing to a “mental health crisis” among students.

“Over the past decade, Defendants have relentlessly pursued a strategy of growth-at-all costs, recklessly ignoring the impact of their products on children’s mental and physical health,” the lawsuit states. “In a race to corner the ‘valuable but untapped’ market of tween and teen users, each Defendant designed product features to promote repetitive, uncontrollable use by kids.”

YouTube will begin limiting repeated video recommendations to teens in the US starting on November 2nd before expanding to other countries in 2024.