Instagram head admits platform has a problem policing self-harm posts

It’ll introduce sensitivity screens this week

Photo by Amelia Holowaty Krales / The Verge

Instagram hasn’t effectively protected users from self-harm and suicide-related content, Adam Mosseri, the head of Instagram, admits in an op-ed today, and he says the company is working to remedy that.

Mosseri writes in The Telegraph that the death of 14-year-old Molly Russell in 2017 moved him and pushed the company to take a deeper look at its screening of self-harm content. Russell died by suicide, and her family says she followed multiple self-harm and suicide accounts on Instagram, which they believe contributed to her death. After hearing Russell’s story, and after UK health secretary Matt Hancock issued a warning to tech giants about their handling of these issues, Mosseri and his team began a “comprehensive review” of how the platform handles self-harm content.

Hancock threatened to use the law to force tech companies to protect children against this content

He says the platform bans posts that promote self-harm or suicide, but that it struggles to independently detect and remove them all. “The bottom line is we do not yet find enough of these images before they’re seen by other people,” he writes. For now, the platform mostly relies on the community to report offending posts, but the company is investing in technology to better identify these images before they reach followers. It’s also working to make them less discoverable.

According to Mosseri, the company has trained engineers and content reviewers to find these posts and has put measures in place to stop related image, hashtag, account, and type-ahead suggestions. Mosseri makes clear that Instagram will still allow posts that discuss mental health struggles and suicide as long as they don’t promote self-harm, but the platform won’t surface those posts through search, hashtags, or the Explore tab.

Additionally, Instagram is applying sensitivity screens to all content the company reviews that involves cutting. Those screens hide the images and require users to tap through to see them.

“We deeply want to get this right and we will do everything we can to make that happen,” Mosseri writes.

If you or anyone you know is considering suicide or is anxious, depressed, upset, or needs to talk, there are people who want to help:

In the US:

Crisis Text Line: Text START to 741741 from anywhere in the USA, at any time, about any type of crisis

The National Suicide Prevention Lifeline: Call or text 988
[Note: As of July 16, 2022, anybody in the U.S. can simply dial 988 to be routed to the National Suicide Prevention Lifeline. The original number, 1-800-273-TALK (8255), will remain available as well.]

The Trevor Project: 1-866-488-7386

Outside the US:

The International Association for Suicide Prevention maintains a directory of suicide hotlines by country.