Instagram hasn’t effectively protected users from content depicting self-harm and suicide, Adam Mosseri, the head of the company, admits in an op-ed published today, and he says the company is working to remedy that.
Mosseri writes in The Telegraph that the death of 14-year-old Molly Russell in 2017 moved him and pushed the company to take a deeper look at its self-harm content screening. Russell died by suicide, and her family says she followed multiple Instagram accounts devoted to self-harm and suicide, content they believe contributed to her death. After hearing Russell’s story, and after UK health secretary Matt Hancock issued a warning to tech giants about their handling of these issues, Mosseri and his team began a “comprehensive review” of how the platform handles self-harm content.
Hancock threatened to use the law to force tech companies to protect children against this content
He says the platform bans posts that promote self-harm or suicide, but that it struggles to independently detect and police them all. “The bottom line is we do not yet find enough of these images before they’re seen by other people,” he writes. Right now, the platform mostly relies on the community to report offending posts, but the company is investing in technology to better identify these images before they reach followers. It’s also working to make them less discoverable.
According to Mosseri, the company has trained engineers and content reviewers on how to find these posts and has put measures in place to stop related image, hashtag, account, and type-ahead suggestions. Mosseri makes clear that Instagram will still allow posts that discuss mental health struggles and suicide as long as they don’t promote those behaviors, but the platform won’t recommend those posts through search, hashtags, or the Explore tab.
Additionally, Instagram is applying sensitivity screens to all content the company reviews that involves cutting. Those screens hide the images and require users to tap through to see them.
“We deeply want to get this right and we will do everything we can to make that happen,” Mosseri writes.