Facebook is working on a version of Instagram for kids under 13

The Facebook-owned company says it doesn’t have a detailed plan in place yet

Illustration by Alex Castro / The Verge

Head of Instagram Adam Mosseri has confirmed that a version of the popular photo-sharing app for children under 13 is in the works, BuzzFeed News reports. The Facebook-owned company knows a lot of kids want to use Instagram, Mosseri said, but there isn’t a “detailed plan yet,” according to BuzzFeed News.

“But part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control,” Mosseri told BuzzFeed News. “It’s one of the things we’re exploring.” Instagram’s current policy bars children under 13 from the platform.

“Increasingly kids are asking their parents if they can join apps that help them keep up with their friends,” Joe Osborne, a Facebook spokesperson, said in an email to The Verge. “Right now there aren’t many options for parents, so we’re working on building additional products — like we did with Messenger Kids — that are suitable for kids, managed by parents. We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”

BuzzFeed News obtained a message from an internal messaging board where Instagram vice president of product Vishal Shah said a “youth pillar” project has been identified as a priority by the company. Its Community Product Group will focus on privacy and safety issues “to ensure the safest possible experience for teens,” Shah wrote in the post. Mosseri would oversee the project along with vice president Pavni Diwanji, who oversaw YouTube Kids while she was at Google.

Instagram published a blog post earlier this week describing its work to make the platform safe for its youngest users, but made no mention of a new version for kids under 13.

Targeting online products at children under 13 is fraught not only with privacy concerns but with legal issues as well. In September 2019, the Federal Trade Commission fined Google $170 million for tracking children’s viewing histories to serve them ads on YouTube, a violation of the Children’s Online Privacy Protection Act (COPPA). TikTok precursor Musical.ly was fined $5.7 million for violating COPPA in February 2019.

Facebook launched an ad-free version of its Messenger chat platform for kids in 2017, intended for children between the ages of 6 and 12. Children’s health advocates criticized it as harmful and urged CEO Mark Zuckerberg to discontinue it. Then in 2019, a bug in Messenger Kids allowed children to join groups with strangers, leaving thousands of kids in chats with unauthorized users. Facebook quietly closed those unauthorized chats, which it said affected “a small number” of users.

Update March 18th, 7:46PM ET: Added tweet from Adam Mosseri.