Instagram now lets people delete comments in bulk and is testing pinned comments

To make the platform more positive

Instagram is launching and testing multiple new features starting today, all aimed at making the platform a more positive space. The company is rolling out the ability to delete up to 25 comments at once, as well as the ability to block or restrict multiple accounts at the same time. It says that during testing, these features especially helped people with larger followings maintain a “positive environment.”

The company is also giving people more control over who can tag or mention them in a post. From the privacy settings page, people can choose whether everyone, only people they follow, or no one can tag or mention them in a comment, caption, or story. Additionally, Instagram is testing the ability to pin comments to the top of a post. People can pin multiple comments, similar to how YouTube lets creators pin comments at the top of the comments section below videos. This allows creators to highlight positive conversations.

All of these features are likely most useful for people with large followings who might be targeted for abuse by random users, but the company says the tools are also designed to “fight against online bullying,” which can affect anyone, not just influencers with major reach. The company also shared data today about its content removal efforts on both Facebook and Instagram. It says it took action against 1.5 million pieces of Instagram content in both the fourth quarter of 2019 and the first quarter of 2020 because of bullying and harassment violations. Most of that content was reported by users rather than proactively discovered by Facebook.

Bullying prevention has become a focus for Instagram, and it has built multiple features to address the problem. The app now warns users when they’re about to post a “potentially offensive” caption for a photo or video being uploaded to their main feed, and it has started using AI to filter offensive comments and proactively detect bullying in photos and captions. Still, online bullying detection mostly relies on users reporting cases of it. Until Instagram can automatically determine when bullying is happening, people will have to rely on the platform’s moderation tools and reporting mechanisms to keep it safe and positive.