Instagram is strengthening its moderation policies today and adding a new alert that will warn people who violate rules when their account is close to being deleted.
The alert will show users a history of the posts, comments, and stories that Instagram has had to remove from their account, as well as why they were removed. “If you post something that goes against our guidelines again, your account may be deleted,” the page reads.
Instagram will give users a chance to appeal its moderation decisions directly through the alert, rather than making them go through its help page on the web. Only certain types of content can be appealed at first (such as posts removed for nudity or hate speech), and Instagram plans to expand the appealable content types over time.
The change should clarify for users why they’re in trouble and remove the shock of suddenly finding that their account has vanished. While it’s likely that many banned accounts are removed for obvious rule violations, Instagram, like its parent company Facebook, has regularly run into moderation problems around nudity and sexuality, with users having photos of breastfeeding or period blood removed. This update won’t prevent those mistakes (those types of photos are supposed to be allowed), but it should make appealing the decisions easier.
In addition to the new alert, Instagram is also giving its moderation team more leeway to ban bad actors. Instagram’s policy has been to ban users who post “a certain percentage of violating content,” but it’ll now also ban people who repeatedly violate its policies within a window of time. The specifics remain as vague as ever, since Instagram doesn’t want to offer details that would let bad actors game the system, but it sounds like the change could mean fewer problematic accounts slipping through on a technicality.