Instagram has made some changes to how it handles deleting accounts for things like hate speech, bullying, nudity, and other inappropriate content. Along with the policy update, users who are close to having their accounts removed will now get a warning notification.

Instagram announced the changes today in a blog post noting that they will help the company “quickly detect and remove accounts that repeatedly violate our policies.”

The main change is a new policy under which accounts will be deleted if they rack up a certain number of violations within a specific timeframe. That is in addition to the existing policy, which removes accounts based on the percentage of their content that violates the rules.

“Under our existing policy, we disable accounts that have a certain percentage of violating content. We are now rolling out a new policy where, in addition to removing accounts with a certain percentage of violating content, we will also remove accounts with a certain number of violations within a window of time. Similarly to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram.”

As shown in the image above, Instagram will now send notifications to accounts that are at risk of being deleted, along with the option to appeal the violations.

“We are also introducing a new notification process to help people understand if their account is at risk of being disabled. This notification will also offer the opportunity to appeal content deleted. To start, appeals will be available for content deleted for violations of our nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism policies, but we’ll be expanding appeals in the coming months.”
