Instagram has announced that it will now start warning users before disabling their accounts. According to the social media platform, users will receive in-app notifications when their accounts are at risk of being disabled for violating community guidelines on nudity, pornography, bullying, harassment, hate speech, drug sales, and terrorism.
The Facebook-owned platform is well known for disabling accounts without any prior notice. The company has received considerable backlash for this practice, prompting the new policy, which comes shortly after the rollout of its AI-powered anti-bullying features.
Users will also get a chance to appeal deleted content and disabled accounts from within the app instead of having to go through the Help Center.
Instagram also said that in addition to removing accounts with a certain percentage of violating content, it will “remove accounts with a certain number of violations within a window of time,” similar to how policies are enforced by its parent company, Facebook.