The popular social media platform Instagram announced that it will block and permanently delete the accounts of users who send hate speech, including racist and anti-Semitic statements, in direct messages.

"The abuse we're seeing is happening a lot in people's Direct Messages (DMs), which is harder to address than comments on Instagram. Because DMs are for private conversations, we don't use technology to proactively detect content like hate speech or bullying the same way we do in other places. But there are still more steps we can take to help prevent this type of behavior. So today we're announcing some new measures, including removing the accounts of people who send abusive messages, and developing new controls to help reduce the abuse people see in their DMs," reads Instagram's blog post announcing the changes.

Instagram reported that from July to September 2020 the platform received more than 6.5 million complaints about hate speech in direct messages.

One of Instagram's existing features lets users filter comments so that words, phrases, or emojis they don't want to see never appear on their page.

Unlike comments under posts, moderating DMs is far more challenging: content there is reviewed only after a user files a complaint.

Previously, users were only temporarily blocked for sending insults in direct messages; now their accounts will be deleted. Instagram will also take steps to prevent violators from registering new accounts.

Going forward, Instagram plans to cooperate with UK law enforcement authorities on hate speech cases.