Instagram is rolling out a new feature aimed at teenagers to promote healthy screen-time habits. "Nighttime Nudges" will automatically appear for teens who have spent more than 10 minutes on Instagram late at night, particularly in sections like Reels or direct messages. The notification gently suggests, "Time for a break? It's getting late. Consider closing Instagram for the night." Notably, teenagers won't have the option to disable these nudges.

This move is part of Instagram's ongoing efforts to help younger users manage their time on the platform. The company says that promoting better sleep habits is crucial for teens. While Instagram has not clarified whether the feature will be enabled for all teenagers or only those under 18, the intention is to encourage teens who do not use time-management tools to log off the app at night.

Instagram already offers features like "Take a Break" reminders and "Quiet Mode" to help users and parents manage screen time. These tools let teens mute notifications, automatically reply to messages, and inform friends and followers that they are unavailable, encouraging a healthier relationship with the app.

Instagram Will Restrict Direct Messaging Between Adults and Teens
The platform will also limit interactions between teenagers and adults: according to the company, adult users will soon be unable to send direct messages to teens who do not follow them.

Instagram's New Feature to Protect Users from Unwanted Image and Video DMs
Instagram believes these changes will prevent users from receiving unwanted visual content from people they don't follow and curb repeated messages from strangers.

The introduction of "Nighttime Nudges" comes amid growing concerns about the impact of social media on teenagers' mental health and well-being. Meta, Instagram's parent company, has faced increasing scrutiny and regulatory pressure regarding child safety and content moderation on its platforms. Recently, Meta announced plans to automatically restrict certain types of content for teen users, including posts related to self-harm, graphic violence, and eating disorders.

These changes align with a broader industry trend of social media companies implementing features and controls to promote safer, more responsible usage, particularly among younger users.