Instagram has introduced a new feature called “teen accounts” aimed at enhancing the safety and privacy of users under the age of 18.
Meta, Instagram’s parent company, announced the rollout in a statement on Tuesday, saying that all accounts belonging to users under 18 will be switched to private by default.
This means that only approved followers will be able to see their posts, stories, and other content.
Teen accounts will come with several safeguards to protect young users from potential harm.
For instance, users will only be able to receive messages from people they already follow or have previously connected with, reducing the risk of unwanted contact from strangers.
Additionally, Instagram will limit exposure to “sensitive content,” including violence and videos promoting cosmetic procedures, which may be inappropriate or harmful to minors.
To further protect teenagers, offensive words and phrases in comments and direct messages will be filtered out, creating a safer and more respectful environment.
To help teenagers manage their time on the platform, notifications will prompt them to leave the app after 60 minutes of daily use, encouraging healthier screen habits.
A “sleep mode” feature will also be introduced, which will mute notifications between 10 p.m. and 7 a.m. and automatically reply to messages with a note asking users to reconnect during the day.
This feature aims to promote healthy sleep habits and reduce the impact of screen time on young users’ mental health.
Users under 16 will need parental permission to adjust the default settings, ensuring that parents have control over their child’s online experience.
However, users aged 16 and 17 will be able to modify the settings without approval, giving them more autonomy over their account.
Parents will also gain access to supervision tools that let them monitor their children’s interactions and restrict app usage, allowing them to set boundaries and oversee their child’s activity on the platform.
The teen accounts feature will begin rolling out in the UK, US, Canada, and Australia within 60 days, with other countries expected to follow by January 2025.
Meta also plans to expand this feature to other social media platforms under its ownership in 2025, demonstrating its commitment to protecting young users across its platforms.