Meta Platforms has announced the global expansion of its Teen Accounts feature on Facebook and Messenger. The feature initially rolled out last fall in select regions, including the U.S., U.K., Australia, and Canada; the expansion now brings enhanced safety settings and parental controls tailored specifically to teenage users worldwide.
Enhanced Safety Features
Teen Accounts come with a suite of features aimed at safeguarding young users from inappropriate content and unwanted contact. Teens are automatically placed into a restricted experience where content exposure is closely controlled. Messaging is limited so that teens can receive messages only from people they follow or have previously communicated with. Only friends can view and reply to a teen's stories, and tags, mentions, and comments are likewise restricted to a teen's friends or followers.
Additionally, to address concerns about excessive app usage, teens receive a reminder to log off after an hour of use each day, and Quiet Mode activates automatically overnight to enforce digital downtime.
Parental Control and Safety Protocols
For users under 16, parental oversight is a foundational component of the account settings: any change to these protections requires a parent's approval, adding a further layer of security and control. These settings reflect Meta's effort to offer a transparent, collaborative approach to managing teen interactions within its digital ecosystem.
While these accounts are designed with extensive protections, challenges remain. Research released by a whistleblower highlighted teens' potential exposure to harmful content on Instagram, prompting Meta to refine its protective measures. According to the company, it has reduced how often teens encounter distressing material, such as content related to suicide, self-harm, and sexually inappropriate themes.
School Partnership Initiative
Alongside these measures, Meta is fostering collaboration with educational institutions through its School Partnership Program. The initiative invites U.S. middle and high schools to join a network in which educators can escalate safety issues, such as bullying, directly to Instagram for urgent review and removal. Participating schools gain access to reporting and educational resources and receive a distinctive banner signaling their official partnership status.
This broad suite of changes underscores Meta's ongoing commitment to nurturing a safer digital environment for teens. By implementing rigorous protections and establishing meaningful partnerships, Meta strives to address the multifaceted challenges of teen mental health and safety within its platforms.