Meta Platforms has announced the expansion of its “Teen Accounts” feature to Facebook and Messenger, a move aimed at enhancing the safety and privacy of young users. The rollout is part of the company’s ongoing effort to address concerns over the risks teens face online, amid a regulatory push for greater protections.
Originally introduced on Instagram in 2024, the Teen Accounts feature now brings enhanced privacy settings and parental controls to Facebook and Messenger. These tools are designed to give parents more oversight of their children’s online activity while offering teens more control over who can contact them and what content they see. The new protections include restricted direct messaging, limited visibility of teen profiles, and filtering of harmful or inappropriate content based on age and interests.
The expansion comes as U.S. lawmakers ramp up pressure on social media companies to take more responsibility for the safety of young users. The Kids Online Safety Act (KOSA) is one example of proposed legislation that would impose stricter requirements on how tech companies protect children and teenagers. The bill seeks to ensure that social media platforms create age-appropriate environments and shield young users from harmful content and potential exploitation.
Meta’s move to strengthen safety features responds to this increasing scrutiny. While the company has faced criticism for not doing enough to protect young users, the expansion signals a commitment to meeting regulatory demands and fostering safer digital spaces. By giving both parents and teens more control, Meta hopes to mitigate the risks teens face on social media.