Instagram is introducing a set of enhanced privacy features and parental controls for teenagers. The changes, which apply to users aged 13 to 15, promise “built-in protections” designed to create a safer environment on the platform while giving parents greater oversight.
Social media platforms, including Instagram, have faced mounting global pressure to safeguard young users from harmful content. While Meta, Instagram’s parent company, has introduced these changes as part of a broader safety initiative, critics argue that responsibility should remain squarely with the company rather than be shifted onto parents and teenagers.
Key Features of Instagram’s New Teen Accounts
The new teen accounts come with a host of default settings intended to make Instagram a more secure space for teenagers. Most notably, teen accounts will be private by default, ensuring that their posts are viewable only by approved followers. This change is aimed at preventing unwanted contact from strangers and exposure to inappropriate content.
Additionally, teenagers will need to manually approve any new followers, further ensuring that only people they trust can view their content. These privacy settings remain locked in place until the teenager turns 16, unless a supervising parent or guardian approves a change.
Parents who choose to supervise their child’s Instagram activity will gain several oversight tools. They will be able to see who their child is messaging and the topics they are interested in, though the content of private messages will remain off-limits. Instagram says it aims to strike a balance between parental supervision and teenagers’ privacy.
Addressing Safety Concerns
While these changes have been welcomed by some, they come amid rising concerns about the effectiveness of social media platforms in protecting young users from harmful content. The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) has called the new measures a “step in the right direction” but warns that more must be done. Rani Govender, NSPCC’s online child safety policy manager, emphasized that Meta needs to implement more proactive measures to prevent harmful content from appearing on Instagram in the first place.
Meta, however, maintains that these updates are part of its ongoing efforts to create a safer experience for teens. The company has described the new teen accounts as a “guided experience” for both young users and their parents, aimed at providing additional peace of mind.
Instagram plans to roll out these changes gradually, beginning in the UK, US, Canada, and Australia. The European Union is set to follow later in the year, and the platform will begin transitioning millions of current teenage users to these new settings within 60 days of notifying them.
Parental Responsibility and Concerns
Although Instagram is equipping parents with more tools to supervise their children’s activity, concerns remain about the level of engagement from parents. According to a report by Ofcom, the UK’s communications regulator, many parents are hesitant to intervene in their children’s online activities, despite available controls. Meta’s senior executive, Sir Nick Clegg, highlighted this issue in a recent talk, stating that even when parental controls are in place, parents often fail to use them effectively.
Further skepticism stems from the tragic case of Molly Russell, a 14-year-old who took her own life after viewing harmful content on Instagram. Her father, Ian Russell, expressed concerns that while Meta is skilled at making announcements, the real test will be how effectively these new measures are implemented and enforced.
Challenges with Enforcement
Meta’s ability to enforce these new rules will be critical. Although Instagram already uses age-verification tools, including video selfies, to confirm users’ ages, questions remain about the platform’s ability to stop tech-savvy teens from bypassing the restrictions. Analysts such as Matt Navarra caution that while the changes are promising, teenagers often find ways to circumvent online safeguards.
As governments worldwide push for more robust protections, Instagram and other social media platforms must demonstrate their commitment to addressing these concerns. The UK’s Online Safety Act, which requires platforms to remove harmful content, provides for heavy fines on companies that fail to comply, though its full rules will not take effect until 2025.
Ultimately, while these new features shift more control to parents, they highlight the ongoing challenge of safeguarding young users in the digital age. Meta’s updates mark a positive step forward, but more comprehensive actions are needed to fully protect children from the dangers lurking on social media platforms.