Meta’s “Teen Accounts”: A Step Towards Safer Instagram
- Meta has introduced “Teen Accounts” on Instagram to safeguard young users’ mental health.
- The new policy makes accounts of users aged 13-17 private by default, with tighter controls over contact and content.
- Despite mounting pressure, Meta has declined to verify the ages of all its users, citing privacy concerns.
- The effectiveness of these measures in addressing the impact of social media on young people’s mental health remains to be seen.
In a significant move aimed at safeguarding the mental health of its youngest users, Meta, the parent company of Instagram, announced the creation of Teen Accounts on Tuesday. The initiative is designed to provide a safer environment for underage users, who have been exposed to risks such as addiction, bullying, and body image and self-esteem issues on the photo-sharing app. Antigone Davis, Meta’s Vice President in charge of safety issues, described the Teen Accounts as a significant update. “It’s designed to give parents peace of mind,” she told AFP. The move comes in response to growing global concern about the impact of social media on the mental health of young people.
Meta’s New Policy for Young Users
Under the new policy, Instagram users aged between 13 and 17 will have private accounts by default, with tighter controls over who can contact them and what content they can view. Users aged 13 to 15 who want a more public profile with fewer restrictions — for example, aspiring influencers — will need parental permission. The policy applies to both existing and new users of the platform. Davis acknowledged the magnitude of the shift: “This is a big change. It means making sure that we do this really well.” The statement underscores the company’s stated commitment to the safety of its young users.
The pressure on social media giants like Meta has been mounting over the past year. In October, about 40 U.S. states filed a complaint against Meta’s platforms, accusing them of harming the mental and physical health of young people through the risks of addiction, cyberbullying, and eating disorders. Australia is also planning to set a minimum age for social networks of between 14 and 16.
Age Verification and Privacy Concerns
Despite these pressures, Meta has resisted checking the age of all its users, citing confidentiality concerns. “When we have a strong signal that someone’s age is wrong, we’re going to ask them to verify their age, but we don’t want to make three billion people have to provide IDs,” Davis explained. She suggested that age checks could be carried out more effectively at the level of the smartphone’s mobile operating system, such as Google’s Android or Apple’s iOS: “They actually have significant information about the age of users. And if they were to share that broadly across all the apps that teens use, that would provide peace of mind for parents.”
However, it remains unclear whether these new protections will be enough to reassure governments and online safety advocates. Matthew Bergman, founder of the Social Media Victims Law Center, voiced concerns about Instagram’s addictive design. “Instagram leads kids down dangerous rabbit holes, where they are shown not what they want to see, but what they can’t look away from,” he said.
Bergman’s group represents 200 parents whose children died by suicide after being encouraged to do so by videos recommended by Instagram or TikTok. He also pointed to numerous cases in which young girls developed serious eating disorders after exposure to content on these platforms.
In response to these concerns, Meta has taken measures to prevent the promotion of extreme diets on its platforms, among other steps. Bergman described these as “baby steps,” but steps in the right direction nonetheless. He believes the key to addressing these issues lies in making the platforms less addictive, even if that makes them somewhat less profitable.
Meta’s introduction of Teen Accounts is a significant step towards protecting the safety and well-being of its youngest users. Whether these measures will be sufficient to address the broader concerns about social media and young people’s mental health, however, will only become clear with time.