On Tuesday, Meta unveiled an array of new measures aimed at creating a more age-appropriate experience for teens on its platforms, in an effort to protect young users.
These adjustments come after years of concern about social media’s potential detrimental effects on mental health, particularly among young people.
The social media giant based in Menlo Park, California, said it will start hiding inappropriate content from teenagers’ accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
In its announcement, the company said: “We want teens to have safe, age-appropriate experiences on our apps. We’ve developed more than 30 tools and resources to support teens and their parents, and we’ve spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive. Today, we’re announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.”
The core of the update is content moderation. Meta will now actively hide posts discussing self-harm, eating disorders, and other potentially harmful topics from teen users’ feeds and Stories, including content shared by friends, ensuring a more controlled environment. Additionally, Instagram’s search will hide results for sensitive keywords and direct users to helpful resources instead.
Recognizing the importance of privacy for teenagers, Meta is implementing new prompts encouraging them to update their settings. With a single tap, teens can activate recommended settings that restrict who can repost their content, tag them, or message them directly. This empowers them to control their online presence and create a safer space for themselves.
The new measures have received positive feedback from experts in adolescent development and mental health.
“Meta is evolving its policies around content that could be more sensitive for teens, which is an important step in making social media platforms spaces where teens can connect and be creative in age-appropriate ways. These policies reflect current understandings and expert guidance regarding teen’s safety and well-being. As these changes unfold, they provide good opportunities for parents to talk with their teens about how to navigate difficult topics.” – Dr. Rachel Rodgers, Associate Professor, Department of Applied Psychology, Northeastern University
The company said it wants people to be able to find support if they need it, and that it will continue to share resources from expert organizations like the National Alliance on Mental Illness when someone posts content related to struggles with self-harm or eating disorders.
Meta will begin rolling out these changes to users under 18 now, and they will be fully in place on Instagram and Facebook in the coming months.
Meta says the updates expand on its existing protections and reflect feedback from experts.
“Parents want to be confident their teens are viewing content online that’s appropriate for their age. Paired with Meta’s parental supervision tools to help shape their teens’ experiences online, Meta’s new policies to hide content that might be less age-appropriate will give parents more peace of mind.” – Vicki Shotbolt, CEO, ParentZone.org.