Meta Platforms updates content policies to protect teens from harmful content
The company led by Mark Zuckerberg will block posts related to suicide and self-harm on Facebook and Instagram.
Meta Platforms announced new measures to protect young people from certain harmful content posted on its two main platforms, Facebook and Instagram, with the goal that "teens have safe and age-appropriate experiences."
In a blog post, Meta Platforms confirmed the update to its content control policies, which includes more than 30 tools designed to strengthen safety for young people who use its apps.
The content the company is targeting relates to issues such as suicide, self-harm and eating disorders, among others. These topics will no longer appear in feeds or in Stories for these users.
Meta Platforms says it will "hide more types of content for teens on Instagram and Facebook, in line with expert guidance." The company is also "automatically placing all teens into the most restrictive content control settings on Instagram and Facebook and restricting additional terms in Search on Instagram," and will "require teens to update their privacy settings on Instagram in a single tap with new notifications."
Lawsuits against Meta over addictive content for minors
This update comes at a time when social networks are under scrutiny for causing harm and addiction among minors. In recent months, Mark Zuckerberg's company was targeted in a joint lawsuit filed by the attorneys general of 33 states, which alleged that Meta's platforms include content and features with addictive effects on minors.
In March 2023, Arkansas officials, backed by Governor Sarah Huckabee Sanders, accused Meta and TikTok of "hooking young users." They filed three lawsuits under the state's Deceptive Trade Practices Act, alleging that both companies used deceptive strategies to attract minors with content that was harmful to them.