Meta is placing new content restrictions on teens, including hiding posts about suicide, self-harm and eating disorders, even when those posts are made by friends. The company announced Tuesday that it wants teens who use its Instagram and Facebook social media apps “to have safe, age-appropriate experiences on our apps.”
Meta will place all teens “into the most restrictive content control settings on Instagram and Facebook.” It will also hide search results on Instagram related to suicide, self-harm and eating disorders; users who search for those terms will instead be directed to expert resources for help.
“We already hide results for suicide and self-harm search terms that inherently break our rules and we’re extending this protection to include more terms,” Meta wrote. “This update will roll out for everyone over the coming weeks.”
Among other changes for teens:
- Hiding age-inappropriate content. While resources on sensitive topics like suicide and self-harm will remain available from expert organizations, Meta said it will remove such content from teens’ experiences on Instagram and Facebook, along with other age-inappropriate content, including posts about eating disorders, restricted goods and nudity.
“We already aim not to recommend this type of content to teens in places like Reels and Explore, and with these changes, we’ll no longer show it to teens in Feed and Stories, even if it’s shared by someone they follow,” Meta said, bolding the last part.
- Restricting content recommendations. Teens are being defaulted into the most restrictive settings of Facebook’s “Reduce” control and Instagram’s Sensitive Content Control.
- Simplifying the way to update settings, with more reminders for teens to do so. Settings can be updated with a single tap. For those who choose the recommended settings, Meta will automatically restrict who can view their content, tag or mention them, or include their posts in Reels Remixes. “We’ll also ensure only their followers can message them and help hide offensive comments.”
The company is making a couple of changes for everyone:
- Limiting content recommendations.
- Removing violating content, including hate speech and child exploitation.
A mountain of criticism
The changes build on those previously implemented, which include turning off location sharing for teens by default and placing users under age 16 into the highest privacy settings for who can see their friends list, what they follow, posts they’re tagged in and who can comment on their public posts.
The social media giant and others have faced a mountain of criticism for their impact on young users, from teens to younger children who manage to get accounts by lying about their age. Several states, including Utah, have passed laws imposing restrictions. Utah’s, for instance, limits the hours social media apps can be used and requires age verification. Social media companies are challenging those laws in court.
Per CNBC, “Complaints have dogged the company since 2021, before it changed its name from Facebook to Meta. In September of that year, an explosive Wall Street Journal report, based on documents shared by whistleblower Frances Haugen, showed Facebook repeatedly found its social media platform Instagram was harmful to many teenagers. Haugen later testified to a Senate panel that Facebook consistently puts its own profits over users’ health and safety, largely due to algorithms that steered users toward high-engagement posts.”
As Reuters reported, “Meta is under pressure both in the United States and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis.”
The wire service reported that “attorneys general of 33 U.S. states including Utah, California and New York sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.”
Meanwhile, the European Commission is also asking questions about the steps Meta takes to protect children.
Some say the changes don’t go far enough to protect kids.
“Today’s announcement by Meta is yet another desperate attempt to avoid regulation and an incredible slap in the face to parents who have lost their kids to online harms on Instagram,” Josh Golin, executive director of the children’s online advocacy group Fairplay, told ABC News. “If the company is capable of hiding pro-suicide and eating disorder content, why have they waited until 2024 to announce these changes?”