New court documents from a Utah lawsuit allege that Meta executives were aware the company’s apps have “addictive features,” and that Meta misrepresents its apps as safe for children.
The documents say the apps owned by the company have algorithms that “push users into a ‘rabbit hole’ of content” while Instagram’s algorithm takes children “into negative spirals.” The complaint was previously partially redacted, but the entire document is now public.
The latest information about what Meta might have known about its young users comes as a U.S. Senate hearing this week brought national attention to complaints against social media companies.
During the hearing, Meta CEO Mark Zuckerberg took the extraordinary step of standing and turning around in the middle of his testimony to apologize to the victims and families in the Senate hearing room who say they have been harmed by social media.
“I’m sorry for everything you’ve all gone through, it’s terrible,” Zuckerberg said. “No one should have to go through the things that your families have suffered.”
Utah Gov. Spencer Cox and Attorney General Sean Reyes have spurred legal action against Meta and other social media companies because, they say, the companies are knowingly harming young people.
The Utah Attorney General’s Office, on behalf of the Division of Consumer Protection, filed a suit against Meta on Oct. 24 to “stop the harms being suffered by Utah’s children caused by Meta’s ongoing violations of the Utah Consumer Sales Practice Act” and parts of Utah code.
Utah’s allegations against Meta were made clearer after the unredacted suit was released.
Meta owns popular apps for teenagers, including Instagram. Close to 70% of teen girls use Instagram and 62% of teens use the app, the documents say.
At the heart of Utah’s complaint are the allegations that Meta makes “addictive features,” markets them to children and misrepresents platforms as safe for minors. “Meta makes these misrepresentations knowing that its Platforms are designed to ensnare children, that users are likely to be exposed to harmful content, and that its design features, like the availability of plastic-surgery camera filters to displaying ‘likes,’ are harmful to users in multiple and intense ways.”
“Just as litigation effectively spurred change by the opioid pharmaceutical industry and Big Tobacco, we expect this lawsuit will inspire Meta to improve its child safety practices,” Cox said in a release announcing the suit. “Regulating social media companies to protect minors is not a partisan issue, and most people across the political spectrum agree we cannot allow addictive algorithms and deceptive practices to continue harming our children. This action shows we will continue to fight for the mental health and well-being of our kids.”
Meta did not immediately return a request for comment.
Here’s a closer look at some of the key allegations in the lawsuit.
Utah alleges Meta knew about its harm
The complaint reveals that Utah alleges Meta “secretly maintained a parallel set of internal data showing shockingly high rates of harms” while the company “routinely published misleading reports that purported to show impressively low rates of harmful experiences by its users.”
“Meta’s own internal research shows its awareness that its products were, and are, harming children,” the complaint says. Later, the suit alleges that Meta tracks how much time teens spend on its platforms and “set one of Instagram’s quarterly goals to hitting two million hours of teen watch time on IGTV (short for Instagram TV), which is Meta’s long-form video feature.”
The complaint later claimed, “An internal Meta presentation from March 2019 recognizes that 55% of Meta’s users in the United States suffered from problematic use, which Meta described as ‘serious, negative impacts on sleep, relationships, work, or lives, combined with lack of control over FB use,’ and that 3.1% of its users suffered from ‘severe’ problematic use.”
In addition to alleging that Meta knew about its harm, the state also says that Meta hasn’t made changes it knew would reduce harm. “On multiple occasions, Meta has explicitly considered and explicitly rejected design changes after finding that those changes would decrease the danger of harms, but also decrease engagement—and therefore Meta’s profits.”
Later the suit states, “internal documents reveal that Meta has been presented with various proposals to mitigate its Platforms’ harms to children. It could have also just removed the addictive design features and algorithms it implemented. It has repeatedly failed to take any effective action—multiple times at the direction of senior leaders.”
The state also wrote in the suit that “Meta is aware that teens are more susceptible to compulsive and excessive use of its Social Media Platforms.”
“In 2020, internal Meta statistics showed that teens are 40% more likely to spend five or more hours on Instagram per day than non-teens,” the complaint states. “Over the course of a week, teens are also 40% more likely to spend more than twenty-eight hours on Instagram than non-teens.”
Utah alleged that Meta is aware of its negative impact on teen mental health.
“Meta knows that the compulsive and excessive social media use it actively promotes and profits from is detrimental to children. Meta also knows that children are facing a mental-health crisis. According to Meta’s own studies, ‘82% of teens have felt at least one emotional issue in the past month. One in five has thought about suicide or self-injury,’” the suit alleges. “In Meta’s own words: ‘Teens blame Instagram for increases in the rates of anxiety and depression among teens.’”
The suit also states that Meta studied brain development to increase engagement on the app.
“Meta has spent significant time studying the neurological development of adolescents, conducting over eighty internal studies on the subject. Meta has used this extensive internal research ‘to inform product strategy’ and increase teen engagement,” the filing claims.
The algorithm, advertisements and ‘addictive features’
“Meta develops and implements features intended to trick children into spending as much time as possible on its Social Media Platforms in order to maximize profits,” the complaint states. The allegedly harmful and “manipulative design features” listed in the complaint include infinite scroll, autoplay, the personalization algorithms, alerts and reels.
The complaint states that “[b]ecause the algorithms are consistently learning more about the user” while they scroll, looking “for more related and dopamine-inducing content to display to keep the user hooked, the algorithms tend to push users into a ‘rabbit hole’ of content through ‘preference amplification.’”
The suit also claims that since Meta’s algorithms “function on a user-by-user basis,” those who use the app “often find themselves unwittingly and infinitely scrolling down a bottomless pit of content for hours.”
Meta’s employees allegedly “acknowledged how Instagram’s ranking algorithm takes youth ‘into negative spirals & feedback loops that are hard to exit from.’”
The algorithm also feeds up targeted advertisements, the complaint states. “When Meta succeeds in maintaining a user’s interest through its content-personalization algorithms, Meta can collect even more data and serve even more targeted advertisements to the user. Meta can then also apply its personalization algorithms to track ads with which users engage and narrowly tailor a user’s feed with more targeted advertising.”
Notifications and infinite scroll
Utah’s suit claims that Instagram’s notifications are designed to pull children back in.
“The alerts enabled by default are carefully designed by Meta to increase engagement by children by taking advantage of well-understood neurological and psychological phenomena, including using sounds and vibrations to trigger sudden dopamine releases and to exploit connection overload stressors provoked by social media notifications,” the complaint alleges.
In addition to notifications, Utah’s suit pointed toward infinite scroll, which refers to the ability to scroll indefinitely on Instagram.
“Meta does not allow users to turn off infinite scroll; so long as they choose to use Facebook or Instagram, they are stuck with it,” the complaint alleges.
Utah’s claims about inappropriate content on the app
The state alleges that Meta is aware “it serves content to children that is not appropriate for their age.”
According to Utah’s complaint, Meta increases engagement by giving children “psychologically and emotionally gripping content, including content related to eating disorders, violent content, content encouraging negative self-perception and body image issues, bullying content, and other categories of content known by Meta to inspire intense reactions from users.”
Internal Meta research from 2018 “showed that when a user followed accounts posting anorexia-related content, the personalization algorithms would recommend additional anorexia-related content,” the filing states.
A 2020 internal Meta report on child safety on its platforms allegedly found that “Instagram had ‘minimal child safety protections.’”
The suit also alleges that the next year, in 2021, “Meta received external research completed on social media platforms including Instagram and Facebook; this research revealed that children as young as nine years old were using Meta’s Platforms and having online sexual interactions with adults.”
The state has asked for a trial by jury to hear the claims brought up in the suit.