With teen suicides nearly quadrupling in Utah from 2007 to 2015, the state launched a youth suicide task force in 2018 with Rep. Steve Eliason, R-Sandy, and then Lt. Gov. Spencer Cox as chairs.
A few weeks before the task force was announced, Cox traveled to Tremonton for what he thought would be a “simple” speaking assignment at a community suicide prevention meeting. He led with his prepared talking points — but then he felt compelled to divulge his deeply personal experience with suicide, something he hadn’t really talked about for nearly three decades.
Cox reflected on growing up in a conservative, agrarian town as a kid with divorced parents and thick glasses, especially in middle school. “The first week, these strapping young boys grabbed me in the hall and stuffed me in a garbage can,” he said. At the time, Cox said, he began thinking “what it would be like if I wasn’t here anymore, and how much better off everyone would be if I wasn’t here.” He then pleaded with the crowd to inspire hope in each other.
In examining the teen mental health crisis, researchers have found that many of the precursors to suicide — depression, bullying and unhealthy comparisons — are linked to social media use. This gave Cox, and others, a reason to focus on what is now seen as one of the primary drivers of the rise in teens struggling with anxiety and depression: social media.
And if it can be said that Utah is waging a war on social media, Cox is the general leading the charge.
Since Cox became governor, he’s continued to prioritize finding ways to address the teen mental health crisis, which researchers say is unprecedented in its size and scope. A Centers for Disease Control and Prevention report from last spring found that the crisis is still ongoing, with increases in poor mental health and suicide-related behaviors.
When researchers ask teens what ails them, the teens consistently point toward a specific cause.
“When we look at surveys of what kids themselves are saying is causing their distress, the No. 1 cause is usually social media,” Zach Rausch, lead researcher for Jonathan Haidt’s upcoming book and associate research scientist at New York University, said. “And so that is really the reason for this alarm.”
It’s a problem lawmakers have identified, too — and haven’t been bashful about addressing.
Before last year’s legislative session, Rep. Jordan Teuscher, R-South Jordan, came across research showing the unique harms social media inflicts on young women.
“That was coupled with a number of constituents that had called me and said they were doing everything they could to try to help their kids navigate through social media space,” Teuscher said in a phone interview. “And in their words, the algorithms were just more than they could overcome.”
Teuscher teamed up with Sen. Mike McKell, R-Spanish Fork, Sen. Kirk Cullimore, R-Draper, and Aimee Winder Newton, senior adviser to Cox and director of the Office of Families, to find a solution. Cox held a summit on the issue and soon Utah’s unique social media laws were born.
The genesis of Utah’s unique social media laws
In 2023, the Utah Legislature passed the Social Media Regulation Act, sponsored by Teuscher and McKell. The legislation required social media companies to verify the age of their users — if a user is under 18, they would need parental consent. It also created a default curfew setting that blocks overnight access to minors’ accounts, which parents can choose to override. It forbids social media companies from collecting children’s data or targeting their accounts with addictive designs or features.
During this year’s legislative session, the state pushed the implementation date back from March 1 to Oct. 1. On Monday morning, lawmakers proposed amendments to the act, such as requiring social media companies to have supervisory tools available for minors’ accounts.
If these bills pass, the law would no longer require a default curfew or parental consent for children to get on social media. It would still bar social media companies from selling children’s data without parental consent, and it would give children and their parents or legal guardians the ability to hold social media companies liable for harms caused by their algorithms.
The initial passage of the laws was something of a litmus test. Lawmakers promised to work with social media companies in the interim and they also watched what happened across the national landscape.
“Certainly there were a number of states that copied Utah’s approach and we learned from it,” Teuscher said. “Some of those states didn’t take the approach to push out that implementation date and so we saw what happened when they modeled some of our legislation and then they were sued. We saw the arguments that were brought up and the way the courts were signaling around the nation that certain First Amendment issues hadn’t been tested before.”
In other words, Utah became the “default that states are looking at,” he said.
New York was one of the states that followed Utah’s example. New York state Sen. Andrew Gounardes and Assemblymember Nily Rozic announced a pair of bills in fall 2024 known as The Stop Addictive Feeds Exploitation (SAFE) for Kids Act and The New York Child Data Protection Act.
The SAFE for Kids Act would allow parents to impose a curfew between midnight and 6 a.m. for children on social media platforms. The other act would prohibit online sites from selling the personal data of any user under 18 unless the user consents (parents would have to consent for kids under 13) or the data is absolutely necessary.
“Our kids are in crisis, and the adults in the room need to step up,” New York Gov. Kathy Hochul said in a release about the bills. “The statistics are extraordinarily disturbing: teen suicide rates are spiking, and diagnoses of anxiety and depression are surging.”
Like other laws across the country, the Beehive State’s bills haven’t gone unchallenged. The tech industry group NetChoice and the Foundation for Individual Rights and Expression have taken to the courts to challenge Utah’s law in suits filed this winter. Both NetChoice and FIRE cited alleged violations of the First Amendment and the 14th Amendment as their causes of concern.
“With NetChoice v. Reyes, we are fighting to ensure that all Utahns can embrace digital tools without the forceful clutch of government control,” Chris Marchese, director of the NetChoice Litigation Center, said. “Now that these tools are prominent in our lives and important for our economy, young people should learn how to harness their power while developing healthy and safe habits.”
After NetChoice filed its suit, Cox indicated that the state was prepared to defend its laws against legal challenges. “We will vigorously defend these laws, we are prepared for it,” he said during a press conference in late December.
The amendments lawmakers added this year would make the laws more constitutionally sound, Teuscher explained, adding, “I think we are expecting these companies to continue to challenge it because it is such a new area of the law. A lot of the federal laws that are based off of this were written back in the ‘90s when social media wasn’t even a dream and even the internet wasn’t really at the top of people’s minds.”
At its core, the legislation attempts to provide more protections for children on social media than parents are currently able to give them.
Teuscher related that lawmakers held a focus group with parents, youth and representatives from some social media companies. “We heard from the youth how it made them feel when they started to get addicted and the algorithms seemed to be influencing them in certain ways, that it changed their perception and their self-worth.”
Opponents of the kinds of measures Utah lawmakers have proposed will often ask if it’s fair to point toward social media as the cause of the mental health crisis currently affecting teens. But Rausch said it’s absolutely a social media issue.
Smartphones were introduced to the public in 2007 and social media companies like Facebook and Twitter — and later, Instagram — began growing in popularity and accessibility, Rausch said. By 2010, smartphones had acquired front-facing cameras. “It is the tie between smartphones and social media that happened right in this period that really set off a chain reaction where adolescent social life moved entirely onto these phones and into this new hyperviral universe.”
At the same time, teen mental health deteriorated. “In all of my research, there’s been a very sharp rise that begins around this time period, between 2010 and 2015, where rates of low self-esteem, rates of anxiety and depression and more behavioral measures are going up,” Rausch said.
In Rausch’s estimation, youth cutting back on social media would not suffice to solve the issue.
“Part of the reason that we advocate for larger-scale bans on these devices is that we need to address this collectively, because if every other 13-year-old girl is no longer using social media platforms, the pressures to be on those platforms go down, the cost of not being on it goes away and the benefits go up,” Rausch said.
Critics of Utah’s attempts to regulate social media have also emerged, like Caden Rosenbaum, senior policy analyst at Libertas Institute. While Rosenbaum said he could see why lawmakers would take this path, he disagrees with the approach.
Speaking about the original set of bills, Rosenbaum said his biggest concern is around age verification. “The biggest issue was the age verification mechanism created a cybersecurity problem, not just for people, but for kids,” he said, explaining that if age was verified using an ID, then he had concerns about what would happen if that data was compromised. This extended into a First Amendment concern as well.
“The issue is that you would basically prevent people from speaking because they would have to give their ID, and so they might decide not to — it’s prohibitive,” Rosenbaum said.
Rosenbaum said he thought there was a different path forward: one that focuses more on education than on prohibition. Since the internet isn’t going anywhere, he thinks lawmakers should concentrate on helping educate parents and children. “We should lean into education, we should lean into parents’ rights, we should encourage parents to teach their kids how to use social media or teach their kids how to avoid it if that’s what they want to do,” he said. “We shouldn’t just be passing unconstitutional laws trying to hamstring social media.”
Utah’s suits against Meta and TikTok
The legislative efforts in Utah are just one piece of the puzzle. The state has also filed two separate lawsuits: one against TikTok and another against Meta. Both were filed in the fall of last year by the Utah Attorney General’s Office on behalf of the state’s Division of Consumer Protection. When the suits were first filed, passages were redacted. Earlier this year, many of these passages were unredacted, giving Utahns a clearer picture of why the state is suing these social media companies.
The state’s suit against TikTok alleged that the company knew it was causing harm to teenagers and did not act. The suit claimed that TikTok “acknowledges it ‘utilizes many coercive design tactics that detract from user agency such as infinite scroll, constant notifications, and the ‘slot machine’ effect.’”
The suit also claimed that “in an internal digital well-being product safety report, TikTok has also admitted that the design of its application can trigger habit-forming behaviors that harm mental health.” Some of these harms to mental health included “loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy and increased anxiety.”
As for how Utah got its hands on internal documents, the Utah Department of Commerce’s Division of Consumer Protection asked for them.
“The documents were turned over by TikTok in response to a subpoena issued by the Division of Consumer Protection as part of an ongoing investigation. The Division has the authority to subpoena documents, and companies are required under Utah law to comply,” Katie Hass, director of the Division of Consumer Protection, said.
The state alleged that TikTok purposefully sends children push notifications when their attention should be elsewhere. “TikTok sends these notifications directly to users’ mobile phones, nudging younger users to engage with the app during both school and sleeping hours: ‘(We) send notifications to users during the school day and in some cases, up until midnight, which could interfere with sleep.’”
A TikTok spokesperson issued the following statement in response to a request for comment, “TikTok has industry-leading safeguards for young people, including an automatic 60-minute time limit for users under 18 and parental controls for teen accounts. We will continue to work to keep our community safe by tackling industry-wide challenges.”
In the case of the state’s suit against Meta, the state alleges that Meta makes “addictive features,” markets them to children and misrepresents its platforms as safe for children. “Meta makes these misrepresentations knowing that its Platforms are designed to ensnare children, that users are likely to be exposed to harmful content, and that its design features, from the availability of plastic-surgery camera filters to displaying ‘likes,’ are harmful to users in multiple and intense ways.”
The state’s complaint also claims that Meta is aware of the ways its apps cause harm. “Meta knows that the compulsive and excessive social media use it actively promotes and profits from is detrimental to children. Meta also knows that children are facing a mental health crisis. According to Meta’s own studies, ‘82% of teens have felt at least one emotional issue in the past month. One in five has thought about suicide or self-injury,’” the suit alleges. “In Meta’s own words: ‘Teens blame Instagram for increases in the rates of anxiety and depression among teens.’”
The suit also claims Meta has not responded accordingly. “On multiple occasions, Meta has explicitly considered and explicitly rejected design changes after finding that those changes would decrease the danger of harms, but also decrease engagement — and therefore Meta’s profits.”
A spokesperson for Meta said the company does not have a statement to share at this time but pointed toward the legislative framework submitted to the National Telecommunications and Information Administration by Kevin Martin, the vice president of public policy at Meta, and Antigone Davis, the global head of safety at Meta.
The framework includes requiring app stores to obtain parental consent for children under 16 when they try to download apps other than general online services like email and search, and requiring the industry to limit the personalization of advertisements for those 16 and under to age and location only. It also states that social media apps should offer tools parents can use on their children’s accounts if they are under 16.
Section 230 of the Communications Decency Act has historically prevented social media companies from being liable for posts on their platforms.
There’s an emerging legal argument that algorithms, targeted advertisements and design features should be treated differently than posts on social media platforms when it comes to liability. In the Twitter v. Taamneh decision, Justice Clarence Thomas rejected the argument that algorithms should tip the scales into creating liability. Still, the argument is novel and other judges may decide differently in the future.
But an October 2023 ruling by a California court said that while social media platforms are not considered “products” for product liability claims, they are not immune from being sued over alleged negligence. This could be an important precedent moving forward.
“One of our arguments focuses on the fact that TikTok’s algorithm itself is harmful because it figures out how to keep a user hooked. TikTok’s algorithm learns a user’s behavior and exploits that information,” Utah Attorney General Sean Reyes said. “Section 230 may provide immunity for third-party content, but it does not provide immunity for TikTok’s own conduct in using its exploitive algorithm.”
Arguments could begin as early as this spring in one or both of these civil cases.
“Our office is currently defending the social media bills in litigation,” Reyes said. “If these laws are amended, we will continue to defend against any legal challenges in this important fight for the physical and mental well-being of Utah’s children.”
“We don’t have 40 years to study whether or not social media companies are causing 70% or 95% of the teen mental health crisis, we know there is a teen mental health crisis. Their own documents show that they know their social media apps, particularly their algorithms and other product features, can cause mental health problems or exacerbate existing problems, or both,” Margaret Busse, Utah Commerce executive director, said. “The ways in which these mental health issues have skyrocketed in the past decade are so alarming and are hardly coincidental.”
The national conversation
Though Utah is the pioneer in addressing social media harms, the rest of the nation has also started engaging in similar efforts. U.S. Surgeon General Vivek Murthy recently issued an advisory on social media and youth mental health which states “we cannot conclude social media is sufficiently safe for children and adolescents.”
The Social Media Victims Law Center has also emerged as a force trying to combat the harms caused by social media through civil litigation. Laura Marquez-Garrett, an attorney with the center, said that as these social media harm cases play out in court, discovery will pull back the curtain on how these companies operate.
While the center has not yet seen legal recourse for victims, she stressed that “these cases are still in their infancy.” As more cases play out, she said she hopes the legal system will be able to address harms caused by social media.
“It’s not necessarily that the entire internet is dangerous. But we are talking about for-profit companies that have made more money than any industry in the history of the world in a shorter amount of time,” Marquez-Garrett said, adding these companies are doing it “with no transparency and no regulation.”
No matter the twists or turns that happen on a national scale, Utah lawmakers have expressed their commitment to addressing social media harms through the avenues available to them.
“We won’t stand by while social media companies continue to exploit kids,” McKell said. “Social media companies know the harm they are inflicting on our youth’s mental health — and we’re not going to look away.”