“These are first-of-their-kind bills in the United States,” Utah Gov. Spencer Cox said in March at the signing of SB152, which required social media companies operating in Utah to age-verify users and obtain explicit parental consent for users under the age of 18 to open an account. “That’s huge that Utah is leading out in this effort,” he said.
The governor’s remarks about Utah’s leadership were more than true: they were an understatement. Since then, three other states — Texas, Louisiana and Arkansas — have passed bills modeled after Utah’s work, and the Institute for Family Studies has heard from lawmakers around the country who aim to follow suit in 2024. Utah’s legislative boldness has kicked off a revolution in how lawmakers fight to protect our kids from predatory social media platforms.
But Utah has not stopped there. In October, the governor’s office announced it was joining a group of 42 states, red and blue, in a lawsuit against Meta, the parent company of Facebook and Instagram, for developing techniques and adopting practices with the intent of addicting kids for profit. This was the second time Utah, under Cox’s leadership, has sued a social media company (the previous suit was against TikTok, probably the worst app for kids). Even so, the suit’s bipartisan nature, and its focus on how a platform’s design can undermine personal well-being, make it once again a watershed in confronting Big Tech on behalf of our children.
The state also passed legislation to keep underage users off pornography sites — which as a matter of practice turn a blind eye to the great mass of adolescents accessing their obscene and addictive wares — by once again requiring age verification. The law was upheld by a federal judge in a landmark decision in August, owing to its design, which puts enforcement in the hands of parents by granting them a right to sue porn companies for damages should they fail to properly verify a child’s age.
These efforts are exemplary, but they leave untouched one of the principal culprits driving the mental health crisis among American adolescents, as well as the principal means by which they access pornography — the devices themselves, smartphones and tablets.
Devices and the app stores they host are virtually unregulated, especially for child safety, even though they are the most common way that minors access social media and stumble across pornography. Regulators would never grant this same impunity to toy manufacturers, food producers or the providers of numerous other products and services for kids. And yet we give Apple and Google, the two dominant device manufacturers, the benefit of the doubt to market their products to kids, allowing them to occupy children’s attention for hours on end and overshadow their mental and social lives without any meaningful oversight whatsoever.
Apple and Google’s app stores and devices do not deserve this trust. As a recent policy brief by the Institute for Family Studies and the Ethics and Public Policy Center, “Making Smartphones and App Stores Safe for Kids,” shows, many apps in the app stores have been found to be inappropriately rated for kids, claiming to be child-friendly when they are really for older audiences. The app stores also advertise inappropriate apps to users these companies should know are kids, and they allow apps with sexualized and indecent imagery to be marketed to kids inside other apps rated as age-appropriate.
As for access to pornography, even Pornhub’s own research has shown that more than 80% of traffic to the site comes from handheld devices. Unsurprisingly, according to Common Sense Media, the average child encounters pornography at age 12, and in many cases far younger.
Engaged parents can only do so much to fend off the worst of smartphones when the app stores themselves are deceptive and when personal devices — so easy to slip into one’s pocket in a pinch — can be easily used by kids to access apps and sites their parents know nothing about.
The device manufacturers have shown little interest in self-reform. It’s no mystery why. They get a 30% commission on the sales of apps in the app stores and make a killing on advertising fees — so making their devices and app stores widely available to kids, a critical market, is in their interest. The more addictive the app, the better. If a child is addicted, his “time on device” is secured, meaning that he will continually generate data (which can be sold) and be primed for advertising access.
If social media platforms are the substance, device manufacturers are the dealers. Both must be held accountable.
What can a state like Utah do?
First, require age verification on the device level. It is critical to realize that Apple and Google already conduct age verification on their devices: Google standardizes the practice in its setup process, and Apple verifies age when applicants sign up for an Apple Card, the company’s credit card, on their iPhone. What Google and Apple do not do is use this process for child safety. States have several options for facilitating verification: they can require it upon the purchase of a device, or they can require that age be verified with a government-issued ID during device setup. (Notably, Apple’s and Google’s wallet apps already allow users to securely store their IDs.)
Our second recommendation follows from the first. Once age verification is complete, age-appropriate settings and parental controls should kick in automatically on the device and in the app store, including filters to block obscenity. This fix is a no-brainer, making it simple for parents to set up a device for child safety, something that is currently, and uncharacteristically, difficult given the elegance and simplicity of these companies’ other products. It would also ensure that only age-appropriate content, according to the various app store ratings, is made available to underage users.
This doesn’t solve the tendency toward deceptiveness in the app stores themselves (a problem that will require other remedies, perhaps secured by attorneys general as a condition of a future settlement). But it would still go a long way toward steering underage users to child-friendly material.
Another important aspect of making devices safe by default is requiring parental consent for each new app download, along with notifying parents of every download. But if lawmakers in Utah elect not to proceed with age verification, a second option is to require a default filter that blocks obscenity, based not on required age verification but on the age already collected through devices’ existing setup processes. This is the approach favored by Protect Young Eyes and the National Center on Sexual Exploitation, two organizations that have endorsed the new policy brief.
Age verification on the device, accompanied by automatic age-appropriate defaults, would add a double layer of protection for kids, who would then be guarded from pernicious apps at the level of the operating system (the device) and, because SB152 is already on the books in Utah, at the platform level as well.
Meta argues that age verification should be done by the device manufacturers and not by the platforms. This is self-serving and too risky for kids. The obvious problem is that while, yes, most kids are accessing social media platforms and pornography through their devices, they can also surf their way there on a parent’s desktop or laptop.
Device-level regulation cannot be a means by which certain bad actors are empowered to shift blame to other bad actors — it is a means to draw the device manufacturers into the new regulatory reality for our kids, in which companies in Silicon Valley are required to keep their products from preying on children. Requiring device-level age verification on the smartphone or tablet, which would then be applied to various apps accessed through the device in the app store, would simplify the process and close the holes that kids sneak through to access objectionable platforms.
And finally, Utah should open the door to more litigation to hold device companies accountable by amending its so-called “little FTC act” — Utah Code Section 13-11-4 (1953) — to add “digital transactions” to its scope of jurisdiction.
States have their own acts mirroring the federal FTC Act, often modeled on the Uniform Deceptive Trade Practices Act, which, among other things, protect consumers from companies engaging in misleading, fraudulent and abusive advertising. These are important means of protecting children from companies that want to lure them into potentially harmful purchases.
In Utah’s little FTC act, the simple addition of the phrase “digital transactions” would help clarify the state’s authority to bring causes of action against device manufacturers for allowing abusive marketing of apps to kids. As Adam Candeub has written, such a simple amendment offers “a range of potential remedies, including actual damages, enhanced damages, injunctive or declaratory relief, attorneys’ fees, court costs, and rescission for unfair and deceptive practices committed in the conduct of trade or commerce.” The law could be made stronger still with amendments spelling out explicitly that it prohibits app stores and apps from abusively marketing their goods to children.
Society once thought it wise to put the whole wide world into the hands of our kids. But with everything from pro-Hamas propaganda to pornography easily accessible on these devices, we have relearned the bitter lesson that there are monsters out there. And to give one’s child access to the whole world has turned out to be more than reckless — it is foolish.
We strongly encourage parents to resist giving their children smartphones until they are at least 16. But given the ubiquity of smartphones among children, the state also has a role in protecting the minds and hearts of the rising generation.
These three proposals, if properly implemented, would help make smartphones safer for kids. If Utah took them up, it would once again be leading a movement for the rest of the 50 states to follow.
Michael Toscano is executive director of the Institute for Family Studies. Brad Wilcox is the Future of Freedom Fellow at the Institute for Family Studies. Elizabeth Self is outreach coordinator at the Institute for Family Studies.