Child sexual abuse is a major problem worldwide. But the direct abuse of children is less lucrative than the digital dissemination of images of that abuse, and the online market for child sexual abuse material, more commonly known as child pornography, is immense and growing.
The effects on the children abused are profound: physical, mental and emotional. In addition, the images are often impossible to erase completely from the internet, ensuring continued trauma. And the market is truly transnational; a 2022 sting by New Zealand authorities uncovered a network of child-pornography sharers across 12 countries. It is a form of virtual child sex trafficking.
The consumers of child pornography are typically unrepentant. The moderator of one such website, contacted by reporters, cast himself as the true victim, saying that visitors to the site were perhaps “the most hated people on earth” and part of an “oppressed sexual minority.” He showed no remorse, describing the online community as “visionaries” whose conduct should be legalized.
Meanwhile, the creation and distribution of this material have evolved with new developments in technology. The advent of AI has enabled deepfake child pornography, and new AI image generators have made it possible to create original images from text descriptions. The generative AI model Stable Diffusion was even trained on data that included child pornography.
Each nation-state must decide whether the distribution and possession of this material are to be viewed as criminal acts and, if so, to what extent AI-generated images are also to be treated as child pornography.
Hearteningly, the U.S. and the U.K. have responded strongly, along with other nations. In the U.S., federal law prohibits the production, distribution, reception and possession of such images using any means or facility of interstate or foreign commerce — including, of course, the internet.
Simply possessing child sexual abuse material (CSAM) can lead to a U.S. federal prison term of up to 10 years, and its production carries a mandatory term of 15 to 30 years’ imprisonment. The Supreme Court has also held that child pornography is not protected by the First Amendment.
To its credit, the U.S. Department of Justice recently determined that AI-generated images are to be treated the same as those not generated by AI, arguing that AI could not generate such images unless actual images of children had been used in training the generator. As Deputy Attorney General Lisa Monaco has asserted, “Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive and increasingly photorealistic images of children.”
Earlier this month, a Wisconsin man, Steven Anderegg, was charged by federal prosecutors, following an FBI investigation, with producing over 13,000 AI-generated images of child pornography. Given the volume of material created, Anderegg faces up to 70 years in a federal penitentiary. The U.S. is clearly treating his crimes as serious in both nature and scope.
And that is why recent legislative moves in Germany are so baffling to observers. The Bundestag has agreed to significantly lower the penalties for possession of child pornography: possession will now carry a minimum of only three months’ imprisonment, and distribution a minimum of six months. These offenses will be classified as misdemeanors rather than felonies. It is very hard not to see this as Germany decriminalizing the possession and distribution of child pornography.
The ostensible rationale for the change was that juveniles were filming themselves and sharing the videos with peers, or that parents and teachers had to download such material in order to present it to police, and that the law therefore cast too wide a net over who was liable to be charged. But the Bundestag could simply have exempted such circumstances; instead, it issued a blanket lowering of penalties. The Christian Democrats, who opposed the bill, stated the matter well:
“Even if the increase in the penalty range under Section 184b of the Criminal Code in 2020 has led to practical problems in certain cases, a blanket reduction in the penalty range is the wrong solution. A change should be limited to the problem cases and solve them effectively. Scientific findings show that when the penalty range shifts downwards, the penalties imposed in practice also tend to be lower.”
A society that softens its stance on protecting children from sexual predators is on the wrong track. The immense and lasting harm done to children by such predators is clearly evident. The U.S. Department of Justice should be applauded for applying penalties that fit the severity of the crime. Let’s hope the EU will force Germany to rethink its position on the matter before AI gets even more sophisticated.
Valerie M. Hudson is a university distinguished professor at the Bush School of Government and Public Service at Texas A&M University and a Deseret News contributor. Her views are her own.