The European Commission has established the Network for the Prevention of Child Sexual Abuse, an official expert group aimed at combating child sexual abuse and exploitation both online and offline.
For decades, sexual violence prevention practitioners have been calling for greater investment in preventing child sexual abuse. They have emphasized the importance of stopping harm before it happens, of supporting families and communities, and of changing the conditions that allow abuse to persist. Yet one of the most persistent challenges facing preventionists is the lack of economic evidence showing that prevention is not only effective but also worth the cost.
AI-generated child sexual abuse material (CSAM) carries unique harms. When generated from a photo of a clothed person, it can damage that person's reputation and cause serious distress. When based on existing CSAM, it risks re-traumatizing victims. Even AI CSAM that appears purely synthetic may have been produced by a model trained on real abusive material. Many experts also warn that viewing AI CSAM can normalize child abuse and increase the risk of contact offending. There is the added risk that law enforcement may mistake AI CSAM for content involving a real, unidentified victim, wasting time and resources on attempts to locate a child who does not exist.

