The only way to effectively end that is provision of a victimless alternative (AI images from a model not trained on CSAM, created by something not trained in any way on problem solving or agency) and making that alternative free and freely available while simultaneously coming down hard on real chomos and the people who pay them to molest children.
Hmmm. I never thought of that! But what if such images are alleged to contribute to sexual assault? Should those images be censored?
Such images do not contribute to sexual assault MORE than actual images whose production directly involves assault. I would argue that actual CSAM images, unlike AI images, are artifacts and evidence of crimes better suited to existence in an evidence locker, and that we should consider distributing them complicity in that crime.
Images that could make people feel ways they don't want to feel ought to be tagged and filterable, but that's not censorship if I can access the images without being outed by others for having done so. Put the Bible "in that section," but if someone wants to check out Bibles from the library or buy one, that's their right. Maybe it comes in a package that says "danger, religious dogma contained within that does not conform to reality," but cigarettes aren't "banned" or "censored" by such warnings.
Really, it comes down to giving people the ability to feel ways they want, to reasonably spare them from having to feel ways they don't want, and to do so without making any involved party feel bad or violated in the creation. To that end, AI art allows an "out" for everyone, so long as it's put, much like I would see done for the Bible, behind a curtain that one must discreetly and privately step behind to view, after having shown that they are actually an adult beyond the age of majority.
There's something to be said about the collective failure of a society where such a high percentage of a species wants to do something bad to other individuals of that species, has no alternative that is not bad, and so we just set up a trap for anyone too weak to step across the boundary rather than finding something not-bad to divert them with.
Censorship is "one route to the place some wish to go, it travels straight through evil-land, and whenever someone walks it we ambush them and beat the shit out of them... If we catch them."
Of course, we don't monitor that path super-well, because it's unfortunately common for some latent desire for the destination to exist even if walking the path is repugnant... As such, we tend to have more pity for those that walk it than we ought!
AI and other forms of art have enabled us to build a different road to that destination, one that does not cut through evil-land. This enables us to simultaneously reach the destination and to shed all vestiges of pity for those who walk that other road through the land of evil.
It simultaneously makes a good option available, and makes the bad option completely unacceptable.
Censorship of the victimless option is itself unconscionable, while failure to censor the path of obligate victimization is also unconscionable.