vastly expands the pool of potential victims
I’m not brave enough at the moment to say it isn’t some kind of crime, but creating such images (as opposed to spamming them everywhere, using them for blackmail, or whatever) doesn’t seem to be a crime that involves any victims.
I’m brave enough to say what I am sure some people are thinking.
If a pedophile has access to a machine that generates endless child porn for them, completely cutting off the market for the "real thing", then maybe that's a step in a positive direction. It's very far from perfect, but better than the status quo.
The ideal solution would be a treatment pedophiles could use to stop being pedophiles entirely. I bet most of them would jump at such a thing. But until that magical day, maybe let's explore options that reduce the harm done to real children in the immediate term.
Some psychologists agree with you. Others say it would only make the problem worse by encouraging escalation. This is definitely one I'm leaving to the professionals to debate, and I'll go with their opinion.
My bigger concern is the normalization of and exposure to those ideas and concepts (sexualization of children). That’s also why I dislike loli/shota media, despite it being fictional.
That said, I still think it's a much better alternative to CSAM, and especially to actually harming a child, for those who have those desires due to trauma or mental illness. Though I'm not sure easy, open access is entirely safe, either.
My bigger concern is the normalization of and exposure to those ideas and concepts
The same concern has been behind attempts to restrict/ban violent video games, and films before that, and books before that. Despite generations of trying, I don’t think a causal link has ever been established.