He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors, even if they weren't in a sexual context.
Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn't trained on. Naked + child is just a simple equation for it to solve.
The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They're also really good at editing images to add or remove entire objects.
I think what @deathbird@mander.xyz meant was that the AI could be trained on what sex is and what children look like at different ages. Then a user request could put those two concepts together.
But as the replies I got show, there were multiple ways this could have been accomplished. All I know is AI needs to go to jail.
Likely yes, and even commercial models have an issue with CSAM leaking into their datasets. The scummiest of them likely get an offline model, then add their own collection of CSAM to it.