He allegedly used Stable Diffusion, a text-to-image generative AI model, to create “thousands of realistic images of prepubescent minors,” prosecutors said.
It’s the next logical step for the pearl clutchers and amounts to “thought crime.”
I seriously doubt they would create any more surveillance for that than there already is for real CSAM.
The Geek Squad worker could still report these people, and it would be the prosecution’s job to prove that the images were acquired or created illegally.
That would just make it harder to prosecute people for CSAM, since they would all claim their material was AI-generated. That would just end up helping child abusers get away with it.
Possession itself isn’t the problem; the problem is how the images are produced.
I think the production of generated CSAM is unethical because it still relies on photos of real children, used without their consent.
No, because that increases demand for child abuse. Those pictures are created through the abuse of children, and having access to them encourages further abuse to produce more content.
There is evidence to suggest that viewing CSAM increases child-seeking behavior, so viewing generated CSAM would most likely have the same or a similar effect. That would mean that even just having access to the material would increase the likelihood of child abuse.
The survey was self-reported, so the real figure is probably higher than the 42% cited in the study.
I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children.
The best legal avenue for non-offending pedophiles is to find a psychologist who can help them work through their desires, not to be given something that will make them want to offend even more.
https://www.theguardian.com/global-development/2022/mar/01/online-sexual-abuse-viewers-contacting-children-directly-study