So if I draw a stick figure with 2 circles, call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?
Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.
The problem with your argument is that no scale or spectrum can be developed to judge where the fake stops and the real starts for drawings or AI-generated media. And since they were not recorded with a camera in the real world, they cannot be real, no matter what your emotional response to such a deplorable act of defamation may be. It is libel of an extreme order.
Cuties was shot with a camera in the real world. Do you see the difference between AI-generated media and what Cuties was?
no scale or spectrum can be developed to judge where the fake stops and the real starts
Ah, but my definition didn’t at all rely on whether or not the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter if there were real actors or not, if it’s an image of a child and it’s being sexualized, it should be considered CSAM.
And yes I understand that that will always be a subjective judgement with a grey area, but not every law needs to have a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.
An image is not merely an arrangement of pixels in a jpeg,
I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in the real world with a camera, it cannot be real.
Who will be the judge? If some automated AI is built to decide, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for its users, with doctors, married parents and other legitimate people being labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you said as much yourself. The only accurate way to judge this would be a very large team of forensic experts in image/video media, which is not feasible for the amount of data social media generates.
not every law needs to have a perfectly defined line
And this is where the abuse by elites, politicians and the establishment starts. Activists and dissidents can easily be jailed by planted CSAM, which in this case would be as simple as AI-generated pictures arriving as temporary drive-by downloads onto a target’s devices.
The same people that should judge every criminal proceeding. Of course it’s not going to be perfect, but this is a case of not letting perfect be the enemy of good. Allowing generated or drawn images of sexualized children to exist has external costs to society in the form of normalizing the concept.
The argument that making generated or drawn CSAM illegal is bad because the feds might plant such images on an activist is incoherent. If you’re worried about that, why not worry that they’ll plant actual CSAM on your computer?
Have you considered the problem of doctors, married parents and other legitimate people being labelled as CSAM users and pedophiles? This has already happened, and they are not obligated to bear the brunt of misjudgement by the tools developed to judge such media. This is not a hypothetical scenario; it has played out in the real world and caused real damage to people.
The argument about planted CSAM is not incoherent; it has already played out for many people. It is one of the favourite tools of elites and ruling politicians. I am less worried about it only because, thankfully, no such law exists yet that would brutally misjudge the masses over fictional media.
How many times can I say “social context” before you grok it? There’s a difference between a picture taken by a doctor for medical reasons and one taken by a pedo as CSAM. If doctors and parents are being nailed to the cross for totally legitimate images then that strikes me as evidence that the law is too rigid and needs more flexibility, not the other way around.
If a pedophile sets up a hospital/clinic room and photographs a naked kid, will that be okay? Do you understand that these problems are impossible to solve just like that? Parents also take photos of their kids, and they do not take photos the way a doctor would; they take them in far more casual settings than a clinic. Would parents be considered pedophiles? According to the way you propose to judge, yes.
You are basically implying that social defamation is what matters here, and that the trauma caused to the victim of such fictional media is the problem. However, this is exactly what anti-AI people like me were trying to warn against. And since these models are open source and in public hands, the cat is out of the bag. Stable Diffusion models run on potato computers and take at most 2-5 minutes per photo, and 4chan has entire guides for uncensored models. This problem will be 100x worse in a couple of years, 1000x worse in the next 5 years, and infinitely worse in a decade. Nothing can be done about it. This is what the AI revolution is. Future generations of kids are fucked thanks to AI.
The best thing one can do is protect their privacy and keep their photos from getting out there. Nobody can win this battle, and even in the most dystopian hellhole with maximum surveillance, there will be gaps.
So if I draw a stick figure with 2 circles, call it 8 years old, is it CSAM?
In what world does that justify creating PHOTOREALISTIC sexual imagery of a REAL child? You’re out of your mind, royally.
Hot take: yes.
Glad that it will always remain a hot take.