Godric@lemmy.world to Lemmy Shitpost@lemmy.world · 2 years ago
Ah, Yes! AI Will Surely Save Us All! (image, lemmy.world)
Daxtron2@startrek.website · 2 years ago
How can text ever possibly be CSAM when there’s no child or sexual abuse involved?

Jimmyeatsausage@lemmy.world · 2 years ago
I didn’t say anything about text?

Daxtron2@startrek.website · 2 years ago
What exactly do you think erotic roleplay means?
weker01@feddit.de · 2 years ago
Text, even completely fictional text, can be CSAM depending on the jurisdiction.

Daxtron2@startrek.website · 2 years ago
I’ve seen no evidence of that. There are cases tried under obscenity laws, but CSAM has a pretty clear definition of being visual.

weker01@feddit.de · 2 years ago
Internationally? I know that in Germany there are such cases.