- cross-posted to:
- technology@lemmy.world
In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.
What happens if we don’t get this under control? It will further blur the lines between what’s real and what’s not — as politics become more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than imaginary images.
“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”
You can’t treat it that way, because this is something that a complicit media is willing to share, and you cannot stop them from sharing it without going into major First Amendment violation territory.
Sure you can. CP is something that complicit people are still willing to share. Generating and distributing fake nonconsensual porn of people doesn’t need to be covered by the First Amendment. Decide as a society that it’s fucking gross and should be illegal, and then treat violations of the laws created against it harshly.
That is not something that society has done in a long time. You are talking about something that was never legal in the first place vs. making something illegal that was already legal.
Was CP never legal? Are you sure we haven’t made things illegal that were previously legal? People have this weirdly defeatist view about regulating AI deepfakes that doesn’t seem based on anything solid.
No, it was never legal.
Please re-read my post and do not put words in my mouth.
I gave extremely solid reasoning. What have you said that is so solid?
Nah, you haven’t really backed that up with any solid reasoning. All laws have a date they were codified and enacted; before that date, the activities they covered were not yet illegal. CP was at some point legal. Not long ago, marital rape was perfectly legal. Now it’s not. Revenge porn laws are going into effect. You totally can take something awful that was legal and make it illegal. It might not be immediate, perfect, or without resistance, but it can be done and has been done, even recently.
It was never legal because all pornography was illegal first. And unless you want to go back to pre-Christian Rome, that’s how it’s been for centuries.
I’m still waiting for your solid reasoning, because so far, your ‘reasoning’ has been ‘you can totally do it.’
I don’t think you are trying very hard to see beyond your defeatist position. The revenge porn example should land close enough to the mark for you to see that things that were legal can be made illegal. Even types of porn can be made illegal. So because it has literally been done before, yes you can totally do it.
Also, yes, the centuries-ago interpretation is fine. It still shows taking something that was legal and making it illegal.
I’m sorry, but “pornography was legal in the pre-Christian Roman Empire, therefore we can stop deepfake porn now” is your so-called solid reasoning?