• Nyoka@lemm.ee

      In this case, yes. Material that is visually indistinguishable from a photo is considered CSAM. We don’t need any new laws about AI to get these assholes; revenge porn laws and federal CSAM statutes will do.

      • TheAnonymouseJoker@lemmy.ml

        I hope someone plants AI “CSAM” on your computer or phone and you get arrested for it without bail. Then we will see what you have to say.

        • Raphaël A. Costeau@lemmy.ml

          If they can plant AI CSAM on my computer, they can also plant “real” CSAM on my computer. Your point doesn’t make any sense.

      • TheAnonymouseJoker@lemmy.ml

        Reporting my comment and getting it removed over a hypothetical about becoming a CSAM-planting victim? Wow, I think I struck a chord with you. It makes sense; people like you never think things through before suggesting them. Such people should never get the tiniest sliver of power.

        • papertowels@lemmy.one

          Nothing about your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.

          • TheAnonymouseJoker@lemmy.ml

            There is no AI yet that can do this. Also, is there real-world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to the liberty of citizens.

            All those who wanted AI so much, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora-generated 720p deepfake porn/gore/murder videos.

            • papertowels@lemmy.one

              Just passing through; I have no strong opinions on the matter, nor is it something I wish to research deeply.

              I just wanted to point out that your original comment was indeed just a threat that did nothing to address OP’s argument.

              • TheAnonymouseJoker@lemmy.ml

                It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

                The problem with classifying AI-generated art as CSAM is that there is no possible way to define objectively what “level” of realism counts as real and what does not. A drawing or imaginary creation is best left undefined as real in any capacity whatsoever: if it is drawn or digitally created, it is not real, period. Those who imagined good uses for AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control across human society.

                Even though China is incredibly advanced in, and proactive about, trying to control the AI deepfake issue, I do not trust any entity in any capacity with a problem that is impossible to solve at a national or international scale.

                I just had a déjà vu moment typing this comment, and I have no idea why.

                • Zoot@reddthat.com

                  Dude, it depicts a child in a sexual way. Find some other way to defend lolis than trying to say “the terms aren’t right, really it’s just libel.” Fuck outta here. A child depicted in a sexual way -> CSAM. It doesn’t matter if it was drawn, produced, or photographed.

                  • magi@lemmy.blahaj.zone

                    It is very clear that they produce and/or consume said material and feel threatened by anyone calling it what it is.

                  • TheAnonymouseJoker@lemmy.ml

                    So if I draw a stick figure with two circles and call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?

          • TheAnonymouseJoker@lemmy.ml

            This is not a “CSAM” problem, since there is no physical outcome. It is a defamation and libel problem and should be treated as such. If I see nonsensical notions, I will call them out without fear.

            • ssj2marx@lemmy.ml

              Do you not consider photoshopping an actual person’s photos into porn abusive towards that person?

              • TheAnonymouseJoker@lemmy.ml

                I consider it defamation and libel. Yes, it is faux porn, but ultimately the goal is to harass and defame the person.

    • KillingTimeItself@lemmy.dbzer0.com

      I believe in the US, for all intents and purposes, it is, especially if it was sourced from a minor, because you don’t really have an argument against that one.

      • HauntedCupcake@lemmy.world

        I’m not sure where you’re going with that. I would argue that yes, it is: it’s sexual material of a child, with that child’s face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.

        But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn; no child was actually harmed during the process,” which, as I’ve said above, I disagree with, especially in this particular circumstance.

        Apologies if it’s just my reading comprehension being shit

        • Todd Bonzalez@lemm.ee

          So you don’t think that nudifying pics of kids is abusive?

          Says something about you I think…

          • TheAnonymouseJoker@lemmy.ml

            Are drawings real? Says a lot about you that you would rather see the liberty of the masses stomped out and people murdered than let drawings exist.

              • TheAnonymouseJoker@lemmy.ml

                I think I have been attacked far too much here already. These nasty people are labelling me a supporter of pedophilia. I would suggest capital punishment for pedophiles, and at least a non-bailable offence for defamation actors like the one in the posted article and for these internet creatures that go around falsely labelling people.

                • Zoot@reddthat.com

                  Then don’t defend them? You’re trying to tell everyone that what is literally described in the article above, about a child who had PHOTOREALISTIC pictures made of her, isn’t CSAM.

                  It is. Deleting the comments of everyone who disagrees with you will not change that, and if anything, it WILL make you seem even more like the bad guy.

            • Todd Bonzalez@lemm.ee

              drawings

              Nobody said anything about drawings, but interesting default argument… Thanks for telling the class that you’re a lolicon pedo.

              the liberty of masses be stomped and murdered

              Nobody said that anyone should be stomped and murdered, so calm down, lmao. We’re just saying that child porn producers, consumers, and apologists are vile, disgusting perverts who should be held accountable for their crimes against children.

              • TheAnonymouseJoker@lemmy.ml

                Thanks for telling the class that you’re a lolicon pedo.

                I knew you were one of those reactionary false accusers. I highly doubt anyone takes you seriously in real life because of this bratty behaviour. You are making kids unsafe by turning this serious issue into reactionary bait, and that does more damage than the drawings you are posturing to be after.

                • magi@lemmy.blahaj.zone

                  They’re making them unsafe? You and your bullshit are making them unsafe. Every comment you post reeks of your true character. Go get help.

                  • TheAnonymouseJoker@lemmy.ml

                    You are projecting your own need for professional therapy. You are so blinded by anger that you are a complete and utter failure at maintaining composure and thinking things through with a calm, composed mind.

                    I do not need to clarify my position. It is clear if you have a clear mind, which you do not. Just to repeat, AI drawings are not a CSAM problem, but a lethal weapon for defamation and libel.

                  • Todd Bonzalez@lemm.ee

                    I’m making kids unsafe by…

                    checks notes

                    …being firmly and unwaveringly against the sexual exploitation of children.

                    I really can’t stress enough that this was an actual 15-year-old girl who was pornified with AI. This isn’t some “erotic drawings” argument; the end results were photorealistic nudes with her unmodified face. This isn’t some completely AI-generated likeness. Pictures of her from social media were exploited to remove her clothes and fill in the gaps with models trained on porn. It was nonconsensual pornography of a kid.

                    Anyone who read this story, and feels the need to defend what was done to this girl is a fucking monster.

                    I can’t believe that the person defending sex crimes of this magnitude is a fucking mod.

        • NotMyOldRedditName@lemmy.world

          It’s actually not clear that viewing material leads a person to commit in-person abuse.

          Providing non-harmful ways to access the content may lead to less abuse, as the content they seek would no longer come from abuse, reducing demand for abusive content.

          That being said, this instance isn’t completely fabricated, and given that it involves a real person, its further release is harmful and will have an emotional impact.

            • NotMyOldRedditName@lemmy.world

              There has been, yes, but that doesn’t mean it’s the right ruling. The law also varies by jurisdiction, because it is a murky area.

              Edit: in the USA it might not even be illegal unless there was intent to distribute

              By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of said images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.

              So local AI generation of fictional material that is not distributed may be okay federally in the USA.

              • delirious_owl@discuss.online

                Serious value? How does one legally argue that their AI-generated child porn stash has “serious value” so that they don’t get incarcerated?

                Laws are weird.

                • NotMyOldRedditName@lemmy.world

                  Have the AI try to recreate existing CP already deemed to have serious value, then submit all the prompts/variations leading up to the closest match as part of an exhibit.

                  Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.

        • Todd Bonzalez@lemm.ee

          That’s one definition, sure.

          Now answer the very simple question I asked about whether or not child porn is abusive.

    • Majestic@lemmy.ml

      It should be considered illegal if it was used to harm or sexually abuse a child, which in this case it was.

      As for whether it should be classed as CSAM or as something separate, I lean towards something separate: a revenge-porn-type law that still allows distinguishing between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than of the offender and everyone involved being very sick, dangerous, and actually violent adult pedophiles victimizing children.

      Consider the following:

      1. An underage girl takes a picture of her own genitals. This is unfortunately classified under the unhelpful and harmful term “child porn”, and she can be charged and registered as a sex offender, but it’s not CSAM and -shouldn’t- be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism, which hurts survivors of childhood sexual trauma as well as adults).

      2. An underage girl takes a picture of her genitals and sends it to her boyfriend. Again, this /shouldn’t/ be CSAM (though it may unfortunately be charged similarly); she consented, and we can assume there wasn’t any unreasonable level of coercion. What it is unfortunately bound by is certain notions of puritanism that are very American.

      3. Continuing from 2, the boyfriend shares it with other boys. Now it’s potentially CSAM, or at the least revenge porn of a child, since she didn’t consent and it could be used to harm her; but punishment has to be modulated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.

      4. An underage boy cuts out a photo of an underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress, maybe a petite one, and uses it for his own purposes in private. Not something I think should be classed as CSAM.

      5. An underage boy uses AI to do the same as above, but more believably. Again, I think it’s kind of creepy, but if he keeps it to himself and doesn’t show anyone or spread it around, it’s just youthful weirdness, though really he probably shouldn’t have easy access to those tools.

      6. An underage boy uses AI to do the same as in 4-5, but this time he spreads it around, defaming the girl. She and her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at, and pleasuring themselves to, fake but realistic images of her against her consent, which is violating and makes one feel unsafe. Worse, she is probably bullied for it: people saying mean things, calling her the s-word, etc.

      Kids are weird and do dumb things. Unfortunately, boys in our culture especially have a propensity to do things that hurt girls far more than the inverse, to the point that it’s not even really worth talking about girls being creepy or sexually abusive towards peer-aged boys in adolescence and young adulthood. To address this you need to address patriarchy and misogyny on a cultural level: teach boys empathy and respect for girls and women, and frankly do away with all the abusive pornography that is so prevalent and popular, which encourages and perpetuates abusive actions and mentalities towards women and girls. This will never happen in the US, however, because it is structurally opposed to doing such a thing. It also couldn’t hurt to peel back the stigma and shame around sexuality and nudity in the US, which stem from its reactionary Christian culture, but again I don’t think that will ever happen in the US as it exists, not this century anyway.

      Obviously I’m not getting into adults here, as that doesn’t need to be discussed; it’s wrong, plain and simple.

      Bottom line, I think companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is, regardless of the age of the victim (though children can’t deal with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher, and it should be treated as higher priority). It’s abusive and unacceptable, and companies should fear the credit card networks coming down on them hard and destroying them if they don’t aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash data-set like that used for CSAM (but separate) for such material, and major services should use it to stop the spread.
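
      To make that last point concrete: clearinghouse systems such as Microsoft’s PhotoDNA match uploads against a shared list of perceptual hashes of known images, so re-encoded or lightly edited copies still match. Below is a minimal sketch of that kind of hash-set lookup, assuming the third-party Python Pillow and ImageHash packages; the stored hash value, file name, and distance threshold are hypothetical placeholders.

      ```python
      # Minimal sketch: match an upload against a list of known-image hashes.
      # Assumes `pip install Pillow ImageHash`; the hash value and threshold
      # below are hypothetical, not taken from any real hash list.
      from PIL import Image
      import imagehash

      # Perceptual hashes of previously reported images (hypothetical value).
      KNOWN_HASHES = {imagehash.hex_to_hash("f0e1d2c3b4a59687")}

      def matches_known(path: str, max_distance: int = 5) -> bool:
          """True if the image's perceptual hash is within `max_distance`
          bits (Hamming distance) of any hash in the known set."""
          h = imagehash.phash(Image.open(path))
          # A perceptual hash changes only slightly under re-encoding,
          # resizing, or small edits, so a small Hamming distance still
          # counts as a match, unlike an exact cryptographic hash.
          return any(h - known <= max_distance for known in KNOWN_HASHES)

      if matches_known("upload.jpg"):
          print("Hash match: block the upload and report it.")
      ```

      A real service would pull a vetted hash list from a clearinghouse rather than hard-coding values, and would file a report on a match instead of just printing.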

      • deltapi@lemmy.world

        I think it’s best not to defend kiddie porn, unless you have a Republican senator in your pocket.

        • Majestic@lemmy.ml

          Did you reply to the wrong person, or do you just have reading comprehension issues?