• Nyoka@lemm.ee · 5 months ago

    In this case, yes. Material that is visually indistinguishable from a photo is considered CSAM. We don’t need any new laws about AI to get these assholes. Revenge porn laws and federal CSAM statutes will do.

    • TheAnonymouseJoker@lemmy.ml · 5 months ago

      I hope someone plants AI “CSAM” on your computers or phones, and you get arrested for it without bail. Then we will see what you have to say.

      • Raphaël A. Costeau@lemmy.ml · 5 months ago

        If they can plant AI CSAM on my computer, they can also plant “real” CSAM on my computer. Your point doesn’t make any sense.

    • TheAnonymouseJoker@lemmy.ml · 5 months ago

      Reporting and getting my comment removed for describing the hypothetical threat of becoming a CSAM-planting victim? Wow, I think I struck a chord with you. It makes sense; people like you never think things through before suggesting them. Such people should never get the tiniest sliver of power.

        • TheAnonymouseJoker@lemmy.ml · 5 months ago

          This is not a “CSAM” problem, since there is no physical outcome. This is a defamation and libel problem, and should be treated as such. If I see nonsensical notions, I will call them out without fear.

          • ssj2marx@lemmy.ml · 5 months ago

            Do you not consider photoshopping an actual person’s photos into porn abusive towards that person?

            • TheAnonymouseJoker@lemmy.ml · 5 months ago

              I consider it defamation and libel. Yes, it is faux porn, but ultimately the goal is to harass and defame the person.

      • papertowels@lemmy.one · 5 months ago (edited)

        Nothing in your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.

        • TheAnonymouseJoker@lemmy.ml · 5 months ago (edited)

          There is no AI yet that can do this. Also, is there any real-world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to the liberty of citizens.

          All those who wanted AI so badly, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora-generated 720p deepfake porn/gore/murder videos.

          • papertowels@lemmy.one · 5 months ago

            Just passing through, no strong opinions on the matter nor is it something I wish to do deep dive research on.

            Just wanted to point out that your original comment was indeed just a threat that did nothing to address OP’s argument.

            • TheAnonymouseJoker@lemmy.ml · 5 months ago (edited)

              It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

              The problem with calling AI-generated art CSAM is that there is no possible way to draw an objective line for what “level” of realism counts as real and what does not. A drawing or imaginary creation is best not defined as real in any capacity whatsoever. If it is drawn or digitally created, it is not real, period. The people imagining good uses for AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control as far as human society goes.

              Even though China is incredibly advanced and proactive in trying to control this AI deepfake issue, I do not trust any entity, in any capacity, with a problem that is impossible to solve at a national or international scale.

              I just had a déjà vu moment typing this comment, and I have no idea why.

              • Zoot@reddthat.com · 5 months ago

                Dude, it depicts a child in a sexual way. Find some other way to defend lolis than trying to say “the terms aren’t right, really it’s just libel.” Fuck outta here. Child, depicted in a sexual way -> CSAM. Doesn’t matter if it was drawn, produced, or photographed.

                • TheAnonymouseJoker@lemmy.ml · 5 months ago

                  So if I draw a stick figure with two circles and call it 8 years old, is that CSAM? Will I be arrested for it? Do you see how that dumb logic doesn’t work?

                  • magi@lemmy.blahaj.zone · 5 months ago

                    In what world does that justify creating PHOTOREALISTIC sexual imagery of a REAL child? You’re out of your mind, royally.

                  • ssj2marx@lemmy.ml · 5 months ago (edited)

                    Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

                • magi@lemmy.blahaj.zone · 5 months ago

                  It is very clear that they produce and/or consume said material and feel threatened by anyone calling it what it is.