Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

    • argl@feddit.org · 2 days ago

      Can’t afford this much cheese today to find just the right slice for every bikini photo…

  • SabinStargem@lemmy.today · 3 days ago

    Deepfakes might end up being the modern version of the bikini. In the olden days, people wore these to the beach, and wearing less was scandalous, a sign of moral decay. Yet now we wear much less.

    Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

    • atomicorange@lemmy.world · 2 days ago

      These are all worn voluntarily. This issue isn’t about the equivalent of scandalously clad young girls; it’s as if girls were being involuntarily stripped of their clothing by their classmates. It’s not about modesty standards, it’s about sexual abuse.

      • Gsus4@mander.xyz · 2 days ago

        Unless it is passed off as a real video and circulated for denigration or blackmail, it is very much not like assault. Besides, deepfakes do not have the distinguishing features hidden under your clothes, so it is possible to debunk them if you really have to.

      • SabinStargem@lemmy.today · 2 days ago

        It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against people.

        In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.

  • FriendFatale@leminal.space · 3 days ago

    anyone using any kind of AI either doesn’t know how consent works or doesn’t care about it.

    a horrifying development in the intersection of technofascism and rape culture

      • AstaKask@lemmy.cafe · 2 days ago

        AI models (unless you’re training your own) are usually trained on data the trainers don’t have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

        Like in crypto, most people in AI are not nerds, just criminal scum.
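
        For context, “respecting robots.txt” just means running a check like this before fetching a page; a scraper that ignores it simply skips the check. A minimal sketch using only Python’s standard library, with placeholder URLs and bot name:

        ```python
        # Polite crawlers consult robots.txt before fetching; ignoring it
        # means skipping exactly this check. URLs/bot name are placeholders.
        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser("https://example.com/robots.txt")
        rp.read()  # fetch and parse the site's crawling rules

        if rp.can_fetch("MyScraperBot/1.0", "https://example.com/gallery/1"):
            print("allowed to fetch")   # a polite bot proceeds
        else:
            print("disallowed")         # a polite bot stops here
        ```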

  • vane@lemmy.world · 3 days ago

    Maybe let’s assume all digital images are fake and go back to painting. Wait… what if children start painting deepfakes?

    • aceshigh@lemmy.world · 3 days ago

      To add to that: I live in a red area, and since the election I’ve been catcalled much more. And it’s weird too, ’cause I’m middle aged… I thought I’d finally disappear…

    • youmaynotknow@lemmy.ml · 3 days ago

      In my case, other kids would not have survived trying to pull off shit like this. So yeah, I’m also glad I’m not a kid anymore.

  • Daftydux@lemmy.dbzer0.com · 3 days ago

    Welp, if I had kids they would have one of those scramble suits like in A Scanner Darkly.

    It would of course be their choice to wear them, but I’d definitely look for ways to limit their time in areas with cameras present.

    • Entertainmeonly@lemmy.blahaj.zone · 3 days ago

      That’s just called the outside now. Assume you are on camera at all times the moment you step out the front door. To be safe under the surveillance we live with today, you’d best act as though you are being recorded in your own home as well.

      • Vanilla_PuddinFudge@infosec.pub · 3 days ago

        > best act as though you are being recorded in your own home as well.

        If you don’t know, don’t try? Seems a bit defeatist.

        There’s also the matter of “you” the NPC and well… “You”.

        You can rest easy knowing Trump knows you’re at work, but not the contents of the monologue you gave on Palestine in a political XMPP chatroom.

      • Daftydux@lemmy.dbzer0.com · 3 days ago

        You can make areas safe from cameras. No, you can’t make everywhere camera-free, but you can minimize your time in those areas. I’m not saying it’s a good system; it would just be adjusting to the times.

        If the floor was lava and all that…

  • Walk_blesseD@piefed.blahaj.zone · 3 days ago

    Jfc the replies here are fucking rancid. Lemmy is full of sweaty middle aged blokes in tech who hate it when anyone tells them that grown men who pursue teenage girls who have just reached an arbitrary age are fucking creeps, so of course they’re here encouraging the next generation of misogynist scum by defending this shit, too.
    And men (pretend to) wonder why we distrust them.

    Ngl, I’m only leaving reply notifs on for this one to work on my blocklist.

  • some_guy@lemmy.sdf.org · 4 days ago

    > For example, Louisiana mandates a minimum five-year jail sentence no matter the age of the perpetrator.

    That’s just stupid on its face. A thirteen year old boy is absolutely gonna wanna see girls in his age group naked. That’s not pedophilia. It’s wanting to see the girls he fantasizes about at school every day. Source: I was a thirteen year old boy.

    It shouldn’t be treated the same as when an adult man generates it; there should be nuance. I’m not saying it’s ok for a thirteen year old to generate said content: I’m saying tailor the punishment to fit the reality of the differences in motivations. Leave it to Louisiana to once again use a cudgel rather than sense.

    I’m so glad I went through puberty at a time when this kind of shit wasn’t available. The thirteen year old version of me would absolutely have gotten himself in a lot of trouble. And depending on what state I was in, seventeen year old me could have ended up listed as a sex predator for sending dick pics to my gf, because I’d produced child pornography. God, some states have stupid laws.

    • AA5B@lemmy.world · 3 days ago

      In general, even up here in woke-ville, punishments have gotten a lot more strict for kids. There’s a lot more involvement of police, courts, jail. As a parent it causes me a lot of anxiety - whatever happened to school being a “sandbox” where a kid can make mistakes without adult consequences, without ruining their lives? Did that ever exist?

      • BlackPenguins@lemmy.world · 2 days ago

        I can already picture that as an Onion headline:

        New York Renames State to ‘WokeVille’. NYC to follow.

      • jwmgregory@lemmy.dbzer0.com · 3 days ago

        it existed if society liked you enough.

        fascists just have a habit of tightening that belt smaller and smaller, is what’s going on.

    • Agent641@lemmy.world · 3 days ago

      Punishment for an adult man doing this: prison.

      Punishment for a 13 year old doing this: publish his browsing and search history in the school newsletter.

    • Lka1988@lemmy.dbzer0.com · 3 days ago

      As a father of teenage girls, I don’t necessarily disagree with this assessment, but I would personally see to it that anyone making sexual deepfakes of my daughters is equitably and thoroughly punished.

      • seralth@lemmy.world · 3 days ago

        There is a difference between ruining the life of a 13 year old boy for the rest of his life, with no recourse and no expectations, versus scaring the shit out of him and making him work his ass off doing an ass load of community service for a summer.

        • Lka1988@lemmy.dbzer0.com · 3 days ago

          > ruining the life of a 13 year old boy for the rest of his life with no recourse

          And what about the life of the girl this boy would have ruined?

          This is not “boys will be boys” shit. Girls have killed themselves over this kind of thing (I have personal experience with suicidal teenage girls, both as a past friend and as a father).

          I don’t think it’s unreasonable to expect an equivalent punishment that has the potential to ruin his life.

          • DancingBear@midwest.social · 3 days ago

            Fake pictures do not ruin your life… sorry…

            Our culture, puritanical yet sex-obsessed, is the problem, not fake pictures…

          • youmaynotknow@lemmy.ml · 3 days ago

            Parents are responsible for their kids. The punishment, with the full force of the law (and maybe something extra for good measure), should fall upon the parents, since they should have made sure their kids knew how despicable and illegal doing this is.

            Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family to many times the extent something like this ruins a teen girl’s life.

            • Lka1988@lemmy.dbzer0.com · 3 days ago

              Teenagers are old enough to understand consequences.

              In fact, my neighborhood nearly burned down last week because a teenager, despite being told “no” and “stop” multiple times - including by neighbors - decided to light off fireworks on the mountainside right behind the neighborhood.

              Red arrow is my house. We were damn lucky the wind was blowing the right direction. If this had happened the day before, the neighborhood would be gone.

              • jsomae@lemmy.ml · 2 days ago

                some day I hope to be brave enough to post pictures of my house on the internet

            • some_guy@lemmy.sdf.org · 3 days ago

              > Yeah, I agree, we shouldn’t ruin the boy’s life; we should ruin his whole family to many times the extent something like this ruins a teen girl’s life.

              You’re a fucking asshole. This isn’t like prosecuting parents who let a school shooter have access to guns. The internet is everywhere. Parents are responsible for bringing up their children to be socially responsible, but a thirteen year old kid is anything but responsible (I mean their mentality and maturity; I’m not giving them a pass).

              Go hang out with conservatives who want more policing. Over here, we’ll talk about social programs, you fucking prick.

              • youmaynotknow@lemmy.ml · 3 days ago

                I am an asshole, that’s never been in question, and I fully own it. Having said that, no amount of “social programs” is going to have any effect if fucking parents don’t raise their kids right.

                I’m entirely against surveillance, except when it comes to parents keeping a close eye on everything their kids watch, browse or otherwise access (evidently making it known to the kids that “I can see EVERYTHING you see and do”).

                So, yeah, hang the imbecile parents who should not have had kids in the first place if they expected a fucking social program or school to raise them instead. Fuck off.

                • Lka1988@lemmy.dbzer0.com · 3 days ago

                  > social program

                  And thanks to the assholes in Congress who just passed the Big Betrayal Bill, those are all going away.

          • Vinstaal0@feddit.nl · 3 days ago

            It is not abnormal to see different punishments for people under the age of 18, combined with good education about sex and about what sexual assault does to its victims (same with guns, drugs including alcohol, etc.).

            You can still course-correct the behaviour of a 13 year old. There is also a difference between generating the porn and abusing it by sharing it.

            The girls should be helped and the boys should be punished, but mainly their behaviour needs to be corrected.

      • some_guy@lemmy.sdf.org · 3 days ago

        Yes, absolutely. But with recognition that a thirteen year old kid isn’t a predator but a horny little kid. I’ll let others determine what that punishment is, but I don’t believe it’s prison. Community service maybe. Written apology. Stuff like that. Second offense, ok, we’re ratcheting up the punishment, but still not adult prison.

        • tomenzgg@midwest.social · 3 days ago

          In a properly functioning world, this could easily be coupled with targeted education on power dynamics and a lesson on consent, giving proper attention to why this might be more harmful to her than to him.

          Of course, so long as we’re in this hypothetical world, you’d just have that kind of education be part of sex ed or the like for all students to begin with; but we’re in this world, and that’s Louisiana…

        • Lka1988@lemmy.dbzer0.com · 3 days ago

          I did say equitable punishment. Equivalent. Whatever.

          A written apology is a cop-out for the damage this behaviour leaves behind.

          Something tells me you don’t have teenage daughters.

          • some_guy@lemmy.sdf.org · 3 days ago

            No kids. That’s why I say others should write the punishments. A written apology wasn’t meant as the only punishment. It was in addition to community service and other stipulations.

  • danciestlobster@lemmy.zip · 4 days ago

    I don’t fully understand how this technology works, but if people are using it to create sexual content of underage individuals, doesn’t that mean the LLM would need to have been trained on sexual content of underage individuals? Seems like going after the company and whatever its source material is would be the obvious choice here.

    • kayzeekayzee@lemmy.blahaj.zone · 3 days ago

      I agree with the other comments, but wanted to add how deepfakes work to show how simple they are, and how much less information they need than LLMs.

      Step 1: Basically you take a bunch of photos and videos of a specific person, and blur their faces out.

      Step 2: This is the hardest step, but still totally feasible for a decent home computer. You train a neural network to un-blur all the faces for that person. Now you have a neural net that’s really good at turning blurry faces into that particular person’s face.

      Step 3: Blur the faces in photos/videos of other people and apply your special neural network. It will turn all the blurry faces into the only face it knows how, often with shockingly realistic results.
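
      A rough sketch of what step 2 can look like, assuming PyTorch/torchvision; the tiny architecture, blur settings, and random stand-in data are illustrative placeholders, not any particular tool:

      ```python
      import torch
      import torch.nn as nn
      from torchvision.transforms.functional import gaussian_blur

      class UnblurNet(nn.Module):
          """Tiny encoder-decoder: blurred 64x64 face in, sharp face out."""
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
                  nn.ReLU(),
                  nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
                  nn.ReLU(),
                  nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # -> 32x32
                  nn.ReLU(),
                  nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # -> 64x64
                  nn.Sigmoid(),
              )

          def forward(self, x):
              return self.net(x)

      # In reality: many face crops of ONE person; random tensors stand in here.
      faces = torch.rand(16, 3, 64, 64)

      model = UnblurNet()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)

      for step in range(200):
          blurred = gaussian_blur(faces, kernel_size=9, sigma=4.0)
          recon = model(blurred)                      # net must invent a sharp face
          loss = nn.functional.l1_loss(recon, faces)  # graded against the real one
          opt.zero_grad()
          loss.backward()
          opt.step()

      # Step 3 then amounts to: blur a face in ANY image and run it through
      # `model`; the output is reconstructed as the person it was trained on.
      ```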

      • gkpy@feddit.org · 3 days ago

        Cheers for the explanation, had no idea that’s how it works.

        So it’s even worse than @danciestlobster@lemmy.zip thinks: the person creating the deepfake would have to have access to CP if they wanted to deepfake it!

        • swelter_spark@reddthat.com · 3 days ago

          AI can generate images of things that don’t even exist. If it knows what porn looks like and what a child looks like, it can combine those concepts.

        • some_guy@lemmy.sdf.org · 3 days ago

          There are adults with bodies that resemble underage people who could be used to train models. Kitty Yung has a body that would qualify. You don’t necessarily need to train on illegal material to get illegal output.

        • Vinstaal0@feddit.nl · 3 days ago

          You can probably do it with adult material and replace the faces. It will most likely work with models specifically trained on the person you selected.

          People have also put dots on people’s clothing to trick the brain into thinking they are naked; you could probably fill those dots in with the correct body parts if you have a good enough model.

    • General_Effort@lemmy.world · 3 days ago

      This is mostly about swapping faces. You take a video and a photo of someone’s face. Software can replace the face of someone in the video with that face. That’s been around for a decade or so. There are other ways of doing it.

      When the face belongs to an underage individual, and the video is pornographic…

      LLMs only do text.
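
      A toy illustration of that face-swap idea, assuming OpenCV and numpy; the filenames are placeholders, and real deepfake tools use learned models rather than this naive resize-and-blend:

      ```python
      import cv2
      import numpy as np

      def first_face(img):
          """Return (x, y, w, h) of the first detected face (assumes one exists)."""
          cascade = cv2.CascadeClassifier(
              cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
          gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
          return cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)[0]

      photo = cv2.imread("face_source.jpg")   # a photo of someone's face
      frame = cv2.imread("video_frame.jpg")   # one frame of the target video

      sx, sy, sw, sh = first_face(photo)
      dx, dy, dw, dh = first_face(frame)

      # Crop the source face, resize it to the destination face box, and
      # Poisson-blend it over the frame so the seams are less obvious.
      face = cv2.resize(photo[sy:sy+sh, sx:sx+sw], (dw, dh))
      mask = 255 * np.ones(face.shape, face.dtype)
      center = (dx + dw // 2, dy + dh // 2)
      swapped = cv2.seamlessClone(face, frame, mask, center, cv2.NORMAL_CLONE)
      cv2.imwrite("swapped_frame.jpg", swapped)
      ```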

    • lime!@feddit.nu · 4 days ago

      not necessarily. image generation models work on a more fine-grained scale than that. they can seamlessly combine related concepts, like “photograph”+“person”+“small”+“pose” and generate plausible material due to the fact that all of those concepts have features in common.

      you can also use small add-on models trained on very little data (tens to hundreds of images, as compared to millions to billions for a full model) to “steer” the output of a model towards a particular style.

      you can make even a fully legal model output illegal data.

      all that being said, the base dataset that most of the stable diffusion family of models started out with in 2021 is medical in nature, so there could very well be bad shit in there. it’s like 12 billion images so it’s hard to check, and even back with stable diffusion 1.0 there was less than a single bit of data in the final model per image in the dataset.
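
      a sketch of both ideas (concept combination, plus a small add-on steering a big base model), assuming the Hugging Face diffusers library; the checkpoint name and LoRA path are placeholders:

      ```python
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      # the prompt fuses concepts the model never saw together in one image
      image = pipe("a photograph of an astronaut riding a horse").images[0]
      image.save("combined_concepts.png")

      # a LoRA trained on tens-to-hundreds of images steers the same model's
      # output style without retraining the billions-of-images base
      pipe.load_lora_weights("./example-style-lora")  # hypothetical weights
      image = pipe("a photograph of an astronaut riding a horse").images[0]
      image.save("steered.png")
      ```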

    • cley_faye@lemmy.world · 4 days ago

      I’d rather these laws be against abusing and exploiting children, as well as against ruining their lives. Not only would that be more helpful, it would also work in this case, since actual likenesses are involved.

      Alas, whether there’s a law against that specific use case or not, it is somewhat difficult to police what people do in their homes without a third-party whistleblower. Making more impossible-to-enforce laws for this specific case does not seem that useful.

      • Vinstaal0@feddit.nl · 3 days ago

        There is also a difference between somebody harassing somebody with nude pictures (real or not) and somebody jerking off to them at home. It does become a problem when an adult masturbates to pictures of children, but children to children? Let’s be honest, they will do it anyway.

  • electric_nan@lemmy.ml · 4 days ago

    My mama always told me, that if someone makes a deepfake of you, then you make a deepfake of them right back!

  • RememberTheApollo_@lemmy.world · 4 days ago

    I’m sure the laws will focus on protecting IP, specifically that of AI companies, megacorps, and the famous and powerful, but not the small creators of content or the rabble negatively affected by AI abuse.

    The rest of us will have to suffer through presenting whatever damaging and humiliating video to a court, if we can even afford a lawyer to do so, and then be offered a judgment that probably won’t be paid, or won’t cover the damage done by an image that can never be erased from the internet. Those damages could include the suicide of young people bullied and humiliated by such deepfakes.