Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • Socsa@sh.itjust.works · 9 months ago

    This shit is gross in general, but she absolutely markets herself as a sex icon. I guarantee you she’s not that upset about more free publicity.

      • grayman@lemmy.world · 9 months ago

        Read up on her. She 100% portrays a product and sells that product. She knows what she’s doing.

          • grayman@lemmy.world · 9 months ago

            She’s literally said that in interviews. She’s super smart. She knows her persona is a product. That’s not a jab. Every famous person that has a personal brand does this. Her brand just happens to be called the same as her name.

            • books@lemmy.world · 9 months ago

              No, I get that, but her persona or brand is not her getting fucked. Just because she is attractive and makes money playing music and shit doesn’t mean that her brand is sex appeal.

              • grayman@lemmy.world · 9 months ago

                I didn’t say anything about porn. But if you think she doesn’t understand and leverage sex appeal… I just don’t know what to tell you.

                • books@lemmy.world · 9 months ago

                  It sort of sounds like your argument is “she was asking for it,” which I don’t necessarily agree with.

  • FenrirIII@lemmy.world · 9 months ago

    Went to Bing and found “Taylor Swift AI pictures” as a top search. LOTS of images of her being railed by Sesame Street characters.

  • Neil@lemmy.ml · 9 months ago

    I’m not saying she shouldn’t have complained about this. She has every right to, but complaining about it definitely made the problem a lot worse.

      • Wytch@lemmy.zip · 9 months ago

        This is something negative that can impact her, not something negative she did that she’d like us to forget.

        She oughta make a stink and put a spotlight on it if it’s gonna hurt her image.

      • UnderpantsWeevil@lemmy.world · 9 months ago

        Except she’s the most famous woman in the country, a well-established sex symbol, and already the subject of innumerable erotic fantasies and fictions.

        It’s the same problem as “The Fappening” from forever ago. The fact that this exists is its own fuel, and whether she chooses to acknowledge it or not is a moot point. Someone is going to talk about it and the news will spread.

    • LeroyJenkins@lemmy.world · 9 months ago

      she drew so much attention to it, though, that there are more news stories than actual images at this point. if you look for the images, you’re gonna have to go through pages and pages of news articles about it. not sure if it was intentional, but it kinda worked…

    • SpaceCowboy@lemmy.ca · 9 months ago

      This is a problem that doesn’t just affect her, though. Not discussing the problem means it doesn’t get worse for her, but it does continue to happen to other people.

      Discussing it means it gets worse for her, but there could potentially be solutions found. Solutions that would help her and other people affected.

      Worst case scenario is no solution is found, but the people making AI porn make more Taylor Swift AI porn, which results in fewer resources being devoted to making AI porn of other people. This makes things worse for Swift but better for other people.

      TLDR: Taylor Swift is a saint and is operating at a level that us petty sinners can’t comprehend.

    • dlok@lemmy.world · 9 months ago

      This is probably a good thing; even if real nudes leaked, nobody would know if they’re real.

  • nyan@lemmy.cafe · edited · 9 months ago

    Fake celebrity porn has existed since before photography, in the form of drawings and impersonators. At this point, if you’re even somewhat young and good-looking (and sometimes even if you’re not), the fake porn should be expected as part of the price you pay for fame. It isn’t as though the sort of person who gets off on this cares whether the pictures are real or not—they just need them to be close enough that they can fool themselves.

    Is it right? No, but it’s the way the world is, because humans suck.

    • Ook the Librarian@lemmy.world · 9 months ago

      Humans are horrible, but a mainstream social media platform should not be a celebration of it. People need to demand change and then leave if ignored. I seem to hear people demanding change. The next step has more impetus.

    • BreakDecks@lemmy.ml · 9 months ago

      Honestly, the way I look at it is that the real offense is publishing.

      While still creepy, it would be hard to condemn someone for making fakes for personal consumption. Making an AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a Playboy centerfold. It’s hard to want to prohibit people from pretending.

      But posting those fakes online is the high-tech, scaled-up version of xeroxing the Playboy centerfold with your crush’s face on it, and taping up copies all over town for everyone to see.

      Obviously, there’s a line people should not cross, but without laws to deter it, AI fakes are just going to circulate freely.

      • EatATaco@lemm.ee · 9 months ago

        AI fake is the high-tech equivalent of gluing a cutout of your crush’s face onto a playboy centerfold.

        At first I read that as “cousin’s face” and I was like “bru, that’s oddly specific.” Lol

      • Grimy@lemmy.world · 9 months ago

        Yup, it’s all the more frustrating when you take into account that social media sites do have the capability to detect whether an image is NSFW and whether it matches the face of a celebrity. Knowing Taylor’s fan base, the images are probably quickly reported.

        It’s mainly Twitter as well, and it’s clear they are letting this go on to drum up controversy.

  • BeefPiano@lemmy.world · 9 months ago

    I wonder if this winds up with revenge porn no longer being a thing? Like, if someone leaks nudes of me I can just say it’s a deepfake?

    Probably a lot of pain for women from mouth breathers before we get there from here.

    • TwilightVulpine@lemmy.world · 9 months ago

      Why would it make revenge porn less of a thing? Why are so many people here convinced that as long as people say it’s “fake,” it won’t negatively affect them?

      The mouth breathers will never go away. They might even use the excuse the other way around: because someone could say just about anything is fake, then it might be real and the victim might be lying. Remember that blurry pictures of Bigfoot were enough to fool a lot of people.

      Hell, even if others believe it is fake, wouldn’t it still be humiliating?

      • fine_sandy_bottom@discuss.tchncs.de · 9 months ago

        The default assumption will be that a video is fake. In the very near future you will be able to say “voice assistant thing, show me a video of that cute girl from the cafe today getting double teamed by Robocop and an Ewok wearing a tutu.” It will be so trivial to create this stuff that the question will be “why were you watching a naughty video of me” rather than “omg I can’t believe this naughty video of me exists.”

      • Æther@lemmy.world · 9 months ago

        I think you’re underestimating the potential effects of an entire society starting to distrust pictures/video. Yeah a blurry Bigfoot fooled an entire generation, but nowadays most people you talk to will say it’s doctored. Scale that up to a point where literally anyone can make completely realistic pics/vids of anything in their imagination, and have it be indistinguishable from real life? I think there’s a pretty good chance that “nope, that’s a fake picture of me” will be a believable, no question response to just about anything. It’s a problem

        • TwilightVulpine@lemmy.world · 9 months ago

          There are still people who believe in Bigfoot and UFOs, and there are still people falling for hoaxes every day. To the extent that distrust is spreading, it’s not manifested as widespread reasonable skepticism but as the tendency to double down on what people already believe. There are more flat earthers today than there were decades ago.

          We are heading to a point where, if anyone says deepfake porn is fake, regardless of reasons and arguments, people might just think it’s real because they feel like it might be. At this point, this isn’t even a new situation. Just like people skip reputable scientific and journalistic sources in favor of random blogs that validate what they already believe, they will treat images, deepfaked or not, much in the same way.

          So, at best, some people might believe the victim regardless, but some won’t no matter what is said, and they will treat them as if those images are real.

          • daltotron@lemmy.world · 9 months ago

            This strikes me as correct; it’s kind of more complicated than just the blanket statement of “oh, everyone will have too calloused of a mind to believe anything ever again”. People will just try to intuit truth from surrounding context in a vacuum, much like how they do with our current everyday reality where I’m really just a brain in a vat or whatever.

        • eatthecake@lemmy.world · 9 months ago

          I hope someone sends your mom a deepfake of you being dismembered with a rusty saw. I’m sure the horror will fade with time.

          • Æther@lemmy.world · 9 months ago

            What a horrible thing to wish on a random person on the internet. Maybe take a break on being so reactionary, jesus

    • thantik@lemmy.world · 9 months ago

      This has already been a thing in courts with people saying that audio of them was generated using AI. It’s here to stay, and almost nothing is going to be ‘real’ anymore unless you’ve seen it directly first-hand.

    • Snot Flickerman@lemmy.blahaj.zone · edited · 9 months ago

      I mean, not much happened to protect women after The Fappening, and that happened to boatloads of famous women with lots of money, too.

      Arguably, not any billionaires, so we’ll see I guess.

    • Cheskaz@lemmy.world · 9 months ago

      Australia’s federal legislation making non-consensual sharing of intimate images an offense includes doctored or generated images because that’s still extremely harmful to the victim and their reputation.

  • GilgameshCatBeard@lemmy.ca · 9 months ago

    Bet that whoever is doing it is either a hardcore neckbeard simp, or a butthurt conservative. They’re similar, but not the same.

    • GilgameshCatBeard@lemmy.ca · 9 months ago

      So you’re saying that the more wealth a person has, the more they deserve crimes against them? Come on now, kid. Do you really want to think this way?

      • MJKee9@lemmy.world · edited · 9 months ago

        That’s not their point and you know it. Get your bad faith debating tactics out of here.

        She isn’t living “every woman’s nightmare” because a woman without the wealth and influence Taylor has might actually suffer significant consequences. For Taylor, it’s just a weird Tuesday. For an average small-town lady, it might mean loss of a job, loss of a mate, estrangement from family and friends… That’s a nightmare.

          • MJKee9@lemmy.world · 9 months ago

            You just keep shifting your argument to create some sort of sympathy, I guess. No one says a rich person isn’t a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different than being a victim in a working-class context. If you disagree with that, then you’re either being intellectually dishonest or living in a dream world.

            Even the law agrees. It’s a lot harder to win a defamation lawsuit as a celebrity than it is as a normal person. You typically have to show actual malice. Frankly, that’s the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

          • Tangent5280@lemmy.world · 9 months ago

            That is exactly it. She will suffer less compared to someone else this might have happened to, and if you define victimhood on a spectrum, she’s less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

            • GilgameshCatBeard@lemmy.ca · 9 months ago

              Gross dude. Very gross. Blocking you now as someone who thinks the wealthy can’t be victimized can’t possibly have anything of value to contribute.

              Do better.

              • stephen01king@lemmy.zip · 9 months ago

                The guy said less victimized and you conclude he meant cannot be victimized. Can you be any more stupid?

  • EatATaco@lemm.ee · 9 months ago

    God what a garbage article:

    On X—which used to be called Twitter before it was bought by billionaire edgelord Elon Musk

    I mean, really? The guy makes my skin crawl, but what a hypocritically edgy comment to put into an article.

    And then zero comment from Taylor Swift in it at all. The author is basically just speaking for her. Not only that, but she anoints herself spokesperson for all women…while also pretty conspicuously ignoring that men can be victims of this too.

    Don’t get me wrong, I’m not defending non-consensual AI porn in the least, and I assume the author and I are mostly in agreement about the need for something to be done about it.

    But it’s trashy, politically charged, biased articles like this that make people take sides on things like this. Imo, the author is contributing to the problems of society she probably wants to fix.

    • Powerpoint@lemmy.ca · 9 months ago

      I disagree. To pretend nothing is wrong is worse. The author was accurate in their description here.

      • EatATaco@lemm.ee · 9 months ago

        This is the second poster here who can’t seem to understand that there is a whole world of things between “pretending nothing is wrong” and acting like a child by calling people “edge lord.”

        Last time I checked, on my front page there was an article from the NY Times about how X is spreading misinformation and Musk seems to be part of it. Yet they managed to point out this problem without using the term edgelord. Is this shocking to you?

    • hedgehog@ttrpg.network · 9 months ago

      On X—which used to be called Twitter before it was bought by billionaire edgelord Elon Musk

      I mean, really? The guy makes my skin crawl, but what a hypocritically edgy comment to put into an article.

      How is that comment hypocritical?

      • aesthelete@lemmy.world · edited · 9 months ago

        Had big “people calling people edgelords are the real edgelords” type vibes to it, which I’ll file in the circular file right next to its cousin “people calling people racists are the real racists”.

        Edit: A couple of posts down the dude almost says that quote verbatim.

    • Psythik@lemmy.world · 9 months ago

      On the contrary, I find it more ridiculous when news media pretends like nothing is wrong over at Twitter HQ. I wish more journalists would call Musk out like this every time they’re forced to mention Twitter.

      • EatATaco@lemm.ee · 9 months ago

        Can you really see nothing other than “pretending nothing is wrong” and “calling musk an edge lord?”

        I see the media calling out the faults regularly without needing to act like… well, an edge lord.

        • Psythik@lemmy.world · 9 months ago

          Professionalism was thrown out the window the moment orange man became president. The Republicans play dirty, so everyone else has to as well, or else they’ll walk all over us. Taking the high ground is a dead concept.

          • EatATaco@lemm.ee · 9 months ago

            I strongly disagree, but this is completely unrelated to what I said.

            • Psythik@lemmy.world · edited · 9 months ago

              I have ADHD so I forgot what we were talking about even before I started commenting.

    • jivandabeast@lemmy.browntown.dev · edited · 9 months ago

      hypocritically edgy comment to put into an article.

      It’s Vice; their whole brand is edgy. Calling Elon an edgelord is very on brand for them.

      pretty conspicuously ignoring that men can be victims of this too.

      Sure, but women are disproportionately affected by this. You’re making the “all lives matter” argument of AI porn

      make people take sides on things like this

      People should be taking sides on this.

      Just seems like you wanna get mad for no reason? I read the article, and it doesn’t come across nearly as bad as you would lead anyone to believe. This article is about deepfake pornography as a whole, and how it can (and more importantly HAS) affected women, including minors. Sure, it would have been nice to have a comment from Taylor, but I really don’t think it was necessary.

      • EatATaco@lemm.ee · 9 months ago

        It’s Vice; their whole brand is edgy. Calling Elon an edgelord is very on brand for them.

        I’ve come across this source before and don’t recall being so turned off by the tone. If this is on brand for them, then my criticism is not limited to the author.

        Sure, but women are disproportionately affected by this. You’re making the “all lives matter” argument of AI porn

        You have a point, but I disagree. Black lives matter is effectively saying that black lives currently don’t matter (mainly when it comes to policing). All lives matter is being dismissive of that claim, because no one really believes that white lives don’t matter to police. Pointing to the fact that there are male victims too is not dismissive of the fact that women are the primary victims of this. It’s almost the opposite: ignoring male victims is being dismissive of victims.

        People should be taking sides on this.

        Sorry, wasn’t clear on that point. What I was saying here is this will make people take sides based on their politics rather than on the merits of whether it’s wrong in and of itself.

        i really don’t think it was necessary.

        Neither was her speaking for Swift, nor for all of womankind, nor making it only about women, nor calling Musk an edgelord. You seem to be making the same argument as me.

  • books@lemmy.world · 9 months ago

    I feel like I live on the Internet and I never see this shit. Either it doesn’t exist or I exist on a completely different plane of the net.

    • GBU_28@lemm.ee · 9 months ago

      You ever somehow get invited to a party you’d usually never be at? With a crowd you never, ever see? This is that.

    • schnurrito@discuss.tchncs.de · 9 months ago

      On the Internet, censorship happens not by having too little information, but by having too much information, in which it is difficult to find what you want.

      We all have only so much time to spend on the Internet and so necessarily get a filtered experience of everything that happens on the Internet.

        • schnurrito@discuss.tchncs.de · 9 months ago

          No, that is not what I’m saying, mostly because I don’t think it is true. I’m saying that nowadays there is nearly all kinds of information one can think of somewhere out there on the Internet; but if it is only in relatively obscure places and you don’t know where to look for it, then it is still de facto censored by having too much other information out there.

  • Stanwich@lemmy.world · 9 months ago

    WHAT?? DISGUSTING! WHERE WOULD THESE JERKS PUT THIS ? WHAT SPECIFIC WEBSITE DO I NEED TO BOYCOTT?

    • paddirn@lemmy.world · 9 months ago

      Google Search didn’t really turn up much, far less than if you were to search for something like ‘Nancy Pelosi nude’ even; it kind of seems overblown, and the only reason it’s gotten any news is because of who it happened to. If you’re famous nowadays, it seems you’re just going to see photoshopped or deepfake porn of yourself spread all over the internet.

  • ObsidianZed@lemmy.world · edited · 9 months ago

    People have been doing these for years even before AGI.

    Now, it’s just faster.

    Edit: Sorry, I suppose I should mean LLM AI

      • Thomrade@lemmy.world · edited · 9 months ago

        I think they might have used AGI to mean “AI Generated Images,” which I’ve seen used in a few places, not knowing that AGI is already a term in the AI lexicon.

    • Imgonnatrythis@sh.itjust.works · 9 months ago

      According to the article, it’s impossible to avoid; you will probably see these on a billboard on your way into work. According to real life, if you spend several hours looking for them you might find them on some 4chan board somewhere. If you don’t already know to avoid corners of the internet like that, there isn’t any hope.