THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, that is, sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. The bill was led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to the images.

  • Asifall@lemmy.world · 3 months ago

    That’s arguably a better rule than the more traditional flat-fee penalties, since it curbs the impulse to treat violations as a cost of doing business. A firm that makes $1B a year isn’t going to blink at a handful of $1,000 judgments.

    No argument there, but it reinforces my point that this law is written for Taylor Swift and not a random high schooler.

    You’d be liable for producing an animated short starring “Definitely Not Mickey Mouse” under the same reasoning.

    Except that there are fair use exceptions specifically to prevent copyright law from running afoul of the First Amendment. You can see the parody exception used in many episodes of South Park, for example, even specifically to depict Mickey Mouse. Either this bill allows for those types of uses, in which case it’s toothless anyway, or it’s much more restrictive of speech than existing copyright law.

    • UnderpantsWeevil@lemmy.world · 3 months ago

      written for Taylor Swift and not a random high schooler.

      In a sane world, class action lawsuits would balance these scales.

      there are fair use exceptions specifically to prevent copyright law from running afoul of the first amendment

      Why would revenge porn constitute fair use? This seems more akin to slander.

      • Asifall@lemmy.world · 3 months ago

        You keep referring to this as revenge porn, which to me means a case where someone spreads nudes around to punish a current or former partner. You could use AI to generate material for revenge porn, but I bet most AI nudes are not that.

        Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos. Clearly that would be protected speech. If you generate the same image with generative AI, though, then suddenly it’s illegal, even if you clearly label it as parody. That’s the concern. Moreover, the slander/libel angle doesn’t make sense if you include a warning that the image is generated, since you are not making a false statement.

        To sum up why I think this bill is kinda weird and likely to be ineffective: it’s perfectly legal for me to generate and distribute a fake AI video of my neighbor shooting a puppy, as long as I don’t present it as a real video. If I generate the same video but my neighbor’s dick is hanging out, straight to jail. It’s not consistent.

        • UnderpantsWeevil@lemmy.world · 3 months ago

          where someone spreads nudes around as a way to punish their current or former partner

          I would consider a student who created a vulgar AI porn display of another student or teacher out of some sense of spite to be an example of “revenge porn.” Same with a coworker or boss trying to humiliate someone at the office.

          Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos.

          That’s another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

          it’s perfectly legal for me to generate and distribute a fake ai video of my neighbor shooting a puppy

          If you used it to slander your neighbor, it would not be legal.

          • Asifall@lemmy.world · 3 months ago

            That’s another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

            So you think it should be illegal?

            If you used it to slander your neighbor, it would not be legal.

            You’re entirely ignoring my point: I’m not trying to pass the video off as real, therefore it’s not slander.

            • UnderpantsWeevil@lemmy.world · 3 months ago

              So you think it should be illegal?

              I think it’s an example of partisan language that ends up being blandly homophobic.

              You’re entirely ignoring my point

              Why would putting up a giant sign reading “My neighbor murders dogs for fun” be a tort but a mural to the same effect be protected?