• state_electrician@discuss.tchncs.de
    5 months ago

    Text, sure. But I don’t get the hate towards AI-generated images. If it’s a good image and it’s not meant to mislead, I am completely fine with AI content. It’s not slop if it’s good.

      • state_electrician@discuss.tchncs.de
        5 months ago

        I am torn on that. If it’s a company making money off of it, despicable. If it’s an open source model used for memes? I’m fine with that. We shouldn’t act like artists follow some magical calling from god. Anything anyone creates is built on their education and the media they were exposed to. I don’t think generative models are any different.

        • liyunxiao@sh.itjust.works
          5 months ago

          Normalization is a thing. On top of that, there are still indie markets that can be supplanted by GAN image generation. And artists still have rights to their work: if they didn’t explicitly license it for the model, it’s theft that removes the value of the original.

        • liyunxiao@sh.itjust.works
          5 months ago

          There’s a pretty clear difference between the two. If piracy resulted in a new digital good that removes the market for the original while eliminating the jobs of the people who made it, then it’d be close. Even then, pretty much everyone agrees not all piracy is the same; you wouldn’t pirate an indie game that hasn’t sold well unless you’re an absolute piece of subhuman shit.

          • Stovetop@lemmy.world
            5 months ago

            I really enjoyed the “Hobbit: Extended Edition” project, which condensed the three films of the Hobbit trilogy down into a single film and which, as an unofficial fan-made project, is only available online for free.

            Under that proposed gradient, I’m not sure where that would fall, given that it is a transformative work which uses the work of others to make them redundant (in this case, the original trilogy and the studios which would have otherwise profited from those sales).

            I feel like there’s a better way to divide it, but it will be difficult to negotiate the exact line against two long-held, contradictory ideas: that art should be divorced from its creator once released, and that the creator is entitled to full control and profit until its copyright expires.

          • Pika@sh.itjust.works
            5 months ago

            well uh, idk how to break it to you but it kinda does.

            Piracy doesn’t equal a 1:1 lost sale; that argument is true. However, it applies to both AI and piracy, and it goes both ways.

            The more people who get something via the free method, the fewer people who /may/ have bought it via the paid method, which means less profit for the affected party.

            However, since it goes both ways, obtaining the item via the free method does not mean the person would have purchased the paid good if the free one weren’t available.

            In both cases the original market is still available, regardless of the method used.

            I strongly disagree that piracy and AI are any different, at least in the scenario you provided.

            If anything, AI holds the morally higher ground imo, as it isn’t directly taking a product; it’s making something else using other products.

            That being said, I believe CCs should be paid for training usage, but that’s a whole different argument.

            • liyunxiao@sh.itjust.works
              5 months ago

              It’s not solely about pay, but also about what your work is used for. It makes sense that you don’t understand this if you’ve never created anything, art or otherwise. If I draw a picture, I control who displays that picture and for what purpose. If someone I don’t like uses that picture without permission, it reflects poorly on me and destroys my rights.

              The easy example is an art piece by a Holocaust survivor being used by a neo-Nazi without permission.

              Now imagine you steal the work of tens of millions of artists. You know for a fact you don’t have the licenses needed to ensure their work is used to their liking.

        • liyunxiao@sh.itjust.works
          5 months ago

          Yes. Just because you disagree that your new toy is literally theft, and one of the most irresponsible inventions since leaded gasoline, doesn’t change anything.

          Sorry you’re the type of person who would have added lead shot to their gas tank after leaded gasoline was banned.

  • Fedizen@lemmy.world
    5 months ago

    It should be fineable, starting at like 500 dollars plus any profits and ad revenue, if it’s not labelled.

  • gandalf_der_12te@discuss.tchncs.de
    5 months ago

    Political posts should have a tag as well, so people can filter them out. People just use Bluesky, Pixelfed, … instead of Lemmy because of all the politics here.

  • Majorllama@lemmy.world
    5 months ago

    That might work for now, while those of us who know what to look for can still readily identify AI content, but there will be a time when nobody can tell anymore. How will we enforce the tagging then? Bad actors will always lie anyway, and some people will post AI content accidentally without knowing it’s AI.

    I think they should add a tag anyway, so those who knowingly post AI content can mark it, but I fear that in the next few years AI images and videos will be inescapable and impossible to identify reliably, even for people who are usually good at picking out altered or fake media.

    • Stovetop@lemmy.world
      5 months ago

      I’d even be worried that bad actors would abuse the tag to take legitimate footage of something they want to discredit and repost it everywhere with AI tags. They could take control of the narrative if enough people become convinced that it’s AI-generated, and use that to flip the accusations back on the people trying to spread the truth.

      • Majorllama@lemmy.world
        5 months ago

        Yeah, unfortunately bad actors ruin pretty much everything. We can do our best as a society to set things up so systems can’t be abused, but the sad reality is we just need to raise people better.

        Lying, cheating (the academic or competitive-integrity kind), and many other undesirable behaviors are part of human nature, but good parenting teaches kids not to indulge them.

  • Spiderwort@lemmy.dbzer0.com
    5 months ago

    Lots of stuff should be tagged but isn’t.

    Bad-faith arguments. Rhetoric. Information from dubious sources. Metaphors that might be taken literally… Lots of stuff.

    Yes, tags would be a great addition to common language. I’d put them right up there with emojis.

  • pleasehavemylyrics@lemmy.world
    5 months ago

    So… what you posted there could be Photoshop or AI. Happily, in this context it doesn’t matter.

    Sadly there are going to be times when fake images are taken as real.

    And the tag would help.

    But criminals gonna do crime. They will not use a tag.

  • daniskarma@lemmy.dbzer0.com
    5 months ago

    Personally, I don’t really care.

    I do enjoy some types of AI content and I do not enjoy others, same as any other type of content, so that tag would be useless for my personal preferences.

    Anyway, the NSFW tag exists not for moral reasons but to keep those images from showing up in environments where they aren’t appropriate (basically so it doesn’t look like you’re watching porn at work). That rationale makes no sense for AI content, so I don’t see the point beyond some kind of persecution driven by a particular ideology, and I don’t support it.

  • Spiderwort@lemmy.dbzer0.com
    5 months ago

    Maybe all digital content just shouldn’t be trusted. It’s like some kind of demon realm or something: navigable by the wise, but perilous for common fools like you and me. Full of illusion.

  • Theonetheycall1845@lemmy.world
    5 months ago

    I am in complete agreement with this. While you can currently tell what’s AI, it won’t be long before we’re scratching our heads wondering which way is up and which way is down. Hell, I saw an AI-generated video of a cat cooking food. It looked sort of real.