  • Juice@midwest.social · 8 months ago

    Does anyone else hear that? It’s the world’s smallest AI violin playing the saddest song, composed by an AI.

  • patacon_pisao@lemmy.world · 8 months ago

    Wow, I just chatted with a coworker about AI, and I told them it was crazy how it uses copyrighted content to create something supposedly “new,” and they said, “Well, how would we train the AI without it?” I don’t think we should sacrifice copyright law and originality for the sake of improving profits, even as they tell us it’s only to “improve the experience.”

  • Babalugats@lemmy.world · 8 months ago

    If they get this, I’m gonna make a fortune ripping the copyright protection off stuff so that I can sell products as my own.

  • PixeIOrange@lemmy.world · 8 months ago

    “because it’s supposedly ‘impossible’ for the company to train its artificial intelligence models — and continue growing its multi-billion-dollar business — without them.”

    Oh no! The poor rich can’t get richer fast enough :(

  • xelar@lemmy.ml · 8 months ago

    Unregulated areas lead to these kinds of business practices, where people squeeze every drop out of the opportunity. The cost of these activities will be passed on to the taxpayers.

  • Petter1@lemm.ee · 8 months ago

    That’s gonna be fun if they manage to make movie-making AI and suddenly all the actors appear in the resulting content. Big money vs. big money 😮

  • SankaraStone@lemmy.world · 8 months ago

    Isn’t copyright about the right to make and distribute or sell copies, or the lack thereof? As long as they can prevent jailbreaking the AI, reading copyrighted material and learning from it to produce something else is not a copyright violation.

  • 5paceThunder@lemmy.ca · 8 months ago

    If OpenAI gets to use copyrighted content for free, then so should everyone else.

    If that happens, there’s no point in making anything, since your stuff will get stolen anyway.

    • bitwaba@lemmy.world · 8 months ago

      I’m okay with it if they adopt some kind of open-source, GPL-style license for the copyrighted material: you can use all the material in the world to train your model, but you can’t sell your model for money if it was trained on copyrighted material.

        • Rekorse@sh.itjust.works · 8 months ago

          To clarify, you can sell GPL-licensed programs, but any GPL-licensed software is inherently worth $0, because the first person who buys it can then give it away for free.

    • Dkarma@lemmy.world · 8 months ago

      Everyone else does. Name one thing you have to pay for to view on the internet…lmfao

    • floofloof@lemmy.ca · 8 months ago

      If that happens, there’s no point in making anything, since your stuff will get stolen anyway

      From a capitalist’s point of view, yes, but we need a society that enables people to act from incentives other than making money. And there are plenty of other reasons to make things.

    • average_joe@lemmynsfw.com · 8 months ago

      Yes, people turned a blind eye toward OpenAI because they were supposedly an “open” non-profit working for the benefit of humanity. They attracted tons of top talent and investment because of this. The crazy part is that they knew they weren’t going to keep it non-profit, based on the internal chats revealed in Elon Musk’s lawsuit. They duped the whole world and are now just trying to make as much money as possible.

      In my opinion, if they were to release these models openly so that everyone had an equal opportunity to benefit from their research (just like the previous research their current work is based on), they could be excused, but this for-profit business model is the main problem.

  • Phoenixz@lemmy.ca · 8 months ago

    Well, alright then, that means you have the wrong business model. Sucks to be you. NEXT.

  • Redruth@feddit.nl · 8 months ago

    It’s a threat. They hold the biggest players to ransom by demanding exclusive contracts; otherwise the business goes to China, Russia, and Qatar.

  • mm_maybe@sh.itjust.works · 8 months ago

    What irks me most about this claim from OpenAI and others in the AI industry is that it’s not based on any real evidence. Nobody has tested the counterfactual approach he claims wouldn’t work, yet the experiments that came closest (the first StarCoder LLM and the CommonCanvas text-to-image model) suggest that it would, in fact, have been possible to produce something very nearly as useful, and in some ways better, with a more restrained training-data curation approach than scraping outbound Reddit links.

    All that aside, copyright clearly isn’t the right framework for understanding why what OpenAI does bothers people so much. It’s really about “data dignity,” a relatively new moral principle not yet protected by any single law. Most people feel that they should have control over what data is gathered about their activities online, as well as what is done with that data after it’s been collected, and even if they publish or post something under a Creative Commons license that permits derived uses of their work, they’ll still get upset if it’s used as an input to machine learning. This is true even if the generative models thereby created are not made for commercial reasons but only for personal or educational purposes that clearly constitute fair use. I’m not saying that OpenAI’s use of copyrighted work is fair; I’m just saying that even in cases where the use is clearly fair, there’s still a perceived moral injury, so I don’t think it’s wise to lean too heavily on copyright law if we want to find a path forward that feels just.