• GrymEdm@lemmy.worldOP
    8 months ago

    I 100% agree the genie is out of the bottle. People who want to walk back this change are not dealing with reality. AI and robotics are so valuable I very much doubt there’s even any point in talking about slowing it down. All that’s left now is to figure out how to use the good and deal with the bad - likely on a timeline of months to maybe one or two years.

    • _NoName_@lemmy.ml
      8 months ago

      That timeline for dealing with the bad looks incredibly optimistic. I imagine new issues will likely keep cropping up as well, which we’ll also have to address.

      • GrymEdm@lemmy.worldOP
        8 months ago

        I agree. I’m talking about how quickly we’ll have strategies in place to deal with it, not how quickly we’ll have it all figured out. My guess is we have at best a year before it’s a huge issue, and I agree with your take that distinguishing human vs. AI content etc. is going to be an ongoing thing. Perhaps until AI gets so good it ceases to matter as much, because the output will be functionally the same.

    • Adanisi@lemmy.zip
      8 months ago

      I’m personally waiting for legal cases to do with the use of AI trained on code, and whether the licenses apply to it.

      If they don’t, our GPL becomes almost useless because code can be laundered through AI, but at the same time we can begin using AI trained on code whose license terms we don’t necessarily abide by (maybe even decomps, I don’t know how it’ll go). Fight fire with fire and all. So I’d maybe look into that.

      If they do, then I’ll probably still use it, but mainly with permissively licensed code and code also under the GPL (as I use the GPL).

      And in both cases, they’d be local models, not “cLoUd” models run by the likes of M$.

      Until then, I’m not touching it.