• RepleteLocum@lemmy.blahaj.zone
    10 days ago

    The thing that takes inputs, gargles them together without thought, and spits them out again can’t be intelligent. It’s literally not capable of it. Now if you were to replicate the brain, sure, you could probably create something kinda “smart”. But we don’t know shit about our brain, evolution took millions of years, and humans are still insanely flawed.

    • Telodzrum@lemmy.world
      10 days ago

      Yup, AGI is terrifying; luckily it’s a few centuries off. The parlor-trick text predictor we have now is just bad for the environment and the economy.

      • sugar_in_your_tea@sh.itjust.works
        9 days ago

        Eh, probably not a few centuries. It could be, IDK, but I don’t think it makes sense to quantify it like that.

        We’re a few major breakthroughs away, and breakthroughs generally don’t happen all at once; they’re usually the product of tons of minor breakthroughs. If we put everyone and their dog into R&D, we could dramatically increase the production of minor breakthroughs, and thereby reduce the time to AGI, but we aren’t doing that.

        So yeah, maybe centuries, maybe decades, IDK. It’s hard to estimate the pace of research and what new obstacles we’ll find along the way that will need their own breakthroughs.

        • Telodzrum@lemmy.world
          9 days ago

          We’re a few major breakthroughs away

          We are dozens of world-changing breakthroughs in the understanding of consciousness, sapience, and sentience, and even more in computer and electrical engineering, away from being able to even understand what the final product of an AGI development program would look like.

          We are not anywhere near close to AGI.