Don’t learn to code: Nvidia’s founder Jensen Huang advises a different career path

Don’t learn to code, advises Jensen Huang of Nvidia. Thanks to AI, everybody will soon become a capable programmer simply by using human language.

  • Wooki@lemmy.world · 10 months ago

    This overglorified snake oil salesman is scared.

    Anyone who understands how these models work can see, plain as day, that we have reached peak LLM. It’s now enshittifying itself, and we are seeing its decline in real time in the quality of generated content. Don’t believe me? Go follow some senior engineers.

      • Wooki@lemmy.world · 10 months ago

        You asked a question that has already been answered. Pick your platform and you will find plenty of public research on the topic, especially for programming.

      • Wooki@lemmy.world · 10 months ago

        The Fediverse is sadly not as popular as we would like, so I can’t help here. That said, I follow some researchers’ blogs, and a quick search should land you some good sources depending on your field of interest.

      • thirteene@lemmy.world · 10 months ago

        There is a reason they didn’t offer specific examples. LLMs can still scale by model size, logical optimization, training optimization, and, more importantly, integration. The current implementation is reaching its limits, but growth is also happening very quickly. AI reduces workload, but it is likely going to require designers and validators for a long time.

        • Wooki@lemmy.world · 10 months ago

          For sure, evidence is mounting that increasing model size is not returning the quality expected. It has also had the larger net effect of enshittifying itself through negative feedback loops, from training data to humans and back into training, quantified as a large declining trend in quality. It can only get worse as privacy laws, IP laws, and other regulations start coming into place. The growth this hype master is selling is pure fiction.

          • msage@programming.dev · 10 months ago

            But he has a lot of product to sell.

            And companies will gobble it all up.

            On an unrelated note, I will never own a new graphics card.

            • Wooki@lemmy.world · 10 months ago

              Secondhand is better value; the current cost of new cards is nothing short of price fixing. You only need to look at the reduction in memory size since the A100 was released to know what’s happening to GPUs.

              We need serious competition. Hopefully Intel is able to provide it, but foreign competition would be best.

              • msage@programming.dev · 10 months ago

                I doubt that any serious competitor will bring change to this space. Why would it? Everyone will scream “shut up and take my money”.