• PseudorandomNoise@lemmy.world · 6 months ago

    Don’t you need specific CPUs for these AI features? If so, how is this going to work on machines that don’t support them?

    • sacredbirdman@kbin.social · 6 months ago

      Nope, they can use your NPU, GPU, or CPU, whatever you have… the performance will vary quite a bit, though. Also, the larger the model, the more memory it needs to run well.
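
      For example, a rough sketch of how that fallback can look with ONNX Runtime (the model filename and the exact provider list are just placeholders, not anything from the article): you ask the runtime which execution providers this machine actually has and hand it an ordered preference list, NPU first, then GPU, then CPU.

      ```python
      import onnxruntime as ort

      preferred = [
          "QNNExecutionProvider",   # NPU backend, if this ORT build includes it
          "DmlExecutionProvider",   # GPU via DirectML on Windows
          "CUDAExecutionProvider",  # NVIDIA GPU
          "CPUExecutionProvider",   # always-available fallback
      ]
      available = ort.get_available_providers()

      session = ort.InferenceSession(
          "assistant_model.onnx",   # hypothetical local model file
          providers=[p for p in preferred if p in available],
      )
      print("Running on:", session.get_providers()[0])
      ```

      Same model file either way; only the speed changes.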

    • space@lemmy.dbzer0.com · 6 months ago

      Running AI models isn’t that resource intensive. Training the models is the difficult part.

    • lemmyvore@feddit.nl · 6 months ago

      You only need lots of processing power to train the models. Using the models can be done on regular hardware.
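
      Back-of-the-envelope illustration of why (the numbers are made up for the example; fp16 weights and an Adam-style optimizer assumed): inference only has to hold the weights, while training also holds gradients and optimizer state, several extra copies of the model before you even count activations.

      ```python
      params = 7e9                    # hypothetical 7B-parameter model
      weights_gb = params * 2 / 1e9   # fp16 weights, 2 bytes each

      print(f"inference (weights only):            ~{weights_gb:.0f} GB")
      print(f"training (+ grads + optimizer, ~4x): ~{weights_gb * 4:.0f} GB")
      ```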

    • elliot_crane@lemmy.world · 6 months ago

      With it being local, it’s probably a small and limited model. I took a couple of courses on machine learning years ago (before it got rebranded as “AI”), and you’d be surprised at how well a basic image recognition model can run on the lowest-spec MacBook from 2012.
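
      Something like this is all it takes to classify an image on a plain CPU (a sketch assuming torch and torchvision are installed, with a random tensor standing in for a real photo):

      ```python
      import torch
      from torchvision.models import mobilenet_v3_small, MobileNet_V3_Small_Weights

      weights = MobileNet_V3_Small_Weights.DEFAULT
      model = mobilenet_v3_small(weights=weights).eval()

      fake_image = torch.rand(1, 3, 224, 224)   # one 224x224 RGB "image"
      with torch.inference_mode():
          logits = model(fake_image)

      top = logits.softmax(dim=1).argmax().item()
      print("Predicted class:", weights.meta["categories"][top])
      ```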

      • ferret@sh.itjust.works · 6 months ago

        Tbh the inversion of the usual intuition (LLMs need orders of magnitude more memory than computer vision models) can really throw off people unfamiliar with the field when they estimate the hardware required.
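
        Quick illustration of the gap, counting weights only (ballpark sizes, not figures from this thread, and ignoring activations and the LLM’s KV cache):

        ```python
        resnet50_params = 25.6e6   # classic image-recognition network
        llm_params      = 7e9      # small local LLM

        print(f"ResNet-50 @ fp32: ~{resnet50_params * 4 / 1e6:.0f} MB")   # ~102 MB
        print(f"7B LLM    @ fp16: ~{llm_params * 2 / 1e9:.0f} GB")        # ~14 GB
        ```

        Roughly two orders of magnitude apart, which is exactly the surprise.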