• Kid_Thunder@kbin.social
    6 months ago

    It’s already here. I run AI models on my GPU, with training data from various sources, for both search/GPT-style chat and images. You can do this basically point-and-click with GPT4All, which integrates a chat client and lets you select from popular AI models without needing to know how anything works or how to use the CLI. It gives you a ChatGPT-like experience offline, using your GPU if it has enough VRAM for the model you’re running, or falling back to your CPU if it doesn’t. I don’t think it does images, but there are other projects out there that simplify running that on your own hardware too.

    • funkajunk@lemm.ee
      6 months ago

      I meant for mobile tech: running your own personal AI on your phone.

    • cybersandwich@lemmy.world
      6 months ago

      The M-series Macs with unified memory and ML cores are insanely powerful, and much more flexible because your 32 GB of system memory also serves as GPU VRAM.