- cross-posted to:
- technology@lemmy.world
I would kill to run my models on my own AMD Linux server.
Does GPT4all not allow that? Or do you have specific other models?
I haven’t looked into it much, but I’m not interested in playing the GPU game against the gamers, so if AMD could do a Tesla equivalent with gobs of RAM and no display hardware I’d be all about it.
Right now it’s looking like I’m going to build a server with a pair of K80s off eBay for a hundred bucks, which will give me 48GB of VRAM to run models in.
Some of the LLMs it ships with are very reasonably sized and still impressive. I can run them on a laptop with 32GB of RAM.
This is very interesting! Thanks for the link. I’ll dig into this when I manage to have some time.