

25 days ago
I’m using Ollama to run my LLMs. Going to see about using it for my Twitch chat bot too.
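For anyone curious what wiring a chat bot to Ollama looks like: a minimal sketch of calling Ollama's local `/api/generate` endpoint from Python, assuming the server is running on its default port (11434) and a model like `llama3` has already been pulled. The model name and prompt here are placeholders, not anything from the thread.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs `ollama serve` running and the model pulled, e.g. `ollama pull llama3`):
#   print(ask_ollama("llama3", "Say hi to Twitch chat in one sentence."))
```

A Twitch bot would call something like `ask_ollama` on each incoming chat message; the stdlib-only HTTP call keeps the sketch dependency-free, though in practice people often use the `ollama` Python package instead.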
My understanding is that DeepSeek still used Nvidia hardware, just older models used far more efficiently, which was remarkable. I hope to tinker with the open-source stuff, at least with a little Twitch chat bot for my streams that I was already planning to build with OpenAI. It will be even more remarkable if I can run this locally.
However, this is embarrassing for the Western companies working on AI, especially coming right after the $500B Stargate announcement, as it suggests we don’t need such high-end infrastructure to achieve the same results.
Somehow that makes it even worse, in my opinion.