Pro@programming.dev to Technology@lemmy.world · English · 30 days ago
Google quietly released an app that lets you download and run AI models locally (github.com)
You can use it in termux

Greg Clarke@lemmy.ca · English · 29 days ago
Has this actually been done? If so, I assume it would only be able to use the CPU.

Euphoma@lemmy.ml · English · 29 days ago
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but I don't know for sure.
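To make the Termux route concrete: once Ollama is installed from the Termux package repo and started with `ollama serve`, it exposes its usual HTTP API on localhost port 11434, so a downloaded model can be queried from any local script. The sketch below is only an illustration under that assumption; the `gemma2:2b` model tag and the prompt are placeholders, and on a phone the generation will be CPU-bound, as noted above.

```python
import json
import urllib.request

# Ask a locally running Ollama server (default port 11434) to generate text.
# Assumes a model has already been pulled, e.g. with `ollama pull gemma2:2b`;
# the model name here is a placeholder -- use whatever you downloaded.
payload = {
    "model": "gemma2:2b",
    "prompt": "Explain what Termux is in one sentence.",
    "stream": False,  # return one JSON object instead of a streamed response
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])  # the generated text
```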