I’m running AI on an old 1080 Ti. You can run AI on almost anything, but the less memory you have, the smaller (i.e. dumber) your models will have to be.
As for the “how”, I use Ollama and Open WebUI. It’s pretty easy to set up.
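If anyone wants a sketch, the setup is roughly this (commands paraphrased from Ollama’s and Open WebUI’s install docs; the model name and port are just examples, so check the current docs before copy-pasting):

```shell
# Install Ollama (Linux) and pull a model sized for your VRAM
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.2   # small model; fits easily in a 1080 Ti's 11 GB

# Run Open WebUI in Docker, pointed at the local Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# Then open http://localhost:3000 in a browser
```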
In Bill Lumbergh’s voice: If every American could just chip in $100k that would be greeaat.
My apps work fine, the UI is the same as stock, and I have no bloatware or AI crap invading every part of my phone.