Luckily, I have local AI.
And you should too!
With these ram prices?
Rather live without ai
I run it on the hardware I have, the data stays with me, offline. I run it on hardware I already have for other purposes, and I even have a portable solar panel (but I use the plug socket).
Doesn’t AI need like 96 gigs of RAM to be comparable in quality (or lack thereof, depending on how you view it) to the commercial options?
An 8B Qwen model, for example. It’s limited and can barely “reason” at all, but it does give well-structured answers. You can extract references from it, up to a point.
Qwen3 30B-A3B, for example, is brilliant for its size, and I can run it on my 8 GB VRAM + 32 GB RAM system at around 20 tokens per second. For lower-powered systems, Qwen3 4B plus a search tool is also insanely good for its size and fits in less than 3 GB of RAM or VRAM at Q5 quantization.
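As a rough sanity check on those memory numbers: the file size of a quantized model is approximately parameter count times bits per weight. Q5-family quants average somewhere around 5.5 bits per weight (an approximation; KV cache and runtime overhead come on top). A back-of-envelope sketch:

```python
# Rough quantized-model size estimate: params * bits-per-weight / 8.
# The 5.5 bits/weight figure for Q5 is an approximation, and this
# ignores KV cache and runtime overhead, so treat it as a lower bound.
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

print(model_size_gb(4, 5.5))   # Qwen3 4B at ~Q5: 2.75 GB, under the 3 GB mark
print(model_size_gb(30, 5.5))  # Qwen3 30B at ~Q5: ~20.6 GB, hence VRAM + RAM split
```

This is why the 4B model squeezes under 3 GB while the 30B one needs system RAM to spill into.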
What for? I can’t think of a single problem I have in my life where the answer is AI.
I am very forgetful, and googling takes forever. So if the answer doesn’t sound like BS, I just accept it. If the stakes are higher, I google its references.
???
I classify invoices with a local AI, badly. It does a poor job, but hey, it’s still something I don’t have to do myself.