• WorldsDumbestMan@lemmy.today
      21 hours ago

      I run it on hardware I already have for other purposes, so the data stays with me, offline. I even have a portable solar panel (though I just use the wall socket).

      • WamGams@lemmy.ca
        18 hours ago

        Doesn’t AI need like 96 gigs of RAM to be comparable in quality (or lack thereof, depending on how you view it) to the commercial options?

        • WorldsDumbestMan@lemmy.today
          7 hours ago

          Qwen 8B, for example. It’s limited and can barely “reason” at all, but it does give well-structured answers. You can extract references from it, up to a point.

        • Xylight@feddit.online
          18 hours ago

          Qwen3 30B-A3B, for example, is brilliant for its size, and I can run it on my 8 GB VRAM + 32 GB RAM system at around 20 tokens per second. For lower-powered systems, Qwen3 4B plus a search tool is also insanely good for its size and can fit in less than 3 GB of RAM or VRAM at Q5 quantization.
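          A rough back-of-the-envelope check on those memory figures (a sketch only: the ~5.5 bits-per-weight value for Q5-style quantization is an approximation that accounts for block scales, and a real load also needs KV cache and runtime overhead on top of the weights):

          ```python
          # Rough estimate of weight memory for a quantized LLM.
          # Assumption: Q5-style quantization averages ~5.5 bits per weight
          # (the block scale metadata adds overhead beyond the raw 5 bits).

          def weight_memory_gb(params_billion: float, bits_per_weight: float) -> float:
              """Approximate weight storage in decimal GB for a quantized model."""
              return params_billion * 1e9 * bits_per_weight / 8 / 1e9

          # Qwen3 4B at ~5.5 bits/weight comes out to about 2.75 GB,
          # consistent with the "less than 3 GB" figure above.
          print(f"{weight_memory_gb(4, 5.5):.2f} GB")  # → 2.75 GB
          ```

          The same arithmetic explains why a 30B-A3B mixture-of-experts model still needs the full weight set resident (RAM + VRAM combined) even though only ~3B parameters are active per token.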