I realize I need to upgrade my little NUC to something bigger for faster inference of larger llama models. I want something you can still keep on your living room’s TV bench, so no monster rack please, but that also has the necessary muscle for llama when needed. Budget doesn’t matter right now; I want to understand what’s good and what’s out there. Thanks

EDIT: Wow, thanks for the inspiration, I guess I need to look a bit into “how to stuff a huge graphics card into a mini box”. To clarify a bit more what I want with it: I want to build a responsive personal assistant. I am dreaming of models bigger than 8B and good tool calling for things like memory, web search etc.; no coding, image generation or video generation required. Image recognition would be good but not a must. Regarding footprint, no monster ;) Something that you can have in your living room and that could be wife approved - so no big gaming rig with exhaust pipes and stuff, it needs to be good looking ;)
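
For anyone wondering what “tool calling for things like memory, web search” looks like in practice, here is a minimal sketch against a local OpenAI-compatible server (llama.cpp’s llama-server and Ollama both expose one). The port, the model name and the web_search tool are placeholder assumptions for illustration, not recommendations:

```python
# Minimal tool-calling sketch against a local OpenAI-compatible endpoint.
# Assumptions: a llama.cpp/Ollama-style server on localhost:8080 and a model
# trained for tool use; "web_search" is a made-up example tool.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="not-needed")

tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web and return the top results as text.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

resp = client.chat.completions.create(
    model="qwen2.5-14b-instruct",  # placeholder; any >8B tool-capable model
    messages=[{"role": "user", "content": "What’s the weather in Berlin?"}],
    tools=tools,
)

# If the model decided to call a tool, the call shows up here instead of text.
for call in resp.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```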

  • zergtoshi@lemmy.world · 9 hours ago

    Wouldn’t an AMD RX 9060 XT with 16 GB of VRAM be nice as well if you’re hunting for good speed/cost options?

    • TheHolm@aussie.zone · 47 minutes ago

      Probably. It’s just not as fast as the 9070 XT. I’m using a 9070 XT myself, and the limitation for running LLMs is memory, not speed. If the model fits in memory, it runs fast enough to be practical.
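
      Since “does the model fit in memory” is the whole game, a rough back-of-the-envelope estimate helps: weights take about parameters × bits-per-weight / 8 bytes, plus headroom for the KV cache and runtime overhead. A quick sketch (the 20 % overhead factor is a rule of thumb, not a measurement):

      ```python
      # Rough VRAM estimate: weights = params * bits / 8, plus ~20% headroom
      # for KV cache and runtime overhead (a guess, not a measurement).
      def fits_in_vram(params_billion: float, bits_per_weight: float,
                       vram_gb: float, overhead: float = 1.2) -> bool:
          weights_gb = params_billion * bits_per_weight / 8
          return weights_gb * overhead <= vram_gb

      # e.g. a 14B model at ~4.5 bits/weight (Q4 quant) on a 16 GB card:
      print(fits_in_vram(14, 4.5, 16))  # weights ~7.9 GB, x1.2 ~9.5 GB -> True
      ```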