Not sure if this goes here or if this post will be hated on, but I want to host AI like LLMs and ComfyUI's newer models locally. I'm not sure what type of setup or parts would work best on a slim budget, and I'm not sure if now is even the time to buy, with inflation and such.

I don't have a price in mind yet, but I'm wondering how much it would cost and what parts I may need.

If you have any questions or concerns please leave a comment.

  • ikidd@lemmy.world
    4 hours ago

    I've really been mulling over one of those 128GB machines. I'm on Claude Max and the $50 Cerebras plan, so I'm using a good amount of my $200/mo for coding and Openclaw. Is it worth it for light coding, or are you only doing SD with it?

    • panda_abyss@lemmy.ca
      2 hours ago

      It would not be worth it as a replacement for Claude.

      80% of my issue is that it's AMD and their drivers are still awful. The other 20% is that token generation speed is very slow, especially compared to commercial models running on dedicated hardware. MoE models are fine, but dense models are too slow for meaningful workflows. ComfyUI is decent, but I'm not seriously into image gen.
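      For anyone wondering why MoE models stay usable while dense ones don't: single-token decoding is roughly memory-bandwidth-bound, so you can ballpark tokens/sec by dividing memory bandwidth by the bytes of weights read per token. An MoE model only reads its active experts' weights each token. A rough sketch (all numbers below are illustrative assumptions, not real hardware specs):

      ```python
      # Back-of-envelope decode speed: tokens/sec ~ memory bandwidth divided
      # by bytes of weights read per generated token. Hypothetical numbers.

      def tokens_per_sec(bandwidth_gbs: float, active_params_b: float,
                         bytes_per_param: float = 0.5) -> float:
          """bandwidth in GB/s, active params in billions; 0.5 bytes/param
          approximates a 4-bit quant."""
          bytes_per_token = active_params_b * 1e9 * bytes_per_param
          return bandwidth_gbs * 1e9 / bytes_per_token

      # Assume ~256 GB/s unified memory on a hypothetical 128GB box:
      dense = tokens_per_sec(256, 70)   # dense 70B: reads all weights per token
      moe = tokens_per_sec(256, 13)     # MoE with ~13B active params per token

      print(f"dense 70B: ~{dense:.1f} tok/s")   # single digits
      print(f"MoE 13B active: ~{moe:.1f} tok/s")  # comfortably faster
      ```

      It ignores KV-cache reads and compute, so real numbers come in lower, but the dense-vs-MoE gap it predicts matches what I see.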

      I have a lot of fun with it, but I have not been able to use it for any actual AI dev.