Not sure if this goes here or if this post will be downvoted, but I want to host AI like LLMs and ComfyUI's newer models locally. I'm not sure what type of setup or parts would work best on a slim budget, and I'm also not sure if now is the right time to buy with inflation and such.
I don't have a price in mind yet, but I'm wondering how much it would cost and what parts I may need.
If you have any questions or concerns, please leave a comment.


I was using an Nvidia 3060 for a while, then had two in one box, then switched to a 3090.
The amount of VRAM is a big factor for decent performance. Getting it to not sound like a predictably repetitive bot, though, is a whole separate thing that's still kind of elusive.
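As a rough rule of thumb (the bits-per-weight and overhead figures here are my assumptions for a typical Q4-ish GGUF quant, not exact numbers), you can ballpark how much VRAM a model needs like this:

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: float = 4.5,
                     overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate for a quantized LLM.

    bits_per_weight ~4.5 approximates a Q4_K_M-style quant;
    overhead_gb covers KV cache and runtime buffers (assumed,
    and it grows with context length).
    """
    weights_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# e.g. a 13B model at ~4.5 bits/weight lands around 8-9 GB,
# which is why a 12 GB 3060 handles it but bigger models want a 3090.
for size in (7, 13, 33, 70):
    print(f"{size}B -> ~{estimate_vram_gb(size):.1f} GB")
```

That's why people fixate on VRAM first: it decides which models you can load at all, before speed even enters the picture.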
Do multiple GPUs help? I could get a cheap 970 to toss in my rig.
My go-to for messing with chatbots is Kobold, which will let you split the work between multiple GPUs. I get the impression the actual processing is only done on one, but it lets you load larger models with the extra memory.
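If you'd rather script the same idea than use the Kobold UI, here's a minimal sketch using llama-cpp-python (which wraps the same llama.cpp backend KoboldCpp builds on); the model path and split ratio are illustrative assumptions, not a recommendation:

```python
from llama_cpp import Llama

# Hypothetical model path; swap in whatever GGUF file you actually have.
llm = Llama(
    model_path="models/llama-2-13b-chat.Q4_K_M.gguf",
    n_gpu_layers=-1,           # offload all layers to GPU
    # Fraction of the model placed on each card: here ~3/4 on the
    # big GPU and ~1/4 on the smaller second one (assumed ratio --
    # you'd tune this to match each card's free VRAM).
    tensor_split=[0.75, 0.25],
    n_ctx=4096,                # context window; larger = more VRAM
)

out = llm("Q: What is the capital of France? A:", max_tokens=32)
print(out["choices"][0]["text"])
```

This matches what you're describing: the layers get spread across both cards' memory so bigger models fit, even though tokens are still generated sequentially, so you shouldn't expect it to be faster than one card that could hold the whole model.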