Not sure if this goes here or if this post will get hated on, but I want to host AI like LLMs and ComfyUI's newer models locally, and I'm not sure what type of setup or parts would work best on a fairly slim budget. I'm also not sure if now is the right time to buy, with inflation and such.
I don't have a price in mind yet, but I'm wondering how much it would cost and what parts I might need.
If you have any questions or concerns please leave a comment.


If your focus is LLMs, get an RTX 3090. VRAM is the most important factor here because it determines which models you can load and run at a decent speed. Having 24 GB lets you run the mid-range models; many of them specifically target that amount of memory because it's such a standard amount for hobbyists to have. Those mid-range models are viable for coding; the smaller ones are less so. Looking at prices, it seems like you can get this card for $1-2k depending on whether you go used or refurbished. I don't know if better price options are coming, but with the RAM shortage and huge general demand, it doesn't seem like it.
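As a rough way to see why 24 GB is the sweet spot, here's a back-of-the-envelope sketch. The parameter counts, the 4-bit quantization, and the 1.2x overhead factor are my own assumptions for illustration, not exact figures; real usage depends on the runtime, context length, and KV cache:

```python
# Rough VRAM estimate for loading a quantized LLM.
# Assumptions (mine, not gospel): weights dominate memory use,
# and a ~1.2x fudge factor covers KV cache and runtime overhead.

def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                     overhead_factor: float = 1.2) -> float:
    """Approximate GB of VRAM needed to run a model.

    params_billion: parameter count in billions (e.g. 7, 13, 70)
    bits_per_weight: quantization level (16 = fp16, 8, 4, ...)
    """
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead_factor / 1e9

if __name__ == "__main__":
    vram_budget = 24  # a 3090's VRAM in GB
    for params in (7, 13, 34, 70):
        need = estimate_vram_gb(params, bits_per_weight=4)
        verdict = "fits" if need <= vram_budget else "does NOT fit"
        print(f"{params}B @ 4-bit: ~{need:.1f} GB -> {verdict} in {vram_budget} GB")
```

Running this shows a ~34B model at 4-bit landing around 20 GB (fits), while a 70B needs roughly 42 GB (doesn't), which is why so many popular quantized releases cluster just under the 24 GB line.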
If you want to focus on image or video generation instead, I understand there are advantages to going with newer-generation cards, since certain features and raw speed matter more than just VRAM, but I know less about that side of things.