Not sure if this is the right place, or if this post will be unpopular, but I want to host AI locally (LLMs and the newer ComfyUI models) and I'm not sure what type of setup or parts would work best on a fairly slim budget. I'm also not sure whether now is the right time to buy, with inflation and such.

I don't have a price in mind yet, but I'm wondering roughly how much it would cost and what parts I might need.
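For LLMs specifically, one rough way to size the GPU is to estimate the model's memory footprint from its parameter count and quantization level. This is only a back-of-envelope sketch; the ~20% overhead factor for KV cache and activations is an assumption, and real usage varies by runtime and context length:

```python
def vram_estimate_gb(params_billion, bits_per_weight, overhead_factor=1.2):
    """Rough VRAM needed to load a model: weights plus ~20% overhead (assumed)."""
    weight_bytes = params_billion * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead_factor / 1e9

# Common model sizes at 4-bit quantization (a typical budget setup)
for name, params in [("7B", 7), ("13B", 13), ("70B", 70)]:
    print(f"{name} @ 4-bit: ~{vram_estimate_gb(params, 4):.1f} GB VRAM")
```

By this estimate a 7B model at 4-bit needs roughly 4 GB of VRAM and a 13B roughly 8 GB, which is why used 12–24 GB cards come up so often in budget build threads.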

If you have any questions or concerns, please leave a comment.

  • paper_moon@lemmy.world · 13 hours ago

    I wonder if, when generating the price estimate, it took into account all the hikes in RAM pricing that it itself is causing… 🤔

    Stupid fucking AI data centers…