So I have this silly idea/longterm project of wanting to run a server on renewables on my farm. And I would like to reuse the heat generated by the server, for example to heat a grow room, or simply my house. How much heat does a server produce, and where would you consider it best applied? Has anyone built such a thing?

  • dillekant@slrpnk.net · 7 months ago

    I do this. If you actually want to use or donate the processing power, it's kind of a good thing. However, there are a lot of downsides:

    • Computers generally draw much less power than a dedicated heater, which makes them very slow to “react” to heating needs. Warming a small room, even with a 500W PC, could take an hour or more.
    • Heaters have a thermostat; computers don’t. So on top of being laggy, they don’t stop heating once the temperature is right, which means they can overshoot and make the room uncomfortably hot.
    • You could set up an external thermostat, but then you need a load that can be switched on and off.
    • I was using folding@home, but its work items take a long time, and switching the machine on and off delays their completion. The system could then get annoyed and hand the work item to someone else’s computer to resolve it faster, or worse, blacklist your computer.
    • Using your PC to generate heat eats into its lifetime. The fans aren’t built to run at max speed all the time, the CPU & GPU can wear out, and the power-delivery components degrade too. You have to weigh that wear against the value of the computation: it’s likely fine if you see the work as a donation or you have important stuff to compute, but it’s probably not worth burning cycles for heat alone.
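
    To put a rough number on the slowness point: essentially all of a PC’s electrical input ends up as heat, so a 500W machine is a 500W heater. A back-of-the-envelope sketch (all numbers below are illustrative assumptions, not measurements) shows why that takes hours to matter in a room:

    ```python
    # Rough energy-balance sketch of why a 500 W PC heats a room slowly.
    # Every constant here is an assumed, illustrative value.

    POWER_W = 500.0               # electrical draw; ~all of it becomes heat
    ROOM_CAPACITY_J_PER_K = 2e6   # assumed effective thermal mass: air + walls + furniture
    TARGET_RISE_K = 2.0           # desired temperature increase

    energy_needed_j = ROOM_CAPACITY_J_PER_K * TARGET_RISE_K  # 4 MJ
    hours = energy_needed_j / POWER_W / 3600                 # ignoring heat loss entirely

    print(f"~{hours:.1f} hours to raise the room {TARGET_RISE_K:.0f} K")
    # → ~2.2 hours to raise the room 2 K
    ```

    Real rooms also leak heat through walls and windows, so the actual time is longer still, and a space with high heat loss may never reach the target at 500W.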
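
    The external-thermostat idea above can be sketched in software. This is a hypothetical hysteresis (bang-bang) controller, not a real folding@home interface: the sensor reading and whatever starts/stops the load are placeholders you would wire up yourself, and a wide band keeps switching infrequent, which matters given the work-item churn described above.

    ```python
    # Hypothetical software thermostat with hysteresis for gating a compute load.
    # How you read temperature and pause/resume the load is up to your setup.

    class Thermostat:
        """Bang-bang controller: wide hysteresis band avoids rapid on/off cycling."""

        def __init__(self, setpoint_c: float, band_c: float = 1.0):
            self.setpoint = setpoint_c
            self.band = band_c
            self.load_on = False  # start with the compute load paused

        def update(self, temp_c: float) -> bool:
            # Turn the load on when clearly below the setpoint,
            # off when clearly above it; hold the current state in between.
            if temp_c <= self.setpoint - self.band:
                self.load_on = True
            elif temp_c >= self.setpoint + self.band:
                self.load_on = False
            return self.load_on


    t = Thermostat(setpoint_c=20.0, band_c=1.0)
    print(t.update(18.0))  # → True  (cold: start the load)
    print(t.update(20.5))  # → True  (inside the band: keep running)
    print(t.update(21.5))  # → False (too warm: pause the load)
    ```

    In a real setup, `update()` would drive whatever pause/resume mechanism your compute client offers, polled every few minutes rather than every second.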