The vast data centers that power artificial intelligence are so energy-hungry that they're heating up their surroundings, according to new research. It's an alarming finding given that the number of data centers is predicted to explode over the next few years.



Come to think of it, you wouldn't expect power generation to perceptibly raise the temperature of the surrounding air. I've heard of nuclear plants making bays or rivers a few degrees warmer near the plant, but I don't think even that has been described as this bad.
Putting on my ME hat: power systems are designed to extract as much energy as possible from their fuel sources. As a result, the waste heat isn't significantly above ambient temperature, because they've already extracted about as much from it as they can.
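A back-of-envelope sketch of the power-plant case mentioned above. All the numbers here (plant output, thermal efficiency, cooling water flow) are illustrative assumptions, not measured values, but they show why the "few degrees warmer" observation is plausible:

```python
def cooling_water_temp_rise(electric_output_mw, efficiency, water_flow_m3_s):
    """Temperature rise (deg C) of once-through cooling water at a power plant."""
    # Waste heat: everything in the fuel's thermal input that didn't become electricity.
    waste_heat_w = electric_output_mw * 1e6 * (1 / efficiency - 1)
    c_p = 4186.0                      # J/(kg*K), specific heat of water
    rho = 1000.0                      # kg/m^3, density of water
    m_dot = water_flow_m3_s * rho     # kg/s, mass flow of cooling water
    # Q = m_dot * c_p * dT  ->  dT = Q / (m_dot * c_p)
    return waste_heat_w / (m_dot * c_p)

# Assumed: a ~1 GW(e) plant at ~33% thermal efficiency moving ~45 m^3/s of
# cooling water. That gives a rise of roughly 10 deg C at the discharge pipe,
# which then mixes down to "a few degrees" in the receiving bay or river.
print(round(cooling_water_temp_rise(1000, 0.33, 45), 1))
```

The same arithmetic makes the comment's main point: the waste heat is huge in absolute terms, but it's low-grade, barely above ambient, which is exactly why it's hard to do anything useful with it.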
I assume many high-end servers are now water cooled, which probably improves cooling performance compared to the older approach of aisle cooling with hot and cold sides of the rack.
But unlike power stations, I don't think data centers have the incentive or the ability to extract much useful work from their waste heat, so I'm sure they're dumping it as fast as they can. On top of that, the waste heat from all the secondary systems (cooling, water pumps, etc.) just compounds the local air temperatures.
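The "dumping it all plus the secondary systems" point can be sketched with the PUE (power usage effectiveness) metric: essentially every watt a data center draws ends up as heat, and the cooling/pump overhead adds to the IT load rather than removing heat from the neighborhood. The load and PUE figures here are hypothetical:

```python
def total_heat_rejected_mw(it_load_mw, pue):
    """Heat a data center dumps to its surroundings, in MW.

    Nearly all electrical input becomes heat; PUE >= 1 captures the extra
    draw of cooling, pumps, and other secondary systems on top of IT load.
    """
    return it_load_mw * pue

# Assumed example: a 100 MW IT load at a PUE of 1.3 rejects about 130 MW
# of heat -- none recovered as useful work, all warming the local air.
print(total_heat_rejected_mw(100, 1.3))  # -> 130.0
```

Note the contrast with the power-plant case: a plant's waste heat is an unavoidable byproduct of generation, while a data center's "useful work" (computation) itself degrades entirely to heat, so the rejection is 100% of input plus overhead.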