This morning, the news broke that Larian Studios, developer of Baldur's Gate 3 and the just-announced Divinity, is apparently using generative AI behind the scenes. The backlash has been swift, and Larian founder and game director Swen Vincke has now responded to clarify his remarks.
Data centers typically use closed-loop cooling systems, but those still lose a bit of water each day that needs to be replaced. It’s not much relative to the size of the data center, but it’s still a non-trivial amount.
A study came out recently (it was discussed extensively on the Science VS podcast) that found a long conversation with an AI chatbot (e.g. ChatGPT) could use up to half a liter of water, in the worst-case scenario.
This statistic has been cited in the news quite a lot recently, but it’s a bad statistic: that figure includes the water used by the power plant for its own cooling. That’s typically water drawn from ponds built right alongside the power plant (your classic “cooling pond”), so it’s not as though the data centers are using 0.5 L of fresh water that could otherwise be going to people’s homes.
For reference, the data center’s own water usage is about 12% of that 0.5 L: roughly 0.06 L of water for a long chat. And remember: this is the worst-case scenario, with a very poorly engineered data center.
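Here’s a quick back-of-the-envelope sketch in Python using the figures above. The 12% share is the study’s number; the drinking-glass comparison at the end is purely an illustrative assumption for scale.

```python
# Back-of-the-envelope: how much of the headline 0.5 L is the data center itself?
# The 0.5 L total and the ~12% share come from the study discussed above.

WORST_CASE_TOTAL_L = 0.5   # headline number: one long chat, worst-case data center
DATA_CENTER_SHARE = 0.12   # ~12% of the total is the data center's own cooling water

data_center_water_l = WORST_CASE_TOTAL_L * DATA_CENTER_SHARE
power_plant_water_l = WORST_CASE_TOTAL_L - data_center_water_l

print(f"Data center cooling water per long chat: {data_center_water_l:.2f} L")  # 0.06 L
print(f"Power plant cooling water per long chat: {power_plant_water_l:.2f} L")  # 0.44 L

# For scale (illustrative assumption): a small drinking glass holds ~0.25 L.
GLASS_L = 0.25
print(f"That's about {data_center_water_l / GLASS_L:.0%} of a small glass of water")  # ~24%
```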
Another relevant stat from the study: generating images uses much less energy (and water) than chat. Generating video, however, uses an order of magnitude more than both combined.
So if you want the lowest possible energy usage from modern generative AI: use fast (low parameter count), open-source models… to generate images 👍
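If you want to try that advice in practice, here’s a minimal sketch using Hugging Face’s diffusers library. The model id "stabilityai/sd-turbo" is just one example of a compact, fast open-weight model; swap in whichever small open model you prefer.

```python
# Minimal sketch: local image generation with a small, fast open-weight model.
# Assumes the `diffusers` and `torch` packages are installed; the model id below
# is an example choice, not a recommendation from the study.
import torch
from diffusers import AutoPipelineForText2Image

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sd-turbo",
    torch_dtype=dtype,
).to(device)

# Turbo-style models are distilled to need very few denoising steps,
# which is exactly what keeps the energy cost per image low.
image = pipe(
    "a watercolor painting of a cooling pond at sunset",
    num_inference_steps=1,
    guidance_scale=0.0,
).images[0]
image.save("pond.png")
```

Fewer inference steps and fewer parameters both translate fairly directly into less compute per image, which is the whole point of reaching for a small open model here.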