Net-zero emission goals went out the window with AI.
In this world, we obey the laws of thermodynamics. I’d love to know how these 3 bottles of water are “consumed”. Because more than likely, the water is simply being used for cooling, which doesn’t consume it at all, it just makes it warmer.
Yeah the article is disingenuous at best. There are many things wrong with generative AI, but this is just a lousy approach.
If I make a PC, put in a water cooling loop, and use it to run an LLM - sure, water is circulating, but that water isn’t just vanishing lol.
My friend, you are naive at best if you think AI data centers are using closed loop water cooling. Look up evaporative cooling towers. It’s “consumed” in the sense that it is evaporated.
I specifically avoided saying they did because I wasn’t knowledgeable on the topic. But I agree, I could equally be accused of being disingenuous by phrasing it in a way that could lead people to assume they use closed loops.
I did look those up, and while evaporative cooling isn’t the only method used, it also doesn’t evaporate all the water on each pass, only a portion of it (granted, “a portion” is all I found at a quick look, which isn’t actually useful).
I do agree though, the water usage is excessive, and even though that water only “changes form”, it still removes it from a water source, and only some of it may make its way back in.
and it’s still absolute crap… the heat produced by 100 words of GPT inference is negligible - it CERTAINLY doesn’t take 3L of water evaporating to cool it
Do you have a study to publish? Because they do.
It consumes the resource of “purified, available water”, which is consumed in the sense that it is no longer purified, or no longer available (if evaporated). The same way nothing ever “consumes” energy, it just makes it unusable.
The water simply vanishes, consumed by the AI’s ever growing need for H Y D R A T I O N. /s
Wait… What? The article seems to imply that the water is consumed, but it’s referencing the water used in cooling loops.
Data centers don’t have “water cooling loops” that are anything like the ones in consumer PCs. To maximize cooling capacity, a lot of the systems use some sort of evaporative cooling that results in some of the water just floating away into the atmosphere (after which point it would need to be purified again before it could be used for human consumption).
It also seems from what I can find like some data centers just pipe in clean ambient-temperature water, use it to cool the servers, and then pipe it right back out into the municipal sewer system. Which is even more stupid, because you’re taking potable water, sending it through systems that should be pretty clean, and then mixing it with waste water. If anything, that should be considered “gray water”, which is still fine to use for things like flushing toilets.
As with everything else, we need the government to regulate it because otherwise the corporations don’t really give a shit.
I would be really surprised if anyone is cooling data centres with city water except in an emergency, that’s so unbelievably expensive (I could see water direct from a lake, though that has its own issues too). I recall saving millions just by adjusting a fill target on an evaporative cooling tower so it wouldn’t overfill (levels were really cyclic, and the targets weren’t tuned for them), and that was only a fraction of what it’d have cost if we’d used pure city water.
My work drilled water wells for evaporative cooling in their datacenter.
Water supply is a limited resource; everyone here appears to be focusing on the wrong thing. When a data center uses water in its cooling loops, that water is made inaccessible anywhere else, such as for agriculture, natural habitats, or drinking. It does not matter (directly) whether the water is technically potable or not after use. Very little water ever leaves the Earth system, yet drought exists.
Huh. I run a LLM locally on my own machine. Not looking forward to my next water bill.
Have you checked your computer’s gallons per hour? I’m thinking of getting an electric myself.
It’s almost like these “services” are an unnecessary blight that benefit only those that profit financially from them.
that is what a service always was and will continue to be: live service games, service jobs, telco service. this isn’t new, it just affects more people
We need municipal datacenters that can be integrated into the municipal water departments, and municipal electrical grid. Use the hot water to provide ‘on tap’ hot water for local businesses that need it.
Yup, way more of these kinds of solutions are needed. But data centers usually add stuff to the water to make it cool better, which makes it undrinkable. The whole ordeal just shows water and power are too cheap for these kinds of uses.
slrpnk.net needs you
The Excel spreadsheet that calculates this has so many ‘assumption’ cells.
How many bottles of water does generating a bottle of water consume? Checkmate water bottle and water bottle related statistical analysis enthusiasts.
Can anyone explain the conversion from “a bottle of water” to something like kWh?
Sorry for the delayed response, it took me a while to do the calculations but I finally figured it out:
It’s magic.
I hope this helps.
I’m not 100% down with these numbers. The Verge has a breakdown of energy usage for generation and training, and you could argue that demand is responsible for training.
I would also argue that energy usage would be directly related to water usage. Unless there is passive cooling unrelated to the energy generation, the evaporation would be directly related to the energy cost.
I didn’t collect sources while I was doing this, but I found that it takes about 0.3 kWh to generate an AI image, about the same as fully charging a smartphone.

- 1 kWh is ~860 kcal (one Calorie = kcal = 1000 calories)
- 1 image is ~258 Calories
- 1 Calorie heats 1 liter of water 1 degree
- It takes ~540 Calories to vaporize 1 liter
- So roughly 2 images vaporize a liter of water
- There are ~30k liters in an 18-foot above-ground pool with 4 ft of water

As of August 2023, people had generated almost 15.5 billion AI images, and each day sees approximately 34 million new ones. That’s ~17 million liters vaporized daily, about 500 swimming pools, which puts my numbers at ~250k swimming pools vaporized so far.
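The back-of-envelope above can be sketched in a few lines of Python. Note that every input figure (the 0.3 kWh per image, the image counts, the pool volume) is an estimate from this thread, not a measured value, and the model assumes all of that energy goes into vaporizing water:

```python
# Rough back-of-envelope: how much water would AI image generation
# evaporate if all of its cooling energy went into vaporizing water?
# All input figures are this thread's estimates, not measurements.

KWH_PER_IMAGE = 0.3            # estimated energy per AI image (kWh)
KCAL_PER_KWH = 860             # 1 kWh ~= 860 kcal
KCAL_TO_VAPORIZE_LITER = 540   # latent heat of vaporization, ~540 kcal per liter
LITERS_PER_POOL = 30_000       # ~18 ft above-ground pool, 4 ft of water
IMAGES_PER_DAY = 34e6          # estimated daily new AI images (Aug 2023)
IMAGES_TOTAL = 15.5e9          # estimated cumulative AI images (Aug 2023)

kcal_per_image = KWH_PER_IMAGE * KCAL_PER_KWH               # ~258 kcal
liters_per_image = kcal_per_image / KCAL_TO_VAPORIZE_LITER  # ~0.48 L, ~2 images/L

daily_liters = IMAGES_PER_DAY * liters_per_image            # ~16 million L/day
daily_pools = daily_liters / LITERS_PER_POOL                # ~540 pools/day
total_pools = IMAGES_TOTAL * liters_per_image / LITERS_PER_POOL

print(f"{liters_per_image:.2f} L per image")
print(f"~{daily_pools:.0f} pools per day, ~{total_pools:,.0f} pools total")
```

Changing any single input (say, the per-image energy) scales the result linearly, which is why the estimate is so sensitive to that 0.3 kWh figure.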
<…> I found that it takes about .3 KWH to generate an AI image <…>
There isn’t really a set power usage per image, since different models take different amounts of time. There are hundreds of different factors and optional toggles that can increase or reduce the time needed.
I was using a pretty broad brush. Some of the figures were higher, some were lower, but many were consistently around 0.3. Feel free to take it with a grain of salt. Even if I’m over by 50%, it is still a large number.
my GPU can use 80 watts max and it takes 10 seconds to produce an image, that’s about 0.00022 kWh, which is about 1300 times less than what you said.
This may be rude, but we talked about this elsewhere in this thread.
oh, I thought you were a different person lol. 😭
it takes about 10 seconds for me to generate an image, my GPU can use 80 watts max. that’s 800 watt seconds or 0.0002222 kWh, if my math is correct
That’s a good point. I’m guessing my numbers might refer more to cloud providers than individuals with smaller data sets.
yea, i feel like there’s a lot of misinformation around ai, both from ai bros and ai haters. im definitely not a fan of big corporations stealing the work of small artists, but it seems like most places have just become an ai hate circlejerk