An unholy amount.
An amount guaranteed to blow past climate targets a decade early.
Stoopid much.
The oil industry must be so giddy to have found a new scapegoat out of nowhere.
Datacenters take a lot of energy because they serve a lot of people. The impact can be lessened with a proper grid centered around renewables.
There are actual things that are fucking up the planet; individuals using AI, gaming, or having a Google account aren’t the actual issue.
You are correct that renewable energy would help, but if huge amounts of power are specifically being drawn for AI data centers, that is part of the equation. Just like it’s reduce/reuse/recycle, in that order, for handling items, it should be reduce/renewable for power, and we should have to build the renewable infrastructure before building more data centers.
Data center demand has created a huge backlog of gas turbine orders. They’re not planning on renewables for the next big expansion.
Ai is destroying our planet. Stop fucking using it.
I read somewhere recently that AI data center open loop water cooling systems drain 100 million liters of freshwater a day and evaporate it away.
Would you mind sharing where you read that?
This isn’t the article I read but it has tons of info about this:
https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water-data/
It’s so annoying when you try to discuss this, because often a gaggle of idiots come out and point out, superficially, that the water gets recycled into nature. They always ignore the cost of making that water fit for human usage again.
Butler was (will be?) right!
A problem is that the information is not in the hands of the company selling the AI. The actual hardware is often owned by service providers and independent data centers.
They know exactly what the power consumption of that hardware is, though. This isn’t tough to figure out just because you use a cloud provider.
Well, I work at an AI hyperscaler. I can tell you how much my facility uses, and how much each rack uses, but don’t have any way to determine what the customer is doing on that server. Or even which servers a given customer is using. Is it being used heavily for queries? How many? Of what kind? We don’t know. Only what the rack/row/pod/hall is consuming.
Also, does the network gear overhead count? How do you apportion that?
We have no visibility into the customer workload. Some of our customers use our systems for scientific research. Drugs, etc. How do you tally that?
I’m not saying that it is impossible, just that if the customer won’t pay for that report, we’re not going to spend money to build the systems to produce it.
Do I agree? No. But I’m just a grunt.
I’m sure they can do the simple math of: we pay for x power, we have y customers. x / y would be rough, but probably a pretty accurate number if we’re talking tens of thousands to millions of customers.
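That x / y figure really is just one division. Sketch with made-up numbers (both the facility consumption and the customer count are hypothetical, purely to show the arithmetic):

```python
# Hypothetical figures, not real measurements.
facility_kwh_per_day = 1_200_000  # x: metered facility consumption
customers = 50_000                # y: active customers

per_customer = facility_kwh_per_day / customers
print(per_customer)  # → 24.0 kWh/day per customer
```

Crude, sure, but over a big enough customer base the errors average out.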
You can produce a remarkably good estimate by looking at CPU and GPU utilization out of procfs and profiling a handful of similar machines power use with similar utilization and workloads.
Network is less than 5% of power use for non-GPU loads; probably less for GPU.
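A minimal sketch of that utilization-based estimate, assuming a simple linear power model between profiled idle and full-load draw. All the wattages and the /proc/stat line here are made up for illustration; real profiling would fit these numbers per machine type:

```python
def estimate_power_watts(cpu_util, idle_w=120.0, max_w=450.0):
    """Estimate draw from CPU utilization (0.0-1.0), interpolating
    linearly between profiled idle and full-load wattages
    (the 120 W / 450 W defaults are hypothetical)."""
    return idle_w + cpu_util * (max_w - idle_w)

def read_cpu_util(stat_line):
    """Turn a /proc/stat 'cpu' line into a utilization fraction:
    (total time - idle time) / total time, cumulative since boot."""
    fields = [int(x) for x in stat_line.split()[1:]]
    idle = fields[3] + fields[4]  # idle + iowait columns
    return 1.0 - idle / sum(fields)

# Made-up /proc/stat sample line:
line = "cpu 8000 100 2000 30000 500 0 400 0 0 0"
util = read_cpu_util(line)
print(f"utilization {util:.0%}, ~{estimate_power_watts(util):.0f} W")
# → utilization 26%, ~205 W
```

In practice you’d sample twice and diff the counters (they’re cumulative), and do the same per GPU from its utilization counters, but the shape of the estimate is the same.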
at least 3
This maths
So much that now they want to turn nuclear reactors back on. Not because it’s green energy but because it’s free energy for them
It’s not free, but it’s a good thing nonetheless: nuclear energy, and thus more people trained to operate and build nuclear reactors.
Nuclear energy is, planning-wise, very high quality: you have a lot of control over scaling the output.
That, together with lots of storage of various kinds (pumping water uphill and such), is what makes renewables with uncontrollable output actually useful.
Bringing the average cost of energy below that of nuclear alone.
So, when Microsoft dies, those reactors and people will be of value.