Hi, I’m building a water-cooled homelab Unix server. I don’t want to buy overpriced pre-mixes from EKWB or Aquatuning. What cooling solution do datacenters use for water cooling?
What is the chemical solution? Does anyone know?
There’s no way you need whatever you’re looking for.
What cooling solution do datacenters use for water cooling?
They typically don’t. The servers are air cooled and the room is conditioned with a Liebert or similar HVAC system. Liquid cooling servers is not practical or warranted for most situations.
That’s not entirely true, some do in fact use water cooling. There are even “off the shelf” solutions from Supermicro.
https://www.supermicro.com/en/solutions/liquid-cooling
It’s not widespread, but it’s not nonexistent.
Expanding on that, direct water cooling becomes more common as rack power density goes up. Once you get into 35kW+ racks it becomes the only way to get that much heat out. Lots of GPU compute racks are water cooled by default now, and the El Capitan supercomputer is entirely cooled through direct liquid interfaces, for example.
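For a rough sense of why air stops scaling at that density, here’s a quick back-of-the-envelope sketch using Q = ṁ·c_p·ΔT. The 35 kW load and the temperature rises are just assumptions for the example, not vendor specs:

```python
# Rough comparison: mass flow needed to move 35 kW of heat with air
# vs. water, using Q = m_dot * c_p * delta_T.
# All numbers are illustrative assumptions.

RACK_POWER_W = 35_000        # assumed rack heat load
AIR_CP = 1005                # J/(kg*K), specific heat of air
AIR_DENSITY = 1.2            # kg/m^3 at roughly room temperature
WATER_CP = 4186              # J/(kg*K), specific heat of water
WATER_DENSITY = 998          # kg/m^3

def mass_flow(power_w: float, cp: float, delta_t: float) -> float:
    """Mass flow (kg/s) needed to carry power_w with a delta_t temperature rise."""
    return power_w / (cp * delta_t)

# Air with an assumed 15 K rise across the servers
air_kg_s = mass_flow(RACK_POWER_W, AIR_CP, 15)
air_m3_s = air_kg_s / AIR_DENSITY
print(f"Air:   {air_kg_s:.2f} kg/s  ~ {air_m3_s * 2118.88:.0f} CFM")

# Water with an assumed 10 K rise through the cold plates
water_kg_s = mass_flow(RACK_POWER_W, WATER_CP, 10)
water_l_min = water_kg_s / WATER_DENSITY * 1000 * 60
print(f"Water: {water_kg_s:.2f} kg/s  ~ {water_l_min:.0f} L/min")
```

Roughly 4,000 CFM of air per rack versus about 50 L/min of water is basically the whole argument for direct liquid cooling at that power density.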
In ours, the coolant is referred to as “PG25” (distilled water with 25% propylene glycol, plus corrosion inhibitors and other additives). It’s widely available, and pre-mixed so it just gets poured straight in.
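If you were ever tempted to mix your own instead of buying pre-mix, the ratio itself is trivial; what you’d be missing is the inhibitor package. A minimal sketch, assuming the 25% is by volume and using a made-up 2-litre loop as the example:

```python
# Splitting a target loop volume into propylene glycol and distilled
# water for a 25%-by-volume mix like the PG25 described above.
# Note: commercial pre-mix also carries corrosion inhibitors and
# other additives that a bare glycol/water mix would lack.

def pg25_mix(loop_volume_l: float, glycol_fraction: float = 0.25):
    """Return (glycol litres, distilled water litres) for the target volume."""
    glycol_l = loop_volume_l * glycol_fraction
    return glycol_l, loop_volume_l - glycol_l

glycol, water = pg25_mix(2.0)   # e.g. a small homelab loop of ~2 litres
print(f"propylene glycol: {glycol:.2f} L, distilled water: {water:.2f} L")
```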
Your problem is going to be quantity. It might be cheaper per unit, but buying less than a 200 litre drum (if not a 1000 litre IBC) will prove to be a challenge.
I’d suggest a rethink, honestly.
is liquid ~~cooking~~ cooling really necessary? critical data centers I have worked in use swamp coolers. cheaper, more efficient, more reliable, uses same water as your house… edit: d’oh! :)
liquid cooking?
Haha, stupid sexy autocorrect, or honest typo, either way I got a good chuckle!
oh man! I just poked ptsf@lemmy.world for an Austria!=Australia flub in another thread… my come uppins!
The cooler is made of lava 🔥
Haha, carry on 👍
Are you sure they use swamp coolers? Also known as evaporative coolers, they add moisture into dry air, making the air they are cooling very humid and only slightly cooler.
Practically all even semi-modern DCs are built for the servers themselves to be air cooled. The air itself is cooled via a heat exchanger with a separate, isolated chiller and cooling tower. That chiller loop is essentially the swamp cooler, but it’s isolated from the servers.
There are cases where servers are directly liquid cooled, but it’s mostly just the recent Nvidia GPUs and niche things like high-frequency-trading and crypto ASICs.
All this said… For the longest time I water cooled my home lab’s compute server because I thought it was necessary to reduce noise. But, with proper airflow and a good tower cooler, you can get basically just as quiet. All without the maintenance and risk of water, pumps, tubing, etc.
a chiller is not a swamp cooler.
picture a fan with a wet sponge in front of it… that is a swamp cooler.
Yea, it’s the combo of the chiller and cooling tower that’s analogous to a swamp cooler. The cooling tower provides the evaporative cooling. The difference is that rather than directly cooling the environment around the cooling tower, the chiller allows indirect cooling of the DC via heat exchange. An isolated chiller providing heat exchange is why humidity inside the DC isn’t impacted by the evaporative cooling. And sure, humidity is different between hot and cold aisles, but that’s just a function of temperature and relative humidity. No moisture is exchanged into the DC to cool it.
Edit: Turns out I’m a bit misinformed. Apparently in dry environments that can deal with the added moisture, DCs are built that indeed use simple direct evaporative cooling.
Industrial cooling towers are usually evaporative in my experience. Smaller ones are large fans moving air over a stack of slats that the return water is sprayed or piped over, and it then collects in a basin for recirculation; larger ones afaik (like what you’d see at power plants) operate on the same idea. Top-ups and water chemistry are all automated.
Those systems have operation-wide cooling loops that individual pieces of equipment tap into. Some stuff uses it directly (you see that with things like industrial furnaces), but for smaller or more sensitive equipment you’ll see heat exchangers, and even then the server & PLC rooms were all air cooled; the air cons for them were tied into the cooling water loops, though.
From a maintenance POV though, it’s way easier to air cool. I’ve seen motor drive racks with failed cooling fans that had really powerful external blowers rigged up to keep them going until the next maintenance window. Yeah, it’s an industrial POV, but similar idea.
critical data centers use swamp coolers because they don’t have to treat the water or expose it to contamination from outside. they use straight domestic water… super cheap.
if the conductivity gets too high, they dump the basin and fill with fresh… rinse and repeat.
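in control terms it’s about as simple as it sounds. a hypothetical sketch of that dump-and-refill rule (the threshold, supply-water conductivity, and drift rate below are made-up numbers, not anything from a real site):

```python
# Hypothetical sketch of the dump-and-refill rule described above: as
# water evaporates, dissolved solids concentrate and conductivity rises;
# past a threshold the basin is dumped and refilled with fresh domestic
# water. All values here are illustrative assumptions.

DUMP_THRESHOLD_US_CM = 1500.0   # assumed conductivity limit (uS/cm)
FRESH_WATER_US_CM = 300.0       # assumed conductivity of the domestic supply
RISE_PER_HOUR_US_CM = 40.0      # assumed drift from evaporation/concentration

def simulate(hours: int) -> None:
    conductivity = FRESH_WATER_US_CM
    for hour in range(hours):
        conductivity += RISE_PER_HOUR_US_CM
        if conductivity >= DUMP_THRESHOLD_US_CM:
            print(f"hour {hour}: {conductivity:.0f} uS/cm -> dump basin, refill")
            conductivity = FRESH_WATER_US_CM
    print(f"final conductivity: {conductivity:.0f} uS/cm")

simulate(100)
```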
yes. I programmed and integrated swamp coolers at Amazon data centers. when the cool air hits the hot aisles the humidity goes down.
That is awesome! I had no idea swamp coolers could cool that well. The one in my shop can barely drop the temp 10-15 degrees below outside (on a good day). Sorry for doubting you; I’m so used to people outside of arid climates not knowing what a swamp cooler is.
10-15 degrees is all you need to keep a “cold aisle” at 85°F, most places, on the worst day.
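the quick way to sanity-check that is the usual direct-evaporative approximation: supply temp ≈ dry bulb minus effectiveness × the wet-bulb depression. the 0.85 media effectiveness and the weather numbers below are just assumptions for the example:

```python
# Rough estimate of direct evaporative cooler supply temperature:
#   T_supply = T_dry_bulb - effectiveness * (T_dry_bulb - T_wet_bulb)
# Wetted-media effectiveness of ~0.7-0.9 is typical; the weather
# numbers here are just an example of a hot, dry day.

def evap_supply_temp_f(dry_bulb_f: float, wet_bulb_f: float,
                       effectiveness: float = 0.85) -> float:
    return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

# Example: 105 F dry bulb / 70 F wet bulb (dry desert air)
print(f"{evap_supply_temp_f(105, 70):.1f} F supply air")  # ~75 F, well under an 85 F cold aisle
```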
IIRC Amazon figured out that individual components could actually run hotter within an acceptable replacement window.
higher equipment replacement is more than offset by the fact they don’t have to do refrigerant based cooling which makes daily operation ridiculously cheap… no pumps or complicated mechanical devices to produce cooling… no people with special skills to maintain them, etc.
That 10-15 (in my case) relies on no extra heat. When I have game nights it gets pretty toasty inside with 5 or so extra bodies in the shop.
My last experience with a server room was in 2002 or 2003, and the rooms were kept in the mid to low 60s.
no doubt.
I remember when you’d put a jacket on before you went in the halls… but now everyone wears shorts.
Water cooling is typically much more complex and expensive than air cooling, and is mainly attractive because of space limitations. The same applies to data centers. IBM’s mainframes have a liquid cooled version mainly targeted towards users wishing to get the most out of their data center space before upgrading sites. These ship without coolant, and simply ask the user to “just add water,” i.e. just demineralised/distilled water.
Sure Mainframe ain’t dead, but what about that toilet water? | Aussie Storage Blog - https://aussiestorageblog.wordpress.com/2021/04/07/sure-mainframe-aint-dead-but-what-about-that-toilet-water/
deleted by creator
If you are willing to deal with potential galvanic corrosion, condensation, leaks, replacing the fluid every now and then, and so on, I suppose you could use red or yellow radiator coolant from your local car service shop. It has all the properties you’ll want from a coolant, but as others have already mentioned, it’s not really worth the hassle, at least if you’re not running something really power hungry, like a GPU farm. And even with liquid you have the very same problems as air: you need the heat to go somewhere. So either run very long pipes (and pumps to match them) so you can put the radiator in a different room or outside, or use big fans to move the hot air away from the radiator.
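To put a number on “the heat has to go somewhere”: the loop only moves heat to the radiator, and if the radiator sits in the same closed room, that room keeps warming until its walls shed the heat. A rough steady-state sketch, where the 500 W load and the room’s heat-loss coefficient are assumptions:

```python
# The loop only relocates heat; it still has to leave the room.
# Rough steady-state room temperature rise: delta_T = Q / UA,
# where UA is the room's overall heat-loss coefficient.
# Both values below are illustrative assumptions.

SERVER_HEAT_W = 500.0      # assumed steady heat load from the server
ROOM_UA_W_PER_K = 50.0     # assumed heat loss through walls/door per kelvin

delta_t = SERVER_HEAT_W / ROOM_UA_W_PER_K
print(f"Room settles roughly {delta_t:.0f} K above its surroundings")  # ~10 K
```

Which is the practical argument for piping to another room or outside: the loop doesn’t make heat disappear, it just lets you choose where the radiator dumps it.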
I have no idea what a data center would use and I haven’t overclocked since the 90s, but Water Wetter is what I used back then.