First & Lasting SC25 Impression: It’s a Liquid Cooled World Now
The first thing that struck me when I entered the show floor at SC25 is that it’s big. Very big. There were over 16,000 attendees, all of whom seemed intent on wandering into my path while I was power-walking from one corner of the massive exhibit hall to the exact opposite side for my next meeting.
There were 560 exhibitors — a conference record. Of those 560, at least 729 appeared to be pushing liquid cooling. Seriously. The mass of pipes, connectors, valves, and gauges on display would be enough to equip several mid-sized city fire departments.
Five times I stopped, did a 360, and tried to find a spot on the floor without seeing a liquid cooling booth. All five times I failed. Three times I saw more than one. Damn.
I became a fanboy of liquid cooling quite a while back. I saw clusters becoming denser and component TDPs rising, and realized there’s a hard limit to what air cooling can do. I drank the Kool-Aid by direct-liquid-cooling my own rigs and was very happy with the ease of installation and performance. Plus, I stopped sweating through my chair in summer.
Liquid Cooling: It’s Time
Talking to datacenter types about liquid cooling even just a few years ago usually triggered the standard objections: “Water? Inside my systems? It’ll leak and destroy everything!” and “That’s got to be insanely expensive — we’d have to tear up walls, floors, and ceilings for pipe runs.”
Here’s the deal: The age of air cooling is over. Why? Physics and economics.
Computer equipment is incredibly efficient at turning electricity into heat. A CPU that consumes 125 watts produces 125 watts of waste heat. CPUs used to be modest. Now AMD’s top server chips have 192 cores and can gulp down 500 watts at full load. Add in modern GPUs that easily hit 700–800 watts each (and will top 1,000 watts in the next generation), and the heat numbers get ridiculous.
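To put those numbers in perspective, here's a back-of-the-envelope heat-load calculation for a hypothetical dense rack. The node counts and TDP figures below are illustrative assumptions based on the ranges above, not any vendor's spec sheet:

```python
# Back-of-the-envelope heat load for a hypothetical dense GPU rack.
# All counts and TDP figures are illustrative assumptions.

CPU_TDP_W = 500        # top-end server CPU at full load
GPU_TDP_W = 750        # midpoint of the 700-800 W range above
CPUS_PER_NODE = 2
GPUS_PER_NODE = 8
NODES_PER_RACK = 8

# Electricity in ~= heat out, so TDP sums straight to waste heat.
node_heat_w = CPUS_PER_NODE * CPU_TDP_W + GPUS_PER_NODE * GPU_TDP_W
rack_heat_kw = NODES_PER_RACK * node_heat_w / 1000

print(f"Per-node heat: {node_heat_w} W")       # 2*500 + 8*750 = 7000 W
print(f"Per-rack heat: {rack_heat_kw:.0f} kW")  # 8 * 7 kW = 56 kW
```

Fifty-plus kilowatts in a single rack is well past what any practical airflow can carry away, which is the physics half of the argument.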
The Economics Angle
Liquid cooling wins on total cost of ownership (TCO) over time. Radically higher system TDPs generate radically higher waste heat. Getting 85%+ of that heat out via liquid transfer dramatically lowers the load on your room cooling systems. Liquid-cooled systems also need far fewer fans, which themselves consume meaningful power.
It takes much less electricity to run pumps and rooftop dry coolers than to chill the entire volume of a datacenter. You’re cooling the hot bits directly instead of the air around them.
Payback time depends on your electric rates and scale, but I’ve heard numbers as low as 18 months in high-cost markets. Three to five years is more typical.
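As a sketch of that payback math, here's a minimal model. Every figure in it — the IT load, the electric rate, the cooling-overhead ratios, and the capex — is a hypothetical assumption chosen for illustration, not a quote from any vendor or site:

```python
# Minimal payback-period sketch for a liquid cooling retrofit.
# Every number here is a hypothetical assumption for illustration.

IT_LOAD_KW = 1000          # total IT load being cooled
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.15        # electricity price, USD

# Assume air cooling burns ~0.40 W of cooling power per W of IT load,
# and direct liquid cooling cuts that to ~0.15 W (85%+ of heat to liquid).
AIR_COOLING_OVERHEAD = 0.40
LIQUID_COOLING_OVERHEAD = 0.15

CAPEX_USD = 1_500_000      # pipes, CDUs, dry coolers, installation

annual_savings = ((AIR_COOLING_OVERHEAD - LIQUID_COOLING_OVERHEAD)
                  * IT_LOAD_KW * HOURS_PER_YEAR * RATE_PER_KWH)
payback_years = CAPEX_USD / annual_savings

print(f"Annual savings: ${annual_savings:,.0f}")   # $328,500
print(f"Payback: {payback_years:.1f} years")       # ~4.6 years
```

With these made-up inputs the payback lands in the three-to-five-year band; crank the electric rate up to high-cost-market levels and the period shrinks toward that 18-month figure.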
Bottom line: The shift to liquid cooling isn’t coming — it’s already here. The question for most datacenter operators is no longer “if,” but “how quickly and how well” they can make the transition.