The Data Center Density Debate: Generational Change Brings Higher Densities (datacenterfrontier.com) 45
1sockchuck writes: Over the past decade, there have been repeated predictions of the imminent arrival of higher rack power densities. Yet extreme densities have remained confined to high performance computing. Now data center providers are beginning to adapt their designs for higher densities. One of these companies is Colovore, which is among a cluster of companies adopting chilled-water cooling doors for their cabinets (LinkedIn is another). They say the move to higher densities is driven in part by a generational change in IT teams, as younger engineers are less worried about high-density strategies using water in the data center. "A lot of them grew up with PC gaming and water cooling right in their living room," said a Colovore executive.
Re:How in the hell is this a DEBATE? (Score:4, Insightful)
How reliable is it, really? What happens if a pump breaks down in a high-density liquid-cooled rack versus a fan failing in a low-density air-cooled rack?
How likely are those breakages? At the very least they are less than 100% reliable; they WILL break at some point. It might be perfectly fine, but a single PC at home does not scale to an entire datacenter of racks.
Re: (Score:2)
The last quote in the headline got me:
"A lot of them grew up with PC gaming and water cooling right in their living room," said a Colovore executive.
Face, meet palm. Although it's good to know I can cross Colovore off the list of datacentre candidates.
Re: (Score:2)
Re: (Score:1)
Re:How in the hell is this a DEBATE? (Score:4, Interesting)
Speaking from experience here: I've been running an HPC system with water-cooled doors on the back of my racks for just shy of four years now. My personal viewpoint is that if you are not doing racks with water-cooled doors then you need the sack, unless you are using one of the in-rack water cooling systems.
There is no pump in the rack/door. You hook up to the chilled water supply coming from the chillers, and they provide the "pump". You are no worse off than before with air handling units in the room, which don't have pumps in them either.
The doors we use have a number of large (roughly 400mm diameter) fans in the back that spin relatively slowly, and these are N+1 redundant. Besides that, they are the sorts of fans that can spin for decades before going wrong.
The other advantage of having the cooling in the doors is that you save floor space in the room, as you don't need as many air handling units.
The first point to note is that you probably still need some air cooling in the room. Things like tape libraries don't come with water-cooled doors, for example. Though there is an argument that the tape libraries should go somewhere other than your data centre.
The second problem is suppliers that want to sell you something prepackaged in a rack that is not water cooled. Trying to explain to them that you don't want their shitty non-water-cooled rack is, in my experience, like talking to a brick wall. That's another reason why you probably still need air handling units. That said, our doors are currently providing net cooling to the room; that is, the inlet temperature at the front of the rack is higher than the outlet temperature at the back.
The third problem is heat: be prepared to strip down to a t-shirt and shorts if you open the rack door while the system is going full tilt, because it can get that hot.
Now go crawl back under your rock and stop spouting nonsense about things you clearly know nothing about.
Re: (Score:1)
Out of curiosity, what sort of equipment are you running that requires the water cooling?
The article alludes to 20-30kW per cabinet, but that's pretty standard for air cooling these days. Cold aisle, hot aisle and ducted bottom-to-top cabinet cooling are what I have experience with. We run cabinets of 4 x blade chassis with around 7kW per chassis at peak use.
The article also mentions dreams of 250kW per cabinet, but doesn't suggest anyone is doing that yet.
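For a rough sense of scale, here is a minimal back-of-the-envelope sketch; the chassis count, per-chassis draw, and the 20-30kW and 250kW figures are the ones quoted above, and nothing here is a measured number:

    # Rough rack-power arithmetic using the figures quoted above.
    chassis_per_cabinet = 4        # blade chassis per cabinet
    peak_kw_per_chassis = 7.0      # peak draw per chassis, in kW

    cabinet_peak_kw = chassis_per_cabinet * peak_kw_per_chassis   # ~28 kW
    print(f"Peak cabinet load: {cabinet_peak_kw:.0f} kW")

    # That sits at the top of the 20-30kW band called standard for air cooling,
    # and roughly a factor of nine below the 250kW-per-cabinet "dream".
    print(f"Headroom to 250 kW: {250 / cabinet_peak_kw:.1f}x")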
We run the rooms at a higher than usual ambient temperatur
Re: (Score:3)
Nothing requires water cooling; it is just that a water-cooled door on the back of the rack gives us a better PUE, we don't waste floor space on air handling units, and we don't have to worry about getting all the air containment for hot/cold aisles either.
Off the top of my head we are at about 35kW per rack when the nodes are all spun up, though of course 100% utilization is super hard because we always have nodes idling while we gather enough free nodes for the next job to run. Nodes are about 85-90% busy.
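As a rough illustration of the PUE argument (PUE = total facility power divided by IT equipment power), here is a minimal sketch; only the 35kW rack load comes from the post above, and the cooling overhead figures are purely hypothetical:

    # Illustrative PUE comparison. PUE = total facility power / IT equipment power.
    it_load_kw = 35.0          # quoted per-rack IT load with all nodes spun up

    # Hypothetical per-rack cooling overheads (illustrative only, not measured):
    crac_overhead_kw = 14.0    # room air handlers in a conventional hot/cold aisle layout
    door_overhead_kw = 5.0     # rear-door heat exchanger fed from the chilled-water loop

    pue_air = (it_load_kw + crac_overhead_kw) / it_load_kw
    pue_door = (it_load_kw + door_overhead_kw) / it_load_kw
    print(f"Air-handler PUE ~{pue_air:.2f}, rear-door PUE ~{pue_door:.2f}")  # ~1.40 vs ~1.14

The point is simply that removing dedicated air handlers shrinks the non-IT term in the numerator, which is where the PUE improvement comes from.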
Re: (Score:3)
So those of us that still have a lot of empty space available and don't have to change to high density yet "need the sack"?
WTF is it with the insulting zealotry? It completely demolishes credibility, making any of your points questionable whether they are correct or not.
Also you've said nothing useful about water treatment, condensation, descaling
Re: (Score:2)
Well the other plus side is that you get a better PUE so regardless of whether you have plenty of floor space you are better off with a water cooled door on the back of the rack. Especially as you will likely have several generations of equipment go through the rack.
Re: (Score:2)
Re: (Score:2)
How often do you replace a radiator in your house? How often, for that matter, do you replace a radiator in your car? The life span of a single generation of IT equipment is, say, five years maximum, so even being conservative a rack should be good for at least 15 years, or three generations.
Re: (Score:2)
Based on what exactly?
I suppose I shouldn't expect much when you told the earlier poster to "go crawl back under your rock" just for asking a few questions, but please could we have a little bit more than something along the lines of "bec
Re: (Score:2)
He doesn't care because he won't be there in 15 years when other people are cleaning up his mess.
Re: (Score:2)
> The second problem is suppliers that want to sell you something prepackaged in a rack that is not water cooled. Trying to explain to them that you don't want their shitty non-water-cooled rack is, in my experience, like talking to a brick wall. That's another reason why you probably still need air handling units. That said, our doors are currently providing net cooling to the room; that is, the inlet temperature at the front of the rack is higher than the outlet temperature at the back.
What bloody suppliers?
Only
Re: (Score:2)
What happens if the chillers break down? Do you have a way to take the racks down gracefully if the chillers fail? And why water and not something CFC-based? Since CFCs are not conductive, they would seem to be a safer choice. BTW, are you using tap water or de-ionized water?
Those who forget the lessons of history.... (Score:3, Interesting)
Redundancy for pumps is easy at data center scales, but the very real problem over non-trivial equipment lives is leaks.
Way back when, we used concrete pumping hoses for chilled water to CRAC units, to facilitate relocation of the units and reduce the risk of leaks after an earthquake. Worked pretty well, they had a huge safety factor compared to concrete pressures, and they were easy to test before placing into service. Seismic performance was very predictable, and it all worked well. Then, a bad (metal
Re: (Score:2)
Pictures are more helpful for explaining this. The piping to the water cooling:
https://images.duckduckgo.com/... [duckduckgo.com]
The cooling water pipes go all around the back door of the rack. Water goes in one pipe and out the other, and the heat is carried off using an air-to-water exchanger. What worries me is whether they are recycling the water or just wasting the mains water supply.
Re: (Score:3)
Each new generation of electronics has a higher density, so this isn't really news.
During the '70s a computer room was huge, with a lot of physical equipment. Today that computing power fits in your pocket.
In addition to this, we have more and more data stored electronically, so the space gains from increased density are somewhat counteracted by that.
Another factor is that there used to be only a few terminals locally, but today networks give every household a connection behind which a large number of terminals can reside.
Re: (Score:1)
Re: (Score:2)
Think about the plumbing in your house, or perhaps something closer
Re: (Score:2)
My first liquid cooled computer was also the first computer I built. I had to take it apart and fix leaks at least once a month - it was higher maintenance than a Japanese sports car.
You probably should have used automotive components, then. I've got coolant lines thirty years old and not leaking. (I've also had to replace coolant lines at ten years or so, but never worse than that.)
If I were trying to build liquid-cooled computer cabinets, I'd go to automotive supply for my parts and chemicals. That stuff can commonly last decades without maintenance, and you can readily order replacements. But the truth is that I've just stuck with air cooling, and not regretted it. It's not like my c
lmao (Score:1)
When did we start taking advice on how to run a data center from "gamers" with liquid cooling systems?
Pfft (Score:1)
The last international data center company I worked for had water based fire suppression.
Saved them a ton of money, at least until one of the data centers caught fire. The whole DC was a loss. When the final fire investigation was finished and the cause was determined to be wiring done by a non-licensed electrician (another cost saving), the insurance refused to pay.
So, put water cooling in the rack and stack that power density as high as you want. Just remember they are cutting every other corner to save a
Clickbait... (Score:5, Informative)
Re: (Score:2)
I don't see anything that looks like affiliate info in that link. And you can copy/paste the URL into a new browser tab so there's no referrer info, either.
Re: (Score:2)
Their marketing department shares a Slashdot account; it's pretty standard practice, really.
Re:Clickbait... (Score:5, Informative)
> I don't see anything that looks like affiliate info in that link. And you can copy/paste the URL into a new browser tab so there's no referrer info, either.
The submitter has submitted 24 stories over the course of the last year. Every single one of them links to at least one of the same two domains, on the same subject... It is pretty obviously an affiliated/sock-puppet account for an employee or marketing department.
Re: (Score:3)
Watercooled is new? (Score:3)
"A lot of them grew up with PC gaming and water cooling right in their living room,"
A lot of us grew up working on water cooled mainframes right in our own data centers.
What's that saying about 'what's old is new again...' Now get off my lawn.
Re: Watercooled is new? (Score:4, Funny)
But it's a millennial telling you about it...
Re: (Score:2)