Data Storage / IT

Cooling Bags Could Cut Server Cooling Costs By 93%

judgecorp writes "UK company Iceotope has launched liquid-cooling technology which it says surpasses what can be done with water or air-cooling and can cut data centre cooling costs by up to 93 percent. Announced at Supercomputing 2009 in Portland, Oregon, the 'modular Liquid-Immersion Cooled Server' technology wraps each server in a cool-bag-like device, which cools components inside a server, rather than cooling the whole data centre, or even a traditional 'hot aisle.' Earlier this year, IBM predicted that in ten years all data centre servers might be water-cooled." Adds reader 1sockchuck, "The Hot Aisle has additional photos and diagrams of the new system."


Comments:
  • Quick Release (Score:5, Informative)

    by srealm ( 157581 ) <prez.goth@net> on Tuesday November 17, 2009 @12:45PM (#30130408) Homepage

    The problem with all this is that you need a good piping and plumbing system in place, complete with quick-release valves, to ensure you can disconnect or connect hardware without having to do a whole bunch of piping and water routing in the process. Part of the beauty of racks is that you just slide in the computer, screw it in, plug in the plugs at the back, and you're done.

    I'm not saying it's impossible, but just building a new case, or blade, or whatever isn't going to do it - you need a new rack system with built-in pipes and pumps, and probably a data center with even more plumbing, with outlets at the appropriate places to supply each rack with water. This is no small task if you're trying to retrofit an existing data center.

    Not to mention that you have to make sure you have enough pressure so that each server is supplied water from the 'source'; you cannot just daisy-chain computers, because the water would get hotter and hotter the further down the chain you go (the sketch below puts rough numbers on this). That means a dual piping system (one for 'cool or room temperature' water and one for 'hot' water), and it means adjusting the pressure to each rack depending on how many computers are in it.

    The issues of water-cooling a data center go WAY beyond the case, which is why nobody has really done it yet - sure, the cost savings are potentially huge, but it's a LOT more complicated than sticking a bunch of servers with fans into racks that can move around, and then turning on the A/C. And there is a lot less room for error (as someone else mentioned, what if a leak occurs, or a plumbing joint fails, or whatever? Hell, if a pump fails you could be out a whole rack!).
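    A rough back-of-the-envelope sketch (Python) of the daisy-chain point above; the server power and flow rate are made-up illustrative numbers, not anything from the article:

        # Why daisy-chaining servers on one water loop doesn't work: every server
        # dumps its heat into the same stream, so the coolant gets warmer the
        # further down the chain you go. All numbers here are illustrative only.
        C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

        def outlet_temps(inlet_c, server_watts, flow_kg_s):
            """Coolant temperature after each server in a series (daisy) chain."""
            temps, t = [], inlet_c
            for p in server_watts:
                t += p / (flow_kg_s * C_P_WATER)  # delta-T = P / (m_dot * c_p)
                temps.append(round(t, 1))
            return temps

        # ten hypothetical 500 W servers sharing a 0.05 kg/s (~3 L/min) loop
        print(outlet_temps(25.0, [500.0] * 10, 0.05))
        # last server sees water ~24 C warmer than the first (about 48.9 C in),
        # which is why each server needs parallel supply/return, not a chain.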

  • by jaggeh ( 1485669 ) on Tuesday November 17, 2009 @12:54PM (#30130498)

    That's really nifty, and I'm sure it works ok and everything, but... how much does it cost?

    Figures cited by Iceotope show that the average air-cooled data centre with around 1000 servers costs around $788,400 (£469,446) to cool over three years. The Iceotope system claims to eliminate the need for CRAC units and chillers by connecting the servers in the synthetic cool bags to a channel of warm water that transfers the heat outside the facility. This so-called “end to end liquid” cooling means that a data centre, fully equipped with Iceotope-cooled servers, could cut cooling costs to just $52,560 - a 93 percent reduction, the company states.

    Taking the above figures into account, as long as the cost to install is under the 200k figure, there's an incentive to switch.
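    To put numbers on that, here is a quick sanity check in Python; only the two dollar figures come from the summary quoted above, and the payback framing is an assumption, not Iceotope's claim:

        # Three-year cooling-cost figures quoted in the article summary.
        air_cooled_3yr = 788_400.0
        iceotope_3yr = 52_560.0

        savings_3yr = air_cooled_3yr - iceotope_3yr
        reduction = savings_3yr / air_cooled_3yr

        print(f"three-year savings: ${savings_3yr:,.0f}")  # -> $735,840
        print(f"reduction: {reduction:.1%}")                # -> 93.3%, matching the claim
        # On these numbers alone, any retrofit cost below roughly $736k would pay
        # for itself within three years (ignoring financing, maintenance and
        # hardware lifetime).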

  • Water is a hassle (Score:5, Informative)

    by BlueParrot ( 965239 ) on Tuesday November 17, 2009 @12:56PM (#30130536)

    I work with particle accelerators that draw enough power that we don't have much choice but to use water cooling. Even though we have major radiation sources, high voltage running across the entire place, liquid-helium-cooled magnets, high-power klystrons feeding microwaves to the accelerator cavities, and so on, the only thing that typically requires me to place an emergency call during a night shift is still water leaks.

    Water is just that much of a hassle around electronics. Even an absolutely minor leak can raise the humidity in a place you really don't want humidity; it evaporates and then condenses on the colder parts of the system, where even a single drop can cause a short circuit and fry some piece of equipment. After it absorbs dirt and dust from the surroundings it starts attacking most materials corrosively, which may not be noticed at first but gives sudden, unexpected problems after a few years. If you don't keep the cooling system itself in perfect condition, valves and taps will start corroding and you get blockages. Maintenance is a pain because you have to power everything down if you want to move just one pipe, and so on.

    I just don't see why you would go through the hassle of water cooling unless you actually have to, and quite frankly, if your servers draw enough power to force you to use water for cooling, then you're doing something weird.

  • Cray XT5 "Jaguar" (Score:1, Informative)

    by Anonymous Coward on Tuesday November 17, 2009 @01:17PM (#30130794)
    The #1 on the Top 500 supercomputer list [cray.com] uses water cooling as well (in combination with phase-change cooling). Water-cooling whole racks can be done. The only difference from TFA is that it also adds immersion cooling [pugetsystems.com]. Immersion cooling has been found to be superior for cooling but comes with (obvious) considerable maintenance problems. The video [cray.com] for this machine shows more or less standard water-cooling blocks on the processors, along with various plumbing that keeps the machine chilled.
  • by osu-neko ( 2604 ) on Tuesday November 17, 2009 @03:35PM (#30133220)

    Ok, so they are British and they spell 'center' with the 'er' the other way around. Why don't they spell server as 'servre'?

    First of all, it would be 'serveur', not 'servre'. And its use is too new to be one of those words in which the French spelling is retained from the days of the Normans. Incidentally, even in America we fail to reverse the 'er/re' in some words; consider 'acre', 'massacre', and 'mediocre'. So we're not exactly consistent either...

  • by DrYak ( 748999 ) on Tuesday November 17, 2009 @05:48PM (#30135628) Homepage

    No-spill (as in "almost insignificant", not as in "not too much, won't empty the whole system, but you'd better have a towel nearby just in case"), quick-disconnect, low-resistance valves for watercooled systems have already been available to enthusiasts for quite some time.

    (Koolance is an example of a company producing such things in the US; Aquatuning is an example of a shop selling similar implements in the EU. No links, to avoid gratuitous advertising to web spiders, but you can easily google the names.)

    Anyway, low-conductivity liquids are popular in applications where spills and leaks aren't easily monitored (see the sources above). And don't forget that every other blade module is sealed too, so in case of a leak you're just spilling... onto a sealed container that isn't affected by external liquids anyway.

    As for pressure: well, uh, no. You would only need tremendous pressure if you had to feed the whole rack from a single pump, which would be a single point of failure, and that's bad.
    The more sensible approach is for each blade module to have its own small pump (Laing DDC for the win !!!) drawing water from the rack's main tank.

    It's already the scenario used in most rack-cooling setups (again, see the sources mentioned above). And in case of a pump failure, well, only one blade module fails; the rest of the rack is immune to it.

    Well, I'm sure most /.ers have some ricer friend (the kind who hand-compiles Gentoo with "-O9999" :-) ) who suffered a massive watercooling failure some time ago. Watercooling safety has evolved since then, and it's now much more secure even for a simple enthusiast. A company specializing in data centers has even more scope to offer safety.

  • Re:Cray-2 (Score:3, Informative)

    by jbengt ( 874751 ) on Tuesday November 17, 2009 @07:19PM (#30136942)

    The Cray's full-immersion coolant model hit a big problem - the Coanda effect.

    You did not describe the Coanda effect; you described boundary-layer issues. I don't know enough about the story to know whether boundary layers were the real issue: it's pretty routine to take into account the fact that friction causes the fluid velocity to approach zero at the surface of a stationary object, and to account for the lack of turbulence in the laminar part of the boundary layer, which reduces heat transfer. It could just be that they needed a phase change at the surface in order to pull away enough heat.

    By the way, the Coanda effect is what causes cold air thrown horizontally out of a ceiling diffuser to stick to the ceiling rather than fall in cold drafts. The air stream is squeezed by the supply opening and therefore has to speed up to conserve mass flow. It gets that extra velocity by losing pressure, so the supplied air is at a lower pressure than the room air and, as a consequence, is pushed up against the ceiling until it mixes with the warmer room air and slows down.
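    To put a number on the "loses pressure" step, here is a tiny Bernoulli sketch in Python; the velocities and air density are example assumptions, not figures from the post:

        # Static pressure lost as the supply air accelerates through the
        # diffuser opening, from Bernoulli along the jet. Example values only.
        RHO_AIR = 1.2  # kg/m^3 at roughly room conditions

        def pressure_drop(v_duct, v_jet, rho=RHO_AIR):
            """Pressure drop (Pa) as the stream speeds up from v_duct to v_jet."""
            return 0.5 * rho * (v_jet**2 - v_duct**2)

        # e.g. air squeezed from 2 m/s in the duct to 5 m/s leaving the diffuser
        print(pressure_drop(2.0, 5.0))  # -> 12.6 Pa below room pressure
        # A small but steady pressure difference like this holds the cold jet
        # against the ceiling until it mixes with room air and slows down.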

"And remember: Evil will always prevail, because Good is dumb." -- Spaceballs

Working...