Data Storage | IT

Cooling Bags Could Cut Server Cooling Costs By 93%

judgecorp writes "UK company Iceotope has launched liquid-cooling technology which it says surpasses what can be done with water or air-cooling and can cut data centre cooling costs by up to 93 percent. Announced at Supercomputing 2009 in Portland, Oregon, the 'modular Liquid-Immersion Cooled Server' technology wraps each server in a cool-bag-like device, which cools components inside a server, rather than cooling the whole data centre, or even a traditional 'hot aisle.' Earlier this year, IBM predicted that in ten years all data centre servers might be water-cooled." Adds reader 1sockchuck, "The Hot Aisle has additional photos and diagrams of the new system."

  • by captaindomon ( 870655 ) on Tuesday November 17, 2009 @12:15PM (#30129996)
    That's really nifty, and I'm sure it works ok and everything, but... how much does it cost?
  • Re:Ugh. (Score:2, Insightful)

    by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Tuesday November 17, 2009 @12:17PM (#30130028)

    Like the unpriced bottle of wine at Applebees. If you have to ask...

  • by Itninja ( 937614 ) on Tuesday November 17, 2009 @12:22PM (#30130086) Homepage
    Seriously. What do we do when a RAM module or a backplane fails? Will a simple hardware swap become a task for those trained in hazmat handling? I do not want to be on the help desk when someone calls and says "Help! The servers are leaking!"
  • Re:Excess Heat (Score:4, Insightful)

    by von_rick ( 944421 ) on Tuesday November 17, 2009 @12:22PM (#30130088) Homepage
    In winter you'd get quite a few kilowatt-hours' worth of heating if you route the dissipated heat properly.
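A back-of-the-envelope illustration of the heat-recovery point above; the rack power draw and season length in this sketch are assumptions for illustration, not figures from the article or the comment:

```python
# Back-of-the-envelope: heat recoverable from one rack over a heating season.
# The 10 kW rack draw and 90-day season are illustrative assumptions only.
rack_power_kw = 10.0      # electrical draw; nearly all of it ends up as heat
season_hours = 90 * 24    # roughly a three-month winter
heat_kwh = rack_power_kw * season_hours
print(f"~{heat_kwh:,.0f} kWh of low-grade heat per rack per season")
# ~21,600 kWh -- on the order of a small building's annual heating demand
```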
  • by dintlu ( 1171159 ) on Tuesday November 17, 2009 @12:40PM (#30130344)

    You pull that server out of the farm and let other servers pick up the slack while you make repairs.

    It's hype, based on the assumption that every server on the planet will be virtualized by 2019, and that the separation of hardware from the software that runs on it will allow IT departments ample time to offload work into "the cloud" while they swap out RAM.

    Either that or it's made for large datacenters with multiple redundancies and enormous cooling costs. :)
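A hedged sketch of the "let other servers pick up the slack" step described above: on virtualized hosts, guests can be live-migrated away before a machine is pulled for repair. The host URIs below are made up and shared storage between the hosts is assumed; this illustrates the general workflow, not anything Iceotope ships.

```python
# Sketch: evacuate all running guests from a host before pulling it for repair.
# Assumes libvirt-managed hosts with shared storage; the URIs are hypothetical.
import libvirt

SOURCE = "qemu+ssh://node07.example.com/system"  # host going down for maintenance
TARGET = "qemu+ssh://node08.example.com/system"  # host picking up the slack

src = libvirt.open(SOURCE)
dst = libvirt.open(TARGET)

for dom in src.listAllDomains(libvirt.VIR_CONNECT_LIST_DOMAINS_ACTIVE):
    print(f"live-migrating {dom.name()} ...")
    # Live migration copies guest memory across while the guest keeps running,
    # so the hardware swap never shows up as downtime for the workload.
    dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)

src.close()
dst.close()
```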

  • Cray-2 (Score:4, Insightful)

    by fahrbot-bot ( 874524 ) on Tuesday November 17, 2009 @12:59PM (#30130592)

    "The Iceotope approach takes liquid - in the form of an inert synthetic coolant, rather than water - directly down to the component level," ... "It does this by immersing the entire contents of each server in a "bath" of coolant within a sealed compartment, creating a cooling module."

    Hmm... The Cray-2 [wikipedia.org] was cooled via complete immersion in Fluorinert [wikipedia.org] way back around 1988. I was an admin on one (Ya, I'm old). So, this is a bit different, but certainly not ground-breaking.

  • by Smidge204 ( 605297 ) on Tuesday November 17, 2009 @01:05PM (#30130664) Journal

    Having the mainboard components sealed inside a liquid-filled compartment seems like a major point against the system. Extra proprietary, vendor-locked components mean extra cost to own and operate, which probably offsets whatever savings the cooling provides... if any.

    I'm skeptical that it will significantly reduce cooling costs (compared to, say, a chilled-cabinet system) because the total cooling load stays the same. If you're generating a billion BTUs of heat, you still need to remove a billion BTUs of heat. Any savings will come only from the higher energy density water allows versus air, and maybe from initial installation.

    Plus, based on their exploded view, there are no fewer than three heat exchanges before the heat even gets out of the cabinet: chip to liquid (via heat sink), submersion liquid to module liquid, and module liquid to system liquid. Each time you go through an exchange, the required temperature gradient goes up (rough numbers sketched below).

    What they need is a system that is compatible with commodity components, to leverage low-cost hardware against lower-cost cooling. Why not fit water blocks directly to existing mainboard layouts and circulate chilled water from the main loop directly through them, via manifolds and a pump at each rack? You can still enclose the mainboard and cooling block in a sealed, insulated compartment to eliminate condensation problems, but not being submerged means you can actually repair/upgrade the modules.
    =Smidge=
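A rough numeric sketch of the stacked-exchanger point above; every delta-T in it is an illustrative guess, not an Iceotope figure:

```python
# Each heat-exchange stage needs its own temperature difference to move heat,
# so the chip ends up that much hotter than the facility water.
# All delta-T values are illustrative assumptions, not vendor numbers.
facility_water_c = 18.0
stage_delta_c = {
    "chip -> heat sink -> submersion liquid": 15.0,
    "submersion liquid -> module liquid": 5.0,
    "module liquid -> system (cabinet) liquid": 5.0,
    "system liquid -> facility water": 5.0,
}
chip_temp_c = facility_water_c + sum(stage_delta_c.values())
print(f"estimated chip temperature: ~{chip_temp_c:.0f} C")  # ~48 C with these guesses
# Every extra exchanger either pushes the chip hotter or forces colder
# (more expensive) facility water -- which is the point being made above.
```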

  • by tuomoks ( 246421 ) <tuomo@descolada.com> on Tuesday November 17, 2009 @02:21PM (#30131852) Homepage

    Water (and liquid coolants, even liquid metals) can be a hassle if not designed for correctly. I have had my experiences with water-cooled systems, mainly with "over-efficiency" (well, and one burst that shouldn't have happened, LOL).

    One thing I have learned (from my son): in cars, replacing everything with military- and/or aircraft-grade fittings, valves, tubes, etc. makes life much easier. Not much more expensive, and it pays for itself very quickly. If I had known that (much) earlier, instead of accepting engineering's "good enough" / accounting's "cheap enough", my life would have been easier, but maybe it's a learning process?
