The 100 Degree Data Center
miller60 writes "Are you ready for the 100-degree data center? Rackable Systems has introduced a new enclosure that it says can run high-density racks safely in environments as hot as 104 degrees F (40 degrees C), offering customers the option of saving energy in their data centers. Most data centers operate in a range between 68 and 74 degrees. Raising the thermostat can lower the power bill, since less power goes to cooling. But higher temperatures are less forgiving in the event of a cooling failure, and not likely to be welcomed by employees working in the data center."
Re:Were nerds here... use the f'ing metric system (Score:1, Informative)
But in America, even among nerds, it's uncommon to use metric when referring to room temperatures. I think most of us could not tell you whether 20C would be a comfortable temperature vs. 40C without piping it through Google.
Re:Were nerds here... use the f'ing metric system (Score:5, Informative)
Actually, this is an American site, so use something that most Americans can intuitively relate to. I have no problem working with most metric measurements (indeed, I did so for a number of years working in machining) but temperature just doesn't compute for me unless I do the calculations in my head.
Fahrenheit just makes more sense to most of us. 30s = cold, 40s = chilly, 50s = cool, 60s = decent/might need a windbreaker, 70s = nice, 80s = warm, 90s = hot, etc. Celsius is nowhere near that intuitive and was just as arbitrarily defined as Fahrenheit was.
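If you ever do need to convert, it's just a linear map. Here's a quick Python sketch (my own, not from TFA) using the two temperatures everyone keeps citing in this story:

    def f_to_c(f):
        """Fahrenheit to Celsius."""
        return (f - 32.0) * 5.0 / 9.0

    def c_to_f(c):
        """Celsius to Fahrenheit."""
        return c * 9.0 / 5.0 + 32.0

    print(c_to_f(40))   # 104.0 -- the rack limit from TFA
    print(f_to_c(68))   # 20.0  -- low end of a typical data center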
Re:Were nerds here... use the f'ing metric system (Score:1, Informative)
The zero of Fahrenheit -- the freezing point of saturated brine -- is no less sensible than the Celsius zero of the freezing point of water. Fahrenheit is also more precise with fewer digits in the ranges most people deal with day to day.
Re:Drives (Score:3, Informative)
It's really not so bad. Most drives are rated to about 55C (131F); 104F is only 40C.
The key is to design the server with sufficient airflow to try and keep the temperature of the components close to the room's temperature.
Looking at the Datasheet [rackable.com], it looks like they are running the servers on DC power. That way, each server doesn't have its own power supply; they just hook up to a separate power unit elsewhere in the rack.
The servers don't seem to have fans either. The fans are in the cabinet door.
This setup reminds me of the description of Google's search cluster racks I saw somewhere.
This could result in huge savings. I remember some Sun data center guy talking about one of their new data centers and how they were able to run it at 74F. He said each degree F they could keep the temperature up resulted in 4% power savings.
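If that 4%-per-degree figure compounds, the savings add up fast. A back-of-the-envelope Python sketch, taking the Sun guy's 4% number at face value (it's his figure, not mine, and real cooling curves won't be this clean):

    # Cooling power relative to a 68F baseline, assuming each extra
    # degree F saves ~4%, i.e. power scales by 0.96 per degree.
    baseline_f = 68
    for setpoint_f in (74, 80, 104):
        saving = 1 - 0.96 ** (setpoint_f - baseline_f)
        print(f"{setpoint_f}F: ~{saving:.0%} less cooling power")
    # 74F: ~22%, 80F: ~39%, 104F: ~77%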
Re:Were nerds here... use the f'ing metric system (Score:4, Informative)
Only at standard temperature and pressure...
Besides, at zero, shouldn't there be NO thermal energy? Your standard of +273K = 0C seems pretty arbitrary to me!
Real geeks use Kelvin.
Re:Use that waste heat! (Score:1, Informative)
If we had hot-water pipes running down the street, they would be cold-water pipes by the time the water got anywhere. Which is why we don't.
Re:heat exchangers in the data centre (Score:2, Informative)
It's not like water cooling is new.
Re:Were nerds here... use the f'ing metric system (Score:2, Informative)
"The zero of Fahrenheit -- the freezing point of saturated brine -- is no less sensible than the Celsius zero of the freezing point of water. Fahrenheit is also more precise with fewer digits in the ranges most people deal with day to day."
Yeah, because I'm always having to deal with saturated brine. I can't tell you how many times I've gone out driving in sub-zero temperatures and nearly skidded on all that saturated brine ice.
It was developed in a port city where knowing if the harbor was frozen over (or not) was in fact of great importance.
Re:Were nerds here... use the f'ing metric system (Score:1, Informative)
0 is the temperature at which ice melts, not at which water freezes. There is a difference.
Re:Were nerds here... use the f'ing metric system (Score:0, Informative)
OK, who modded this Flamebait?
This is clearly either a Troll or Redundant. The Flamebait mod is completely unfair.
Re:Were nerds here... use the f'ing metric system (Score:3, Informative)
"The key to converting to metric is establishing new reference points"
http://xkcd.com/526/ [xkcd.com]
"-40C : spit goes 'clink'"
It takes one calorie (~4.2 joules) to heat 1 cubic cm of water one degree C. (Score:3, Informative)
The metric system is unified in all directions, time, mass, length, temperature, energy etc...
The system makes sense instead of relying on the length of the king's thumb, foot, and arm, or the weight of a stone, or the amount of work done by a horse, all variable and inconsistent.
It's one of the many things we owe the French: the Revolution gave us the metric system, and Napoleon a unified system of laws, the "Code Napoleon."
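To make the water example concrete: the energy to heat water is Q = m * c * delta_T, with c = 4.184 J per gram per degree C, i.e. one calorie per gram per degree. A small Python sketch of my own:

    # Q = m * c * delta_T; for water, 1 cm^3 weighs 1 g,
    # so volume in cm^3 doubles as mass in grams.
    C_WATER = 4.184  # J per gram per degree C (one calorie)

    def joules_to_heat_water(volume_cm3, delta_c):
        return volume_cm3 * C_WATER * delta_c

    print(joules_to_heat_water(1, 1))      # 4.184 J -- one calorie
    print(joules_to_heat_water(1000, 80))  # ~335 kJ: a litre from 20C to boiling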
Re:Were nerds here... use the f'ing metric system (Score:3, Informative)
Otherwise known as Muphry's Law.
"Muphry's Law is an adage that states that 'if you write anything criticizing editing or proofreading, there will be a fault of some kind in what you have written'."
http://en.wikipedia.org/wiki/Muphry's_law [wikipedia.org]
Re:Were nerds here... use the f'ing metric system (Score:4, Informative)
Ahh yes, the sea, filled with saturated brine.
Sea water is ~26% salt. Which is why it is impossible to put any more salt in there, it'll just gently float to the bottom.
What's that?? It's actually only ~35 ppt salt? And saturated brine freezes at around -6F/-21C? And harbour sea water normally freezes at around -2C?
Poppycock.
Re:Were nerds here... use the f'ing metric system (Score:4, Informative)
I don't even know how many ml there are in a tablespoon.
5ml per teaspoon, or close enough, and 3 teaspoons to a tablespoon. So 15ml in 1 TBSP.
Once you start doling out liquid medicine for kids, this one's easy. :-)
Let's run the numbers (Score:3, Informative)
May I point out the obvious: the higher power consumption comes not only from increased leakage currents in the silicon, but also from the fact that power supplies are less efficient at higher temperatures, so they need to pull more current from the wall socket to maintain the same output.
However, what you and I just said is largely irrelevant. As your graph shows, the difference in power consumption is minimal: 2% for each 10C due to leakage currents, and maybe ~5% for each 10C in decreased PSU efficiency. These few percentage points are nothing compared to the power you would save by making the AC work less hard. Indeed, if the datacenter would reach 140F (333 K) without AC, cooling it down to 60F (289 K) means closing a 44 K temperature gap, whereas cooling it down to 100F (311 K) means closing only a 22 K gap, so running at 100F would roughly halve the AC power consumption! The point made by TFA still holds: overall you are still saving energy by running the whole datacenter at a 10C higher temperature.
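Spelled out, the arithmetic looks like this (a first-order sketch using the same numbers as above; it assumes AC work scales linearly with the temperature gap, which is a simplification):

    # First-order model: AC work scales with the temperature gap
    # it has to close. Numbers from the paragraph above.
    def f_to_k(f):
        return (f - 32.0) * 5.0 / 9.0 + 273.15

    no_ac = f_to_k(140)                # ~333 K with no cooling at all
    for setpoint_f in (60, 100):
        gap = no_ac - f_to_k(setpoint_f)
        print(f"cool to {setpoint_f}F: a {gap:.0f} K gap to close")
    # 60F -> 44 K, 100F -> 22 K: half the gap, roughly half the AC work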
As to the higher component failure rate: as two independent studies showed last year (Google and CMU), higher temperatures do not even correlate with higher HDD failure rates. In fact, strangely, they observed a slight reverse effect: drives tended to fail less often!