Facebook Suffers Actual Cloud In Oregon Datacenter
An anonymous reader writes "The Register carries the funniest, most topical IT story of the year: 'Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers. Though Facebook has previously hinted at this via references to a 'humidity event' within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in Summer, 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.'"
Re: (Score:1)
Yup, that's Oregon.
Obligatory (Score:5, Insightful)
And nothing of value was lost.
Re: (Score:2)
Re:Obligatory (Score:5, Funny)
And nothing of value was lost.
Various US intelligence agencies and their chums in other countries beg to differ.
Re: (Score:1)
I'm pretty sure those servers and power supplies were valuable, regardless of the data they host (which is also of value to someone).
Re: (Score:2)
Some insurance company will pay for them I am sure...and some AC tech might lose his job...
Re: (Score:1)
And nothing of value was lost.
Oh how absolutely true
Where are the Pics? (Score:5, Insightful)
I don't see any pics in the linked article. Someone has to have pictures of this if it happened...
Re:Where are the Pics? (Score:4, Informative)
This is one of those RTFA-to-get-to-the-real-article type stories.
The link next to the quote is the one you want:
http://www.opencompute.org/2011/11/17/learning-lessons-at-the-prineville-data-center/ [opencompute.org]
Re:Where are the Pics? (Score:5, Insightful)
Saw that, and I think the issue is that the sudden humidity change caused condensation, which is not terribly uncommon if prompt action isn't taken upon AC failure in a humid climate... I don't see a "cloud in the room".
Hype to sell newspapers...and link bait...
Re: (Score:1)
Prineville is high desert and is very dry.
Re: (Score:2)
Given that it was a condensing environment to the point that even the power supplies (typically hotter than the incoming air) got wet enough to short out, yes a cloud formed in the data center. There would have to have been actual droplets of water in the air.
Re: (Score:2)
Re: (Score:3)
That is one of the funniest things I have read for ages!
Power supplies are dealing with 700+ watts at maybe 90% efficiency, and you describe them as "pretty cool"? Bollocks! Capacitors and inductors have current flowing through their equivalent series resistance, which certainly generates heat. They *should* be noticeably hotter than ambient, and surface temperature is the most important factor in condensation.
If I saw condensation on powered electronics components I would be looking for the ladder to climb out of the pool.
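A quick sanity check on those figures, as a minimal Python sketch (the 700 W and 90% are the poster's own ballpark numbers, not published PSU specs):

```python
# Back-of-envelope check: a supply delivering ~700 W at ~90% efficiency
# dumps the difference as heat inside its own enclosure. The figures are
# the poster's assumptions, not measured values.

def psu_heat_watts(output_w: float, efficiency: float) -> float:
    """Heat dissipated by a power supply, given DC output power and efficiency."""
    input_w = output_w / efficiency       # power drawn from the wall
    return input_w - output_w             # everything not delivered becomes heat

if __name__ == "__main__":
    heat = psu_heat_watts(output_w=700.0, efficiency=0.90)
    print(f"~{heat:.0f} W dissipated as heat")   # roughly 78 W of continuous heat
```

Roughly 78 W of continuous heat is hard to square with a "pretty cool" power supply, which is the point being made above.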
Re: (Score:1)
there are - those photos are just private
Re: (Score:2)
Or better, videos. :)
Obligatory (Score:5, Funny)
Welcome to Oregon, it rains a lot.
Re:Obligatory (Score:5, Informative)
Sure, if you think a yearly average of 10.4 inches [usclimatedata.com] is a lot. The east side of the state is actually quite arid. The west side is quite soggy in the Coast Range and along the coast, but the Willamette Valley, where most of the population lives, isn't exceptionally rainy; it's just subject to never-ending spells of overcast weather. Other parts of the country actually have higher annual precipitation.
Re: (Score:1)
Another citation: http://upload.wikimedia.org/wikipedia/commons/8/80/Oregon_Average_Annual_Precipitation_(1961-1990)_Map.png [wikimedia.org]
Clearly the OP is from western Oregon, where it is very rainy indeed.
Re: (Score:2)
"and yes, I am a native of DC."
I've never met anyone who was willing to admit that! Oh, wait - Anonymous Coward? I still haven't met anyone who is willing to admit that he is a native of the District of Columbia.
Re: (Score:3, Interesting)
Welcome to Oregon, it rains a lot.
From the Article
This resulted in cold aisle supply temperature exceeding 80F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and few were automatically shut down due to power supply unit failure.
WTF? 80 deg F (approx 27 deg C) is quite warm in a data centre, especially in a "cold aisle", and 95% humidity is criminal.
Facebook learned from the mistakes, and now designs its servers with a seal around their power supply, or as Parikh calls it, "a rubber raincoat."
When designing a data centre you have to plan for a certain temperature range in which the equipment inside operates optimally. In addition, you have to keep the humidity within manufacturer-recommended limits: too low results in static electricity, and too high, well, you could get condensation on the electrical equipment. Rubber seals may protect power supplies altho
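For a rough sense of why 80 F at 95% RH is alarming, here is a minimal dew-point sketch using the Magnus approximation; the aisle conditions come from the quoted article, while the coefficients are standard textbook values and the whole thing is only an illustration, not anything Facebook published:

```python
import math

# Magnus approximation for dew point. Coefficients are common textbook
# values; the 80 F / 95% RH inputs are the figures quoted from the article.
A, B = 17.62, 243.12   # deg C

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point in deg C from dry-bulb temperature and relative humidity."""
    gamma = math.log(rh_percent / 100.0) + (A * temp_c) / (B + temp_c)
    return (B * gamma) / (A - gamma)

if __name__ == "__main__":
    t_c = (80.0 - 32.0) * 5.0 / 9.0               # 80 F is about 26.7 C
    td = dew_point_c(t_c, 95.0)
    print(f"dew point ~ {td:.1f} C ({td * 9 / 5 + 32:.1f} F)")
    # ~25.8 C (~78.4 F): any surface in that airstream cooler than this sweats.
```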
Re: (Score:2)
Do you actually know how the Facebook Oregon datacenter works?
It doesn't *have* HVAC. The building *is* HVAC: The entire building is one very very large HVAC unit.
HVAC units have 'efficiency of scale'; as you build them larger they get more efficient. That's what Facebook was aiming at. Unfortunately it's a relatively new development in building and datacenter design and clearly has some bugs to work out!
Re: (Score:3, Informative)
WTF? 80 deg F (approx 27 deg C) is quite warm in a data centre, especially in a "cold aisle", and 95% humidity is criminal.
You're used to classic datacentres, where the goal was "shove as much cold air into them as possible", i.e. "the lower the temperature the better". It all depends on how the datacentre was built, how its cooling system is/was engineered, and an almost infinite number of variables. References for you to read (not skim) -- the study in the PDF will probably interest you the most:
http://www.datacenterknowledge.com/archives/2011/03/10/energy-efficiency-guide-data-center-temperature/ [datacenterknowledge.com]
http://www.geek.com/chips [geek.com]
Re: (Score:2)
Re: (Score:2)
Most of the servers I have seen default to throttling the fans based on temperature. That includes 1U systems where the fans are controlled by CPU temperature. Perhaps some enterprises run the fans at full speed, but others *INSIST* on throttling in the datacenter. I prefer to run them at full throttle, but I do acknowledge that in an imperfect world, after a few years there WILL be a few dead fans being actively ignored.
In a cluster, you can actually hear it when a job is submitted; it sounds a bit like a jet
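A minimal sketch of that temperature-driven throttling; the thresholds and the linear ramp below are invented for illustration and are not any particular vendor's fan curve:

```python
# Map CPU temperature to a fan PWM duty cycle. All thresholds are
# made-up example values, not real BMC firmware settings.

def fan_duty_percent(cpu_temp_c: float,
                     idle_temp: float = 40.0,
                     max_temp: float = 80.0,
                     min_duty: float = 30.0) -> float:
    """Return a fan duty cycle (percent) for a given CPU temperature."""
    if cpu_temp_c <= idle_temp:
        return min_duty
    if cpu_temp_c >= max_temp:
        return 100.0                              # full throttle: the "jet" sound
    span = (cpu_temp_c - idle_temp) / (max_temp - idle_temp)
    return min_duty + span * (100.0 - min_duty)

if __name__ == "__main__":
    for t in (35, 55, 75, 90):
        print(f"{t} C -> {fan_duty_percent(t):.0f}% duty")
```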
Re: (Score:1)
I concur...
Re GPP: "Nuvoton/Winbond Super I/O chips" sound like cheap chips; I don't think many of my servers use these. The servers under high temperature and high CPU load do indeed have sensors to throttle/control the fans to increase/decrease airflow, and they also have a fail-safe mechanism: if the sensor breaks, the fans run at full speed. On top of all this, the server has a motherboard-based thermal shutdown that turns the whole server off if the ambient temperature is over some limit.
I have had t
Re: (Score:2)
Re: (Score:2)
Only on the western third. Prineville is, in fact, on the northwestern edge of the Great Basin Desert, and gets about 11 inches of rainfall a year.
By comparison, Phoenix, Arizona, gets 9.
Lightning in that particular server room (Score:1)
as well, and I'll start believing in a just $deity
Re: (Score:1)
There were some power supply units failing. I can imagine that this happened with sparks, which are a sort of small lightning.
Maybe Berndnaut Smilde snuck in: (Score:5, Interesting)
http://www.mnn.com/lifestyle/arts-culture/stories/artist-creates-beautiful-indoor-clouds [mnn.com]
Re: (Score:2)
If you take it to the extreme you get a security system:
https://www.youtube.com/watch?v=cOgKti335tQ [youtube.com]
https://www.youtube.com/watch?v=cAPw_xbTJzk [youtube.com]
https://www.youtube.com/watch?v=vVf3s6PtFkw [youtube.com]
Tech Support (Score:2)
Caller: "There's a cloud in our cloud, come immediately!"
Support: "Speak loud, I cannot hear you."
Caller: "No, it's cloudy in our cloud."
Support: "Yes, speak loud."
Caller: "Yes, there is a big cloud."
Support: "Yes, you must speak loud, that's what I said."
Caller: "You must have a cloud also, nothing's making sense. Let's try this: bring some sun."
Support: "Come soon?"
Caller: "Yeah, that too. Come soon with sun."
Support: "I can't hear you, my connection is cloudy."
Re: (Score:3)
What happened was that the system malfunctioned, which led to hot and humid air being circulated throughout the system. This normally would not cause condensation. However, all of the equipment was previously cold (because the system was working normally before it failed). The hot and humid air came into contact with cold components (various components in the power supply and computer casing). This caused condensation (because the hot and humid air contacted the cold components
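A toy model of that effect, with every number invented purely for illustration: when the cooling fails, the dew point of the supply air jumps almost immediately, while a previously cold surface takes tens of minutes to warm past it, and it condenses for as long as it lags behind:

```python
import math

# Toy thermal-lag model: a cold surface warms toward the new (hot, humid)
# ambient with a first-order time constant, and condenses for as long as it
# stays below the new dew point. All values are illustrative assumptions.

def simulate(surface_c: float = 18.0, new_air_c: float = 27.0,
             dew_point_c: float = 25.8, tau_minutes: float = 20.0,
             minutes: int = 60) -> None:
    """Print how long the surface stays below the dew point of the new air."""
    for m in range(0, minutes + 1, 10):
        temp = new_air_c + (surface_c - new_air_c) * math.exp(-m / tau_minutes)
        state = "condensing" if temp < dew_point_c else "dry"
        print(f"t={m:3d} min  surface={temp:5.1f} C  ({state})")

if __name__ == "__main__":
    simulate()
```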
Streamed Hams (Score:2, Funny)
Superintendent Chalmers: A rain storm? At this time of year? At this time of day? In this part of the country? Localized entirely within your datacentre?
Re: (Score:1)
it's supernintendo chalmers... you insensitive clod!
Re: (Score:2)
happened before (Score:2)
in NASA's Vehicle Assembly Building
There are at least 3 clouds then. (Score:4, Interesting)
The first cloud would be the humidity-and-condensation sort. The second cloud would be the online service itself. The third cloud would be the open Internet between the endpoints in a network graph. [infront.com]
What do all these clouds have in common? They're dangerous. The fewer clouds in your diagram, the more you know about your network architecture, latency, and data integrity. The fewer clouds the better! When a packet goes into the shroud of the cloud in the diagram, there's a much higher chance we'll never see it again. This is the cloud where we must encrypt our data and protect against spoofing, hacking, and all forms of data manipulation and latency; the receiving end must carefully sanitize inputs and verify requests vigorously, all because the packet has passed through the cloud.
Likewise, if we want to interact with an online "cloud" service, we just swap the word "packet" for "our stuff": our login credentials and even bank account info. We have to worry about availability, bandwidth caps when streaming, and unwanted prying eyes from folks we'd rather not have looking. Everything becomes far riskier because our stuff touched the cloud service, far riskier than physically going to the bank or visiting a friend in person would be. If someone hacks an ATM, the entire bank doesn't lose everyone's credentials. As for the mist-filled variety of cloud: not only can it get things wet, but if you have a big enough cloud, it can strike you with lightning. We must have surge protection and battery backups against this cloud too.
When I hear people talking about embracing the "cloud" I cringe. "To The Cloud!", in my mind means, "Danger Will Robinson!"
As somebody working on building energy topics (Score:2)
That this happens shows me that they really optimize their air conditioning for energy consumption.
Traditionally the approach would have been: "Don't think, just cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this the dumb way, with a lot of energy, even when it's not needed. What we probably see here is that some control loop could not predict (or handle) a drop in the internal load (electrical power) in the data center.
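A hedged sketch of that contrast, with made-up energy figures: the traditional controller runs the subcool-and-reheat stages every hour regardless of conditions, while a demand-based one runs them only when the supply air is actually too humid:

```python
# Contrast of "always dehumidify" vs. "dehumidify on demand". The kWh
# costs and RH readings are invented for illustration only.

SUBCOOL_KWH = 40.0    # assumed energy to chill the air below its dew point, per hour
REHEAT_KWH = 25.0     # assumed energy to warm it back up to the supply setpoint

def always_on(hours: int) -> float:
    """Traditional approach: subcool and reheat every hour, no matter what."""
    return hours * (SUBCOOL_KWH + REHEAT_KWH)

def on_demand(rh_readings, rh_limit: float = 65.0) -> float:
    """Optimized approach: run the dehumidification stages only when RH is high."""
    return sum(SUBCOOL_KWH + REHEAT_KWH for rh in rh_readings if rh > rh_limit)

if __name__ == "__main__":
    hourly_rh = [40, 45, 50, 70, 90, 55, 42, 38]     # mostly dry desert air
    print("always-on:", always_on(len(hourly_rh)), "kWh")
    print("on-demand:", on_demand(hourly_rh), "kWh")
```

The downside, as this incident suggests, is that a demand-based scheme has to react correctly when conditions change faster than its control loop expects.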
Re:As somebody working on building energy topics (Score:4, Interesting)
That this happens shows me that they really optimize their air conditioning for energy consumption.
Traditionally the approach would have been: "Don't think, just cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this the dumb way, with a lot of energy, even when it's not needed. What we probably see here is that some control loop could not predict (or handle) a drop in the internal load (electrical power) in the data center.
The Facebook Oregon datacenter doesn't 'have' air conditioning.
The building is an 'air conditioner'. It's an experimental design...
Re: (Score:2)
Right. I said "air conditioning"; I did not say "air conditioner".
Re: (Score:2)
I work for another data center operator in Central Oregon. We also use ambient air cooling. I don't know if we use the same system Facebook does, to be honest.
Central Oregon is on the northwestern edge of the Great Basin Desert. Summers here are bone dry. Our data center gets so dry we actually have the opposite problem: it gets TOO dry.
Re: (Score:2)
Alright - what happens in a data center when it gets "TOO dry"? I would assume that people entering the building would become static electricity hazards. It would become essential that anyone handling or touching equipment must use a grounding strap. Anything else?
Re: (Score:2)
Re: (Score:3)
Not just the people, but yes, static electricity is the primary concern. Also, I'm told by the people that manage these sorts of things that a "too dry" environment also makes air cooling less effective. Something to do with the fact that a little bit of moisture actually allows the air to carry more heat than if it was 100% dry.
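A rough check of that claim using textbook specific-heat values (the humidity ratios below are just illustrative):

```python
# Specific heat of an air/vapor mixture, per kg of dry air plus its vapor.
# Water vapor's cp (~1.86 kJ/kg.K) exceeds dry air's (~1.005 kJ/kg.K), so a
# little moisture raises the mixture's heat capacity slightly.

CP_DRY_AIR = 1.005    # kJ/(kg.K)
CP_VAPOR = 1.86       # kJ/(kg.K)

def cp_moist_air(humidity_ratio: float) -> float:
    """Heat capacity per kg of dry air carrying humidity_ratio kg of water vapor."""
    return CP_DRY_AIR + humidity_ratio * CP_VAPOR

if __name__ == "__main__":
    for w in (0.000, 0.005, 0.010):                  # kg water per kg dry air
        gain = 100.0 * (cp_moist_air(w) / CP_DRY_AIR - 1.0)
        print(f"w={w:.3f}  cp={cp_moist_air(w):.3f} kJ/kg.K  (+{gain:.1f}%)")
```

It's a small effect on heat capacity alone, so the "less effective cooling" people report when the air is bone dry likely involves other factors too.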
Re: (Score:2)
Ahhh - never thought of that. Makes sense to anyone who has ever had cold survival training. A humid atmosphere leeches warmth from a human body much faster than a dry atmosphere, all other things being equal.
Re: (Score:2)
Actually, traditionally, the cooling and the reheat would each be cycled/modulated by the thermostat/humidistat, not run constantly.
The mistake in this case was not accounting for changes in temperature and humidity, including the fact that the dewpoint temperature of the air can change much more rapidly than the temperature of the solid objects in the room. It really was a boneh
Heat exchange (Score:1)
I'm thinking there has to have been great heat exchange in a system like this.
I severely doubt (Score:2)
That the roof was high enough to actually allow the formation of clouds. Also, I believe you need dust in the air to form clouds, and I would think there would be a lot less of it than normal in a server room. Intense condensation on the roof causing rain is another thing.
Re: (Score:2)
There are others...
http://en.wikipedia.org/wiki/Airship_hangar [wikipedia.org]
No, really, WTF? (Score:3)
Ok, I work for a data center operator. In Central Oregon.
Our data center is so damn dry that most of the time in the summer we're getting alerts about the humidity being too low. How did Facebook fuck this one up?
Phone Conversation (Score:1)
Phone conversation between two data center techs:
Tech 1: "There's a cloud in the Facebook datacenter!"
Tech 2: "So? Facebook is built on cloud technology!"
Tech 1: "No I mean a real cloud!"
Tech 2: "Facebook is built on a server cloud architecture. It IS a real cloud you idiot!"
Tech 1: "There is a real cloud with real rain in the data center you geek retard! Its shutting down the servers!"
Tech 2: "Servers shutting down? Maybe the rainfall service is flooding the network with raindrop packets? That would be an
It's clouds (Score:5, Funny)
I warned them ... (Score:2)
Load balancing (Score:5, Funny)
Re: (Score:2)
You know your datacenter is too big... (Score:2)
...when it develops its own atmospheric systems (including the water cycle).
This actually happened to me around 1990 (Score:2)
In the computer section of the UMass Dartmouth library back around 1990, we had a really humid day one spring/summer inside the library. Part of the library is open several stories tall, with glass windows that let the light shine in. We actually had a few drops fall within the library that day. Luckily, no computers were harmed in this event. Super strange to be in the middle of one of these events, that's for sure.
And data recovery was ..... (Score:1)