
Facebook Suffers Actual Cloud In Oregon Datacenter

Posted by timothy
from the but-the-cloud-is-the-computer dept.
An anonymous reader writes "The Register carries the funniest, most topical IT story of the year: 'Facebook's first data center ran into problems of a distinctly ironic nature when a literal cloud formed in the IT room and started to rain on servers. Though Facebook has previously hinted at this via references to a 'humidity event' within its first data center in Prineville, Oregon, the social network's infrastructure king Jay Parikh told The Reg on Thursday that, for a few minutes in Summer, 2011, Facebook's data center contained two clouds: one powered the social network, the other poured water on it.'"


  • Obligatory (Score:5, Insightful)

    by Anonymous Coward on Sunday June 09, 2013 @01:41AM (#43950553)

    And nothing of value was lost.

  • by anthony_greer (2623521) on Sunday June 09, 2013 @01:41AM (#43950555)

    I don't see any pics in the linked article. Someone has to have pictures of this if it happened...

    • by Wild Wizard (309461) on Sunday June 09, 2013 @01:51AM (#43950609) Journal

      This is one of those RTFA-to-get-to-the-RA type stories.

      The link next to the quote is the one you want :-
      http://www.opencompute.org/2011/11/17/learning-lessons-at-the-prineville-data-center/ [opencompute.org]

      • by anthony_greer (2623521) on Sunday June 09, 2013 @01:55AM (#43950625)

        Saw that, and I think the issue is that the sudden humidity change caused condensation, which is not terribly uncommon if prompt action isn't taken upon AC failure in a humid climate... I don't see a "cloud in the room".

        Hype to sell newspapers...and link bait...

        • by Anonymous Coward

          Prineville is high desert and is very dry.

        • by sjames (1099)

          Given that it was a condensing environment to the point that even the power supplies (typically hotter than the incoming air) got wet enough to short out, yes a cloud formed in the data center. There would have to have been actual droplets of water in the air.
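          The physics behind this comment can be sketched with the Magnus dew-point approximation, a standard textbook formula. The coefficients and the example reading below are illustrative assumptions, not anything taken from Facebook's telemetry:

          ```python
          import math

          def dew_point_c(temp_c, rel_humidity_pct):
              """Approximate dew point via the Magnus formula.

              Uses common Magnus-Tetens coefficients; accuracy is roughly
              +/-0.4 C over ordinary room-temperature conditions.
              """
              a, b = 17.62, 243.12
              gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
              return b * gamma / (a - gamma)

          # At roughly the reported cold-aisle conditions (~27 C, 95% RH),
          # the dew point lands within about a degree of air temperature,
          # so any surface even slightly cooler than the airflow collects
          # condensation, and air that cools at all starts forming droplets.
          print(dew_point_c(26.7, 95.0))
          ```

          At 95% relative humidity there is essentially no margin: the air is a degree or so away from saturation, which is consistent with water condensing even on gear that is normally warmer than the intake air.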

    • by Anonymous Coward

      There are; those photos are just private.

    • by antdude (79039)

      Or better, videos. :)

  • Obligatory (Score:5, Funny)

    by identity0 (77976) on Sunday June 09, 2013 @01:46AM (#43950585) Journal

    Welcome to Oregon, it rains a lot.

    • Re:Obligatory (Score:5, Informative)

      by Ol Biscuitbarrel (1859702) on Sunday June 09, 2013 @01:54AM (#43950619)

      Sure, if you think a 10.4 inch [usclimatedata.com] yearly average is a lot. The east side of the state is actually quite arid; the west side is quite soggy in the Coast Range and along the coast, but the Willamette Valley, where most of the population lives, isn't exceptionally rainy. It's just subject to never-ending spells of overcast weather; other parts of the country actually have higher annual precipitation.

    • Re: (Score:3, Interesting)

      by donaldm (919619)

      Welcome to Oregon, it rains a lot.

      From the Article

      This resulted in cold aisle supply temperature exceeding 80F and relative humidity exceeding 95%. The Open Compute servers that are deployed within the data center reacted to these extreme changes. Numerous servers were rebooted and few were automatically shut down due to power supply unit failure.

      WTF, 80 deg F (approx 27 deg C) is quite warm in a data centre, especially in a "cold aisle", and 95% humidity is criminal.

      Facebook learned from the mistakes, and now designs its servers with a seal around their power supply, or as Parikh calls it, "a rubber raincoat."

      When designing a data centre you have to plan for the temperature range in which the equipment inside operates optimally. In addition you have to keep the humidity within manufacturer-recommended limits, since too low results in static electricity, and too high can cause condensation on the electrical equipment. Rubber seals may protect power supplies, although
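      This "plan for a temperature range, keep humidity within limits" advice can be expressed as a simple envelope check. The default limits below only approximate the ASHRAE recommended envelope for typical IT gear; treat them as illustrative assumptions, not a quote from any standard:

      ```python
      def check_envelope(temp_c, rh_pct, temp_range=(18.0, 27.0), rh_max=60.0):
          """Flag readings outside a recommended operating envelope.

          Defaults roughly follow the ASHRAE recommended range for common
          IT equipment; real deployments should use the limits published
          for their specific hardware.
          """
          problems = []
          if not temp_range[0] <= temp_c <= temp_range[1]:
              problems.append(f"temperature {temp_c} C outside {temp_range}")
          if rh_pct > rh_max:
              problems.append(f"relative humidity {rh_pct}% above {rh_max}%")
          return problems

      # The incident's reported cold-aisle reading trips both limits:
      print(check_envelope(27.5, 95.0))
      ```

      With the reported 80F-plus and 95% RH, both checks fire at once, which is exactly the "criminal" combination the comment objects to.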

      • Do you actually know how the Facebook Oregon datacenter works?

        It doesn't *have* HVAC. The building *is* the HVAC: the entire building is one very, very large HVAC unit.

        HVAC units have economies of scale; as you build them larger they get more efficient. That's what Facebook was aiming at. Unfortunately it's a relatively new development in building and datacenter design and clearly has some bugs to work out!

      • Re: (Score:3, Informative)

        by Anonymous Coward

        WTF 80 deg F (approx 27 deg C) is quite warm in a Data-centre especially in a "cold aisle" and 95% humidity is criminal.

        You're used to classic datacentres, where the goal was "shove as much cold air into them as possible", i.e. "the lower the temperature the better". It all depends on how the datacentre was built, how its cooling system is/was engineered, and a nearly endless number of other variables. References for you to read (not skim); the study in the PDF will probably interest you the most:

        http://www.datacenterknowledge.com/archives/2011/03/10/energy-efficiency-guide-data-center-temperature/ [datacenterknowledge.com]
        http://www.geek.com/chips [geek.com]

        • 80F does make the server fans work harder (higher RPMs), though. The higher your temps, the less margin of error you have for making corrections in a DC.

    • by faedle (114018)

      Only in the western third. Prineville is on the northwestern edge of the Great Basin Desert, and in fact gets about 11 inches of rainfall a year.

      By comparison, Phoenix, Arizona, gets 9.

  • by Anonymous Coward

    as well, and I'll start believing in a just $deity

    • There were some power units failing. I can imagine that this happened with sparks, which are a sort of small lightning.

  • Caller: "There's a cloud in our cloud, come immediately!"

    Support: "Speak loud, I cannot hear you."

    Caller: "No, it's cloudy in our cloud."

    Support: "Yes, speak loud."

    Caller: "Yes, there is a big cloud."

    Support: "Yes, you must speak loud, that's what I said."

    Caller: "You must have a cloud also, nothing's making sense. Let's try this: bring some sun."

    Support: "Come soon?"

    Caller: "Yeah, that too. Come soon with sun."

    Support: "I can't hear you, my connection is cloudy."

  • by Anonymous Coward

    Superintendent Chalmers: A rain storm? At this time of year? At this time of day? In this part of the country? Localized entirely within your datacentre?

  • in nasa's vehicle assembly building

  • The first cloud would be the humidity and condensation sort. The second cloud would be the online service itself. The third cloud would be the open Internet between the endpoints in a network graph. [infront.com]

    What do all these clouds have in common? They're dangerous. The fewer clouds in your diagram, the more you know about your network architecture, latency, and data integrity. The fewer clouds the better! When a packet goes into the shroud of the cloud in the diagram, there's a much higher chance we'll never see it again. This is the cloud where we must encrypt our data and protect against spoofing, hacking, and all forms of data manipulation and latency. The receiving end must be very careful to sanitize the inputs and verify the requests rigorously, all because the packet has encountered the cloud. Likewise, if we want to interact with an online "cloud" service, the name shifts from "packet" to "our stuff": our login credentials and even bank account info. We have to worry about availability, bandwidth caps when streaming, and unwanted prying eyes from folks we may not want looking. Everything becomes far more risky because our stuff touched the cloud service, risks that physically going to the bank or visiting a friend in person would not be subject to. If someone hacks the ATM, the entire bank doesn't lose everyone's credentials. As for the mist-filled variety of cloud: it can not only get you wet, but, if you have a big enough cloud, it can strike you with lightning. We must have surge protection and battery backups against this cloud too.

    When I hear people talking about embracing the "cloud" I cringe. "To The Cloud!", in my mind means, "Danger Will Robinson!"

  • That this happened shows me that they really optimize their air conditioning for energy consumption.

    Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this the dumb way, with a lot of energy, even when it's not needed. What we probably see here is that some controller could not predict or handle a drop in the inner load (electrical power) in the data center.

    • by myowntrueself (607117) on Sunday June 09, 2013 @05:41AM (#43951275)

      That this happened shows me that they really optimize their air conditioning for energy consumption.

      Traditionally the approach would have been: "Don't think; cool down and re-heat the air constantly to dehumidify it sufficiently." So traditionally you do this the dumb way, with a lot of energy, even when it's not needed. What we probably see here is that some controller could not predict or handle a drop in the inner load (electrical power) in the data center.

      The Facebook Oregon datacenter doesn't 'have' air conditioning.

      The building is an 'air conditioner'. It's an experimental design...

      • by drolli (522659)

        Right. I said "air conditioning"; I did not say "air conditioner".

      • by faedle (114018)

        I work for another data center operator in Central Oregon. We also use ambient air cooling. I don't know if we use the same system Facebook does, to be honest.

        Central Oregon is the northwestern edge of the Great Basin desert. Summers here are bone dry. Our data center gets so dry we actually have the opposite problem: it gets TOO dry.

        • Alright - what happens in a data center when it gets "TOO dry"? I would assume that people entering the building would become static electricity hazards. It would become essential that anyone handling or touching equipment must use a grounding strap. Anything else?

          • Would hard drives work in very low humidity? Rubber mounting components (if any) in the servers might start crumbling.
          • by faedle (114018)

            Not just the people, but yes, static electricity is the primary concern. Also, I'm told by the people who manage these sorts of things that a "too dry" environment also makes air cooling less effective. Something to do with the fact that a little bit of moisture actually allows the air to carry more heat than if it were 100% dry.
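            This point checks out on paper: water vapor's specific heat (about 1.86 kJ/kg.K) is nearly double dry air's (about 1.006), so even a small humidity ratio raises the mixture's heat capacity. A back-of-the-envelope sketch using standard textbook constants, for illustration only:

            ```python
            def moist_air_cp(humidity_ratio_kg_per_kg):
                """Specific heat of moist air per kg of dry air (kJ/kg.K).

                Standard approximate values: dry air ~1.006 kJ/kg.K,
                water vapor ~1.86 kJ/kg.K. The humidity ratio is kg of
                vapor per kg of dry air.
                """
                CP_DRY_AIR = 1.006
                CP_WATER_VAPOR = 1.86
                return CP_DRY_AIR + humidity_ratio_kg_per_kg * CP_WATER_VAPOR

            # Even 10 g of vapor per kg of dry air raises the mixture's
            # heat capacity by roughly 2% over bone-dry air:
            print(moist_air_cp(0.0), moist_air_cp(0.010))
            ```

            A percent or two per pass through the racks is small but real, which matches the operators' observation that bone-dry desert air cools slightly worse than mildly humid air.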

              • Ahhh - never thought of that. Makes sense to anyone who has ever had cold survival training. A humid atmosphere leaches warmth from a human body much faster than a dry atmosphere, all other things being equal.

    • by jbengt (874751)

      Traditionally the approach would have been: "Dont think, cool down and re-heat the air constantly to dehumidify it sufficiently"

      Actually, traditionally, the cooling and the reheat would each be cycled/modulated by the thermostat/humidistat, not run constantly.
      The mistake in this case was not accounting for changes in temperature and humidity, including the fact that the dew point temperature of the air can change much more rapidly than the temperature of the solid objects in the room. It really was a boneheaded mistake.

  • I'm thinking there has to have been a great deal of heat exchange going on in a system like this.

  • That the roof was high enough to actually allow the formation of clouds. Also, I believe you need dust in the air to form clouds, and I would think there would be a lot less than normal in a server room. Intense condensation on the roof causing rain is another thing.

  • by faedle (114018) on Sunday June 09, 2013 @11:03AM (#43952595) Homepage Journal

    Ok, I work for a data center operator. In Central Oregon.

    Our data center is so damn dry that most of the time in the summer we're getting alerts about the humidity being too low. How did Facebook fuck this one up?

  • Phone conversation between two data center techs:

    Tech 1: "There's a cloud in the Facebook datacenter!"

    Tech 2: "So? Facebook is built on cloud technology!"

    Tech 1: "No I mean a real cloud!"

    Tech 2: "Facebook is built on a server cloud architecture. It IS a real cloud you idiot!"

    Tech 1: "There is a real cloud with real rain in the data center, you geek retard! It's shutting down the servers!"

    Tech 2: "Servers shutting down? Maybe the rainfall service is flooding the network with raindrop packets? That would be an

  • It's clouds (Score:5, Funny)

    by rastos1 (601318) on Sunday June 09, 2013 @11:41AM (#43952905) Homepage
    It's clouds all the way ... up?
  • ... they shouldn't have hired that Joe Btfsplk [wikipedia.org] guy for IT support.

  • by gmuslera (3436) on Sunday June 09, 2013 @02:10PM (#43953959) Homepage Journal
    Both clouds were leaking and pissing off users. Facebook must have real sysadmins [xkcd.com].
  • ...when it develops its own atmospheric systems (including the water cycle).

  • In the computer section of the UMass Dartmouth Library back around 1990, we had a really humid day one spring/summer inside the Library. There is a part of the library that is open several stories tall and glass windows let the light shine in. We actually had a few drops fall within the library that day. Luckily, no computers were harmed in this event. Super strange to be in the middle of one of these events, that's for sure.

  • And I'm sure that they were happy to be able to ask their friends in the NSA for a backup copy of all their data for restoration :)
