
Data Centers Work To Reduce Water Usage

miller60 writes "As data centers get larger, they are getting thirstier as well. A large server farm can use up to 360,000 gallons of water a day in its cooling systems, a trend that has data center operators looking at ways to reduce their water use and their impact on local water utilities. Google says two of its data centers are now 'water self-sufficient.' The company has built a water treatment plant at its new facility in Belgium, allowing the data center to rely on water from a nearby industrial canal. Microsoft chose San Antonio for a huge data center so it could use the local utility's recycled water ('gray water') service for the 8 million gallons it will use each month."
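For scale, a minimal back-of-envelope sketch (assuming essentially all of that 360,000 gallons a day is lost to evaporation, and using water's latent heat of vaporization of roughly 2,260 kJ/kg; the numbers are illustrative, not from the article):

```python
# Rough sanity check: how much heat can 360,000 gallons/day of evaporated
# water carry away?
GAL_TO_KG = 3.785        # mass of 1 US gallon of water, kg
H_FG = 2260.0            # latent heat of vaporization of water, kJ/kg

gallons_per_day = 360_000
kg_per_day = gallons_per_day * GAL_TO_KG
heat_kj_per_day = kg_per_day * H_FG
heat_mw = heat_kj_per_day / (24 * 3600) / 1000   # kJ/day -> kW -> MW

print(f"~{heat_mw:.0f} MW of continuous heat rejection")   # ~36 MW
```

That works out to a few tens of megawatts of continuous heat rejection, which is the right order of magnitude for a large server farm, so the headline figure is at least internally consistent.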
  • by Ungrounded Lightning ( 62228 ) on Thursday April 09, 2009 @06:48PM (#27525637) Journal

    Microsoft chose San Antonio for a huge data center so it could use the local utility's recycled water ('gray water') service for the 8 million gallons it will use each month.

    I don't know about the rest of you, but *I* certainly don't want to breathe the air near a cooling tower fed with gray water. The risk of Legionella from CLEAN water in a cooling tower's spray that was contaminated by a bit of local dirt is bad enough. Imagine the risk from breathing the dust particles from partially treated sewage aerosolized to the tune of 180 gallons per minute.

    Sounds like another good reason to avoid Microsoft sites. (Bet they're doing this elsewhere, too.)

  • Re:San Antonio? (Score:5, Informative)

    by ajlitt ( 19055 ) on Thursday April 09, 2009 @06:53PM (#27525691)

    RTFA. The water loss is because many data centers use evaporative cooling towers.

  • Re:San Antonio? (Score:5, Informative)

    by demonbug ( 309515 ) on Thursday April 09, 2009 @07:02PM (#27525761) Journal

    That's the point - the water does get consumed. The simplest (cheapest) way to cool the water after running it through the data center is to use evaporation towers. As the name implies, you lose a substantial portion of the water to evaporation. Evaporation towers are very efficient in terms of power and material costs, but they go through a lot of water. It costs a lot more to construct a closed-loop system - you need some sort of giant radiator to cool the water. With an evap tower, you just build a hollow box, put some sprayers at the top and a collector at the bottom, and off you go.
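    As a rough sketch of that per-pass loss (assuming a typical 10°F cooling range and standard properties of water; the range is an assumption, not a figure from the comment or the article):

    ```python
    # Fraction of the circulating water evaporated per pass through the tower:
    # the sensible heat pulled out of the loop (c_p * dT) leaves as the latent
    # heat of the fraction that evaporates (h_fg).
    C_P = 4.186      # specific heat of water, kJ/(kg*K)
    H_FG = 2260.0    # latent heat of vaporization, kJ/kg

    range_f = 10.0                  # assumed cooling range in degrees F
    range_k = range_f * 5.0 / 9.0   # convert to kelvins
    evap_fraction = C_P * range_k / H_FG

    print(f"~{evap_fraction:.1%} of the circulating flow evaporates per pass")   # ~1.0%
    ```

    About 1% per 10°F of range sounds small, but the loop recirculates around the clock, which is how the daily totals in the summary add up.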

  • Re:sooooo ? (Score:3, Informative)

    by h4rr4r ( 612664 ) on Thursday April 09, 2009 @07:02PM (#27525765)

    How is it hard to raise the price?
    In fact, just doing that would give folks an incentive not to waste it.

  • by jhw539 ( 982431 ) on Thursday April 09, 2009 @07:36PM (#27526021)
    I'm guessing you must not be from the US, because evaporation-based cooling systems are THE standard for state-of-the-art industrial and commercial cooling in the US. If you have over 250 tons of load, you have an open cooling tower - dead-standard ASHRAE design. Evaporating water via a cooling tower is THE way you reject heat. If you want to do it dry (as is common in Europe, due to much higher fear of Legionella and local code officials freaking out about it), it is FAR less efficient in almost every case; even in monsoon climates like Bangalore, a wet cooling tower is more efficient.
  • Re:San Antonio? (Score:4, Informative)

    by jhw539 ( 982431 ) on Thursday April 09, 2009 @07:51PM (#27526135)
    Nope. If you're pushing 15 MW out of a couple of towers 24/7, they will not freeze up. You do run the cooling tower fans backwards for a few minutes every once in a while to thaw any ice that forms from splashing on the intake louvers, but the tower itself doesn't freeze up. The last time I put a tower into a 0°F design climate, I used a dry sump, so the basin stayed dry whenever the tower wasn't running.

    An annoying fact of physics is that when it gets really cold, evaporative cooling becomes less effective. The air just can't hold much water, and it's the phase change from liquid to vapor that gets rid of your heat. So it's not freezing that makes low temperatures worrisome, but loss of capacity.
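    To put a number on the "air just can't hold much water" point, here is a small sketch using the Magnus approximation for saturation vapor pressure (an approximation over liquid water; the constants vary slightly between sources):

    ```python
    import math

    def saturation_vapor_pressure_hpa(temp_c: float) -> float:
        """Magnus approximation for saturation vapor pressure over water, in hPa."""
        return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

    cold = saturation_vapor_pressure_hpa(-18.0)   # roughly 0 F
    warm = saturation_vapor_pressure_hpa(25.0)    # a warm day

    print(f"saturation pressure at -18 C: {cold:.1f} hPa")   # ~1.5 hPa
    print(f"saturation pressure at  25 C: {warm:.1f} hPa")   # ~31.6 hPa
    print(f"warm air holds roughly {warm / cold:.0f}x more moisture at saturation")
    ```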

  • Re:Idea (Score:4, Informative)

    by jbengt ( 874751 ) on Thursday April 09, 2009 @09:16PM (#27526807)
    The water is treated with (usually nasty) chemicals to prevent biological contamination, scale buildup, and corrosion.
    The cooling effect comes from the water that is evaporated - that's about half of the water usage they're talking about.
    What's not evaporated is recirculated; the treatment chemicals and contaminants get concentrated by the evaporation, so some of the water is bled off into the sewer and fresh water is added - that's roughly the other half of the water usage.
    It is definitely NOT drinkable; just ask Erin Brockovich.
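    A minimal sketch of that bookkeeping, using the standard cycles-of-concentration relationship (the flow rate and the 2 cycles below are assumptions chosen to match the roughly half-and-half split above, not figures from the article):

    ```python
    # Cooling tower water balance: makeup = evaporation + blowdown (drift ignored).
    # Blowdown depends on how far dissolved solids may concentrate, expressed as
    # "cycles of concentration" C:  blowdown = evaporation / (C - 1)
    def water_balance(evaporation_gpm, cycles):
        blowdown_gpm = evaporation_gpm / (cycles - 1.0)
        makeup_gpm = evaporation_gpm + blowdown_gpm
        return makeup_gpm, blowdown_gpm

    makeup, blowdown = water_balance(evaporation_gpm=100.0, cycles=2.0)
    print(f"makeup {makeup:.0f} gpm = 100 gpm evaporated + {blowdown:.0f} gpm blown down")
    # At 2 cycles, evaporation and blowdown are equal -- the half-and-half split above.
    ```

    Running at more cycles of concentration shrinks the blowdown share, which is one of the levers operators use to cut total water draw.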
  • Re:sooooo ? (Score:3, Informative)

    by jbengt ( 874751 ) on Thursday April 09, 2009 @09:58PM (#27527087)

    Yes, but if they move the hot water back into the grid and take in more cold water, they no longer need the evaporators.

    Yes, and then they'll only need about 30 times as much water!
    Not to mention that I don't want to drink any of the water they put back into the grid after it goes through some faceless company's ill-maintained cooling equipment.
    Anyway, you can't just pump fresh water through refrigeration equipment without destroying it through corrosion, scale buildup, and biological contamination.
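    The "about 30 times" figure is roughly what falls out of comparing sensible once-through cooling with latent evaporative cooling (a sketch assuming a 15 K allowable temperature rise for the once-through water; the exact ratio depends on that assumption):

    ```python
    # Water needed per unit of heat is inversely proportional to the energy each
    # kilogram can carry away: c_p * dT for once-through, h_fg for evaporation.
    C_P = 4.186      # specific heat of water, kJ/(kg*K)
    H_FG = 2260.0    # latent heat of vaporization, kJ/kg

    delta_t = 15.0                       # assumed allowable temperature rise, K
    sensible_kj_per_kg = C_P * delta_t   # heat carried per kg of once-through water
    ratio = H_FG / sensible_kj_per_kg

    print(f"once-through cooling needs ~{ratio:.0f}x the water of evaporating it")   # ~36x
    ```

    A larger allowable rise brings the ratio down (20 K gives about 27x), so "about 30 times" is in the right ballpark.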
