Security | The Internet

DNS Root Servers Attacked 311

Posted by kdawson
from the flexing-muscles dept.
liquidat and others wrote in with the news that the DNS Root Servers were attacked overnight. It looks like the F, I, and M servers felt the attack and recovered, whereas G (US Department of Defense) and L (ICANN) did less well. Some new botnet flexing its muscle perhaps? AP coverage is here.

  • Thank goodness... (Score:5, Interesting)

    by NerveGas (168686) on Tuesday February 06, 2007 @05:50PM (#17912852)
    ... for resolving caches.
    • by kevin_conaway (585204) on Tuesday February 06, 2007 @06:01PM (#17913122) Homepage

      ... for resolving caches.

    • ...Botnet disabled, job done!
      • by NerveGas (168686) on Tuesday February 06, 2007 @07:22PM (#17914406)
        It's nice to think that, but I don't *entirely* agree with it.

        Microsoft is an easy target, given the insanely large user-base. However, if those users suddenly switched to Linux, it's doubtful that their practices would change - they'd still install whichever distribution looked the best, leave 134 unneeded services enabled by default, open unsafe attachments, and never update their computers.

        In every operating system I've seen yet, security is an inconvenience. While you and I think that the tradeoff is worth it, we will always be outnumbered by people who think that it isn't. People who log in as "Administrator" would just as quickly read their email and browse porn sites as "root". Sad, but true.
        • by jamesh (87723) on Tuesday February 06, 2007 @08:02PM (#17914872)

          In every operating system I've seen yet, security is an inconvenience.
          It's nice to read something occasionally not written by a zealot :)

          One of Vista's features is the way that even if you log in with admin privileges, you don't actually have them until you jump through an extra hoop, and even then I think you have them only as long as necessary. If it has been implemented correctly, it should certainly shorten the amount of self-hanging rope available to the average user.

          I'm also sure that there are lots of people working on a hack to disable this right now. (I've not used Vista so I may be misinformed - there may be a way to disable it easily anyway?)

          And even without that, enough people are gullible enough that if a web site says that to use the available features correctly you need to "follow these simple instructions", it will be done.
          • Re: (Score:3, Informative)

            by Joe U (443617)
            I'm also sure that there are lots of people working on a hack to disable this right now. (I've not used Vista so I may be misinformed - there may be a way to disable it easily anyway?)

            Yes, it can be disabled by the user. The user must have Administrative access to disable it, so that might help limit it.

            (Control Panel-->User Accounts-->Turn user account control on or off)

          • by scatters (864681) <mark@scatters.net> on Tuesday February 06, 2007 @09:25PM (#17915578)
            Are you kidding? I've been using Vista since RTM on my main work system, and the UAC prompts are enough to:

            1: Drive one completely insane.
            2: Desensitize one to the point where one clicks 'Yes' on any dialog that pops up.
            3: Cause one to disable UAC prompting.

            Examples:
            You want to look at the event log... well, you're gonna need some extra admin privileges. Are you sure you want to look at the event log?

            You want to run Visual Studio 2005... that complains too. Would someone please explain to me why running an IDE requires admin fucking rights!

            Microsoft's approach of security by nagging the user to death is fundamentally flawed.

            I swear, if I hadn't turned off UAC prompting, there would be a Craigslist posting right now for a slightly shot-gunned compy.

            • by palad1 (571416) on Wednesday February 07, 2007 @05:42AM (#17918688)
              Visual Studio 2005 needs to register some COM components at runtime iirc, thus admin rights are involved.
        • by TapeCutter (624760) on Tuesday February 06, 2007 @10:23PM (#17916048) Journal
          Exactly, and I also get sick of "experts" ridiculing and blaming the victims of vandalism and crime for messing up "their" playground. Nobody blames a homeowner when a thief kicks down their flimsy door and robs them, or a vandal rips up their mail and knocks down the letterbox.

          As I have been doing for nearly two decades, I set up a friend's PC just before Christmas and told him "just say no" to unknown applications. He had no trouble until about a week ago, when he got a message from the virus scanner about a trojan; he didn't understand the options, so he just pulled the plug from the wall, called his bank, and waited until the next time he saw me.

          The first thing I said to him was... "you said 'yes', didn't you?"... he complained bitterly... "No porn videos, no screensavers," I asked in mocking accusation... "is a screen saver an application?" he replied with a puzzled look. I booted it up, showed him how the scanner gets rid of the trojan, and admired his new screen saver. The virus scanner's options were something like "vault" and "delete"; there wasn't a "no" or "cancel" button, so he panicked and enacted the "emergency procedure" I had advised previously.

          The guy is not an idiot; he is middle-aged but had virtually nil exposure to PCs until he went out and bought one. He restores antique furniture for a living and is over the moon about eBay and other stuff to do with furniture, but has ignored FPS games. Not that he doesn't like them; he has a PS3 and loves it because "it doesn't do things that are not in the manual". For him the curve is still too steep (and life is too short) to learn how to install and register games with confidence.
        • Re: (Score:3, Interesting)

          by Fordiman (689627)
          And most Linux users would scream and freak if there were an automatically set-up cron job to apt-get update/upgrade once a week - but they will often do the same thing themselves.

          I openly admit to being one of those.
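The weekly update job described above is a one-line cron entry; a hypothetical sketch (Debian-style apt-get; the schedule and flags are assumptions, not a recommendation):

```
# Hypothetical root crontab entry: refresh package lists and apply upgrades
# unattended every Sunday at 04:00 - exactly the automation many users resist.
0 4 * * 0   apt-get update && apt-get -y upgrade
```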
  • Oh (Score:5, Funny)

    by Anonymous Coward on Tuesday February 06, 2007 @05:51PM (#17912876)
    Oh!!! So that's what that button does.
    • Re: (Score:3, Funny)

      by jd (1658)
      DeeDee!!! How many times must I tell you not to press any buttons?
    • Re: (Score:3, Funny)

      by Ecuador (740021)

      Yeah, now let's try the one labeled "Omega 13".
      Hmm... Did it do anything?
  • by Ralph Spoilsport (673134) on Tuesday February 06, 2007 @05:51PM (#17912878) Journal
    OK you South Korean Hackers... What say we let the Dear Leader north of your border come down and show you a little something about responsibility...hmmmm???

    Stupid little freaks.

    RS

    • by NerveGas (168686) on Tuesday February 06, 2007 @05:56PM (#17912998)
      They don't go into a lot of detail, but it's entirely possible that the bots in South Korea were, in fact, being controlled from somewhere else. I'd say that it's even *likely*.
      • So it could have been a government exercise turned into a convenient "ooh-ahh!" media story?
        • by NerveGas (168686)
          It could, but it's more likely that it's either (a) a profit-driven scoundrel or (b) a bored young male somewhere in the world, testing something out. Cyber-crime isn't just for Nigerian kids in Internet cafes or bored young punks; organized crime groups from all over the world have moved quite heavily into the scene.
      • by jamesh (87723)
        Ah... maybe you've pinpointed the motive behind this attack. It's a setup to make everyone think that South Korea is up to no good...

        And just when I thought I had someone to blame for the 4 Cisco router crashes I've seen in the last 24 hours (3 yesterday, 1 today; they won't let DNS traffic pass until the affected unit is rebooted).
    • Re: (Score:3, Insightful)

      by Anonymous Coward
      OK you South Korean Hackers...

      All that means is that the botnet was mostly infected computers from South Korea; given the penetration of broadband in that nation, it's not that surprising. And if it leads to the rest of the internet cutting off South Korea, that benefits the North.

      Stupid little freaks.

      You would think Slashdotters would at least understand this basic fact. *sigh*

    • by erbmjw (903229) on Tuesday February 06, 2007 @06:00PM (#17913090)
      Perhaps you and I are reading the article differently; is this the passage you are referring to?

      Experts said the hackers appeared to disguise their origin, but vast amounts of rogue data in the attacks were traced to South Korea.
      That doesn't say to me that the attack originated in South Korea, but rather that many computers in South Korea were being used as botnet zombies.
      • Re: (Score:2, Insightful)

        by Rithiur (736954)
        With the country's software locked to Windows and Internet explorer [slashdot.org], is this honestly a big surprise?
      • South Korea has great residential broadband. It must be a premium place to recruit zombies.
      • Re: (Score:2, Interesting)

        by MadHakish (675408)
        I think the fact that South Korea has something like 99% of connected computers running windows makes them an easy target for infectable machines just based on sheer volume. Combine that with the outstanding penetration of very high-speed internet connectivity and just about everything in the country is running an OS with a poor history of security on a very fast connection..

        In order to make a secure transaction over the internet in South Korea you have to be able to run IE, and ActiveX controls to establis
        • Re: (Score:2, Informative)

          by gregleimbeck (975759)
          Couple this with the fact that piracy is rampant in South Korea, and that since last year Microsoft has not allowed a number of updates to copies of Windows that haven't passed WGA validation.
    • by cypherz (155664) *
      If I had mod points I would mod you up just for the Firesign Theatre reference.

      "aw heck no, I'm gonna take off my shoes, climb a tree and learn to play the flute!"

    • by Anonymous Coward on Tuesday February 06, 2007 @07:16PM (#17914334)
      South Korea has :
      1. Almost a 100% windows monoculture (really), because they standardised on an ActiveX control for secure banking etc before SSL was standardised, and everything still needs it
      2. Dirt cheap, fast broadband
      3. Fairly rampant piracy, hence many unpatched machines
      Put it together and you get botnet paradise.
      • Re: (Score:3, Interesting)

        by stuntpope (19736)
        From my anecdotal experience:

        4. A dismissive attitude towards computer security, safety precautions, environmental concerns, building codes, etc. I frequently hear "why bother?" as it's considered an inconvenience, likely cutting into profits, and only a dummy plays by the rules.
    • for those who modded me flamebait, I was trying to be funny.

      Gads - some people have no sense of humour.

      RS

  • And...??? (Score:4, Insightful)

    by Anonymous Coward on Tuesday February 06, 2007 @05:52PM (#17912886)
    Um, so how many times a day do the root servers get attacked? No, wait, an hour, a minute... Like a ba-gillion? These things happen every day, so what's new? It's not like they haven't figured out the whole failover/fault-tolerance thing. You'd have to nuke 'em to get them to stop running.
    • Hey... that's not a bad idea.
    • It's not like they haven't figured out the whole failover/fault tolerance thing.

      That's kind of the point here, actually. Several of the root servers do not have any redundancy. You can see the list at http://www.root-servers.org/ [root-servers.org]. In particular, the A, B, D, E, G, H, and L servers have only a single location apiece.

      F, I, J, K, and M, on the other hand, are heavily redundant and have multiple geographic locations, routed via Anycast, so a single client only "sees" the server nearest to them. This makes them difficult to DDoS, because a zombie in S. Korea pinging the J server would be sending packets to the server in Seoul, while one in California would get the one in Mountain View.

      What's odd, looking at the list, is that anyone operating something so critical to the internet infrastructure wouldn't develop some geographic and systems redundancy; unfortunately, I suspect that the government agencies tasked with these responsibilities probably don't keep it at the very top of their priority lists when allocating resources and funding.
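The anycast behavior described above can be sketched as a toy model (the locations and routing choices below are invented for illustration; real anycast selection happens via BGP, not a lookup table):

```python
# Toy model: with anycast, every instance advertises the same address, and
# each client reaches its topologically nearest instance. A regional botnet
# therefore concentrates its flood on one instance while other regions are
# served normally. Regions and routing here are illustrative assumptions.

instances = {"seoul": 0, "mountain_view": 0, "london": 0}

# Hypothetical "nearest instance" mapping standing in for BGP route selection.
nearest = {
    "south_korea": "seoul",
    "california": "mountain_view",
    "france": "london",
}

def send_queries(region, count):
    """Deliver `count` queries from `region` to its nearest anycast instance."""
    instances[nearest[region]] += count

send_queries("south_korea", 1_000_000)  # regional botnet flood
send_queries("california", 10)          # normal traffic elsewhere
send_queries("france", 10)

print(instances["seoul"], instances["mountain_view"], instances["london"])
# → 1000000 10 10
```

A single-location server, by contrast, would receive the sum of all three flows.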
      • Several of the root servers do not have any redundancy.

        Having multiple root servers IS the redundancy - originally, and to some extent even now. Big-time redundancy within each one is just (really strong) suspenders to supplement the belt.

        A non-redundant root server is still useful - even if perhaps not always up and/or not capable of drinking as large a firehose of requests as some giant, geographically-diverse, multiple-cluster. All it takes is one response from one server to get your nameserver's searc
  • slashdotted (Score:5, Funny)

    by deopmix (965178) on Tuesday February 06, 2007 @05:53PM (#17912916)
    It's fine, they're just slashdotted; give it an hour or two and they'll be running just fine again.
  • Perhaps it is unfair of me to say so, but I get the distinct impression that large governmental organizations do not do very well in terms of security until the attack vector is pointed out to them. After that, sometimes they do very well (often using overkill methods), sometimes they do less well - but something usually has to kick the learning curve process into gear.
    • by timeOday (582209) on Tuesday February 06, 2007 @06:11PM (#17913288)
      Don't make the assumption that all DNS servers were attacked equally though.
      • by Panaqqa (927615) *
        Very good point. But if Defense was in fact targeted and attacked more heavily, then that has potentially ominous undertones beyond the basic fact of a partially successful attack.
    • by Flavio (12072) on Tuesday February 06, 2007 @06:30PM (#17913650)
      You suggest that the Department of Defense's nameserver is badly managed, making an argument by analogy concerning "large governmental organizations". Since you haven't provided a technical argument, your accusation has no merit. Your "distinct impression" is pure speculation.

      But congratulations on getting everyone riled up.
    • Yeah, some idiot posting on /. thinks the guys who invented the internet don't know their stuff. That's entirely fair. Stupid and moronic, but fair.
    • by jd (1658)
      Uh, no. Any organization that does not take IT security seriously will fare badly until the attack vector is not only pointed out to them, but is used to swat them around the ear until they get the message. The DoD is sometimes in this category, and sometimes it isn't. They do manage to go all-out whatever category they happen to be in in that field for that week, so when they do badly, it's grotesquely obvious. When they do well, such as when the BASS group did their Internet Security Audit, it's much less
  • and? (Score:3, Insightful)

    by ReTay (164994) on Tuesday February 06, 2007 @05:54PM (#17912930)
    Is it just me or is going after servers that people expect up to 3 business days to update not the best way to go? You would have to sustain the attack for a long time for the average joe to notice.
    Not that I am complaining, one less bot net to worry about.
    Good thing that they apparently never heard of routers though.
    • Re:and? (Score:5, Insightful)

      by NerveGas (168686) on Tuesday February 06, 2007 @06:01PM (#17913114)
      While it's not exactly an entirely effective attack - resolving caches will, for the most part, insulate end-users from the effects for anywhere from a few hours to a few days - it could be simply an experiment. If you suppose that this was perpetrated by someone who is intent on causing mayhem, they could have been testing how well their attack would work, in order to plan a much larger one which would bring down *all* of the root name servers, and for long enough to really make people feel the squeeze.

      It's a dumb, brute-force type of approach. A much, MUCH more effective way would be to simply find an appropriate flaw in IOS to exploit...

      steve
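The insulation that resolving caches provide, as described above, can be sketched with a toy cache (the names, TTL handling, and 2-day figure are simplifying assumptions):

```python
# Toy sketch of why resolving caches insulate end users from a root outage:
# a cached entry keeps answering until its TTL expires, so an attack has to
# be sustained longer than the TTL before most users notice anything.

class TinyResolverCache:
    def __init__(self):
        self.store = {}  # name -> (answer, absolute expiry time)

    def resolve(self, name, now, roots_up):
        entry = self.store.get(name)
        if entry and entry[1] > now:
            return entry[0]                 # answered from cache
        if not roots_up:
            return None                     # cache cold/expired AND roots down
        answer = f"delegation-for-{name}"
        self.store[name] = (answer, now + 172_800)  # cache with a 2-day TTL
        return answer

cache = TinyResolverCache()
cache.resolve("com", now=0, roots_up=True)                 # primed beforehand
ok = cache.resolve("com", now=3_600, roots_up=False)       # 1 hour into an attack
stale = cache.resolve("com", now=200_000, roots_up=False)  # after the TTL lapses
print(ok is not None, stale is None)  # → True True
```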
      • Not all of the root servers may sit behind Cisco equipment
        • Re: (Score:3, Interesting)

          by NerveGas (168686)
          It doesn't matter, it's virtually guaranteed that the path between your resolver and the root name servers involves at least *one* Cisco router.

          And in the unlikely event that it doesn't, it's just as likely that the path between you and where you want your traffic to go involves at least one Cisco router. Between the two, if someone were clever, capable, and dedicated, they could disrupt enough of the Internet to make it 99% unusable.

      • Motive? (Score:3, Interesting)

        >they could have been testing how well their attack would work

        Good insight, but why attack the root servers in the first place?

        The days when people tried to burn down the Internet just to watch the flames dancing ended a few years ago. It's about profit now. If a crook launches a DDoS on a gambling site the day before the Super Bowl, that crook can extort money. Crooks can also make crooked money from click fraud or spam runs.

        Where's the money in taking down the root DNS servers? Why would a crook throw
        • by NerveGas (168686)
          > Good insight, but why attack the root servers in the first place?

          There are still people who see the Internet as being one of the roots of all evil, or as it being one large American/Western institution, and there are still people who just like to be jerks.

          The first two haven't (so far) really had the right combination of resources to do something terribly bad to the Internet, and as time goes on, the last one has definitely faded away - but that's not to say that they're not out there.

          We seem to agree
          • Re: (Score:3, Interesting)

            by Vengeance_au (318990)

            It's also possible that the root servers were just a test target, that once they're ready, they'll go after their *real* target.
            To extend that thought a little bit - being able to show potential clients that your botnet has taken down the DOD and ICANN DNS servers would be a real sweet selling proposition....
    • Re:and? (Score:4, Interesting)

      by timeOday (582209) on Tuesday February 06, 2007 @06:09PM (#17913250)

      Not that I am complaining, one less bot net to worry about.
      No kidding. I'm always impressed how I never even notice these things until they hit the news afterwards. I don't think there's been anything you could reasonably call a general Internet outage in the last 15 years. I guess you could say of course not, because the Internet isn't "a thing," it's a bunch of separate things that just happen to be willing to talk to each other. To which my answer is, I'm sure glad they planned it that way.

      Besides, DNS is for wussies anyway. Real men don't need user-friendly names for their IP addresses :) But seriously, I can imagine the Web still being useful without DNS if search engines linked to IP addresses instead of hostnames. And now that email is largely a WWW service (Hotmail, Gmail...) a big chunk of it could survive too.

      • Re:and? (Score:5, Interesting)

        by Feyr (449684) on Tuesday February 06, 2007 @06:18PM (#17913430) Journal
        Actually, there was one.

        I don't remember the actual date, but maybe 3 years ago MCI updated a bunch of routers, all at the same time, and screwed it up. A lot of people in North America were without internet for up to a day. I think this qualifies as major :)
      • That would be a Bad Thing. The reason we have DNS is so that server IPs can change. With the coming of IPv6, IP addresses would be tied to geography, so when your server moved, the search engine would lose track of your site.
    • Insightful? (Score:2, Informative)

      by xyphor (151066)
      The root servers are the authoritative DNS servers for the top level domains (TLDs) - i.e. .com, .net, .edu, etc.... This has nothing to do with the "3 business day" thing you're talking about. Even the TLD servers aren't responsible for that delay. You're referring to the time it takes for non-authoritative DNS servers to clear their caches. Big difference....certainly not "insightful". /x
  • by skynare (777361) on Tuesday February 06, 2007 @05:55PM (#17912982) Journal
    I can still visit Slashdot. I think my Dell PC has a backup of the internet.
    • by Cow Jones (615566) on Tuesday February 06, 2007 @08:25PM (#17915126)

      i think my dell pc has a back up of the internet.

      Actually, backing up the internet is a very good idea, and it isn't hard to do at all:

      If you're using Windows, just drag and drop the internet (the blue "e" symbol) from your desktop onto your USB stick. Wait for the copying process to finish (with current Windows installations this will only take a few minutes). Next, confirm that you have successfully stored the internet: double-click the internet on your USB stick, and enter any address. Did it work all right? Congratulations! Now you can carry the whole web in your pocket, or give it to your friends as a gift.
  • Actually... (Score:5, Funny)

    by creimer (824291) on Tuesday February 06, 2007 @06:00PM (#17913082) Homepage
    Some new botnet flexing its muscle perhaps.

    That was a test system [youtube.com] for installing Windows Vista that someone forgot to unplug from the wall.
  • by Geekboy(Wizard) (87906) <.spambox. .at. .theapt.org.> on Tuesday February 06, 2007 @06:16PM (#17913400) Homepage Journal
    The root servers are set up in such a way that *2/3* of them can fail, and no one would notice.

    [RFC2870]
          2.3 At any time, each server MUST be able to handle a load of
                  requests for root data which is three times the measured peak of
                  such requests on the most loaded server in then current normal
                  conditions. This is usually expressed in requests per second.
                  This is intended to ensure continued operation of root services
                  should two thirds of the servers be taken out of operation,
                  whether by intent, accident, or malice.
  • Mr. Bill recently said this:

    "We made it way harder for guys to do exploits," said Mr. Gates. "The number [of exploits] will be way less because we've done some dramatic things [to improve security] in the code base. Apple hasn't done any of those things."

    In another portion of the interview, he added, "Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machi
  • South Korea, eh? (Score:5, Interesting)

    by Quantam (870027) on Tuesday February 06, 2007 @06:28PM (#17913624) Homepage
    Other experts said the hackers appeared to disguise their origin, but vast amounts of rogue data in the attacks were traced to South Korea.

    Somehow that doesn't surprise me. This is the same country that uses insane amounts of ActiveX, which has the effect of conditioning people to click "Yes" whenever any site tries to install something, right? Wouldn't be any surprise if South Korea was one big botnet.
    • by Dunbal (464142)
      Wouldn't be any surprise if South Korea was one big botnet.


      Run by the one internet machine in N Korea?
    • Re: (Score:3, Interesting)

      by element-o.p. (939033)

      Wouldn't be any surprise if South Korea was one big botnet.

      Have you ever looked in the log files of a mail server? S. Korea is one big botnet. Any time I find an IP address that reverses to a Korean ISP, I blacklist the entire class C--especially if it's a kornet.net or hanaro.com IP address.
  • 130+ root servers (Score:3, Interesting)

    by karl.auerbach (157250) on Tuesday February 06, 2007 @06:32PM (#17913678) Homepage
    A few years ago the root server operators (on their own initiative and without asking for, or obtaining, permission from ICANN) took the wise step of deploying replica servers using a routing technique called "anycast". Thus under the name of, for example, f.root-servers.net there are many distinct servers geographically dispersed.

    Consequently today we have more than 130 root servers scattered around the world.

    That's good. It tends to localize the damage caused by attacks.

    What is not good is that these root server operators, although they today operate to the highest of standards and with the highest degree of integrity, are not required to do so in the future.

    For example, several root servers are operated by the US military establishment or by other branches of the US government and are thus subject to being "adjusted" according to military, political, or Attorney General Alberto Gonzales's latest desire to do data mining.

    Nor are the root servers required to play fair and respond to all queries with equal dispatch or equal accuracy no matter the source or the name being queried for.

    Nor are the root servers off limits for sale to companies like Microsoft or Google who could use them for commercial data mining.

    Many people believe that ICANN serves as a kind of fire marshal, overseeing that the root servers are operated responsibly and that the root server operators have access to the resources they might need to recover from a natural or human disaster.

    But that is not the case. ICANN has abdicated that role and has engaged itself as a protector of trademarks and US cultural values.

    Over the last few thousand years we've learned that it's best for long term stability to build institutions and not depend on individual people. Today the root servers are the work of good individuals and organizations that encompass them. We really need to move to a more formalized structure that reinforces the long-term continuation of the good system we have today.
    • Re: (Score:3, Insightful)

      by Thundersnatch (671481)

      Over the last few thousand years we've learned that it's best for long term stability to build institutions and not depend on individual people. Today the root servers are the work of good individuals and organizations that encompass them. We really need to move to a more formalized structure that reinforces the long-term continuation of the good system we have today.

      Wow, you have that entirely backwards. The last few thousand years have taught us that institutions generally suck at fulfilling the needs o

    • by Rufus211 (221883) <rufus-slashdot&hackish,org> on Tuesday February 06, 2007 @09:12PM (#17915482) Homepage
      Sorry to burst your conspiracy theory, but data mining the root name servers would be next to useless. These are the root name servers, and as such all they know about are the TLDs (top-level domains). You ask one of the roots "who is in charge of .com" (or .edu, or .uk), and they respond. The only data you could ever get from them is the distribution among TLDs. Now add caching name servers into the equation (99.999999% of boxes on the internet are behind one) and the statistics become even more useless. The records returned by the roots have a lifetime of 2 days, which means it doesn't matter if there's 1 client or 1 million clients behind a particular caching name server; it's only going to ask about .com every 2 days.
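The once-per-TTL claim above is simple arithmetic (a simplified sketch; real resolvers vary):

```python
# Quick arithmetic behind the claim above: with a 2-day TTL on a TLD
# delegation, a caching resolver asks the roots about that TLD at most
# once per TTL, independent of how many clients sit behind it.

TTL_SECONDS = 2 * 24 * 3600      # the 2-day record lifetime cited above
SECONDS_PER_DAY = 24 * 3600

def root_queries_per_day(num_clients):
    # num_clients deliberately unused: the cache absorbs all client load.
    return SECONDS_PER_DAY / TTL_SECONDS

print(root_queries_per_day(1), root_queries_per_day(1_000_000))  # → 0.5 0.5
```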

      >We really need to move to a more formalized structure that reinforces the long-term continuation of the good system we have today.
      And who's going to run that formalized structure? Hrm, maybe some "good individuals and organizations" would be willing to do it?
      • Re: (Score:3, Interesting)

        by wayne (1579)

        Sorry to burst your conspiracy theory,

        Before "correcting" Karl Auerbach, you might want to see just how many RFCs he has been involved with [google.com], not to mention that he was kicked off the ICANN board for trying to stand up for the individual.

        ... but data mining the root name servers would be next to useless. These are the Root name servers and as such all they know about are TLD (top level domains). You ask one of the roots "who is in charge of .com" or .edu or .uk, and they respond. The only data yo

  • Someone did a query

    53 security.microsoft.com ptr

    The record that cannot be resolved.
  • I just installed a caching-only nameserver on my home machine last night. Nice speed boost. Not that that has anything to do with this other than being DNS. I'm just sayin'. I hope my install didn't mess up the root servers. :)
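A caching-only setup like the one mentioned is mostly a matter of having no authoritative zones; a minimal named.conf sketch for BIND 9 (paths, ACLs, and the hints filename are assumptions and will vary by system):

```
// Minimal caching-only configuration sketch for BIND 9.
options {
    directory "/var/named";
    recursion yes;                 // resolve on behalf of local clients
    allow-query { localhost; };    // avoid being an open resolver
};

// Root hints only - no authoritative zones, so the server purely caches.
zone "." {
    type hint;
    file "named.ca";               // root hints file shipped with BIND
};
```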
  • More root servers? (Score:5, Insightful)

    by TooMuchToDo (882796) on Tuesday February 06, 2007 @06:44PM (#17913886)
    Silly question. Why aren't there more root servers put into operation? (Honest question! I seriously don't know. Is it a technical limitation?)
    • by Yaksha42 (856623) on Tuesday February 06, 2007 @06:55PM (#17914060)
      http://en.wikipedia.org/wiki/DNS_root_zone [wikipedia.org]

      The root DNS servers are essential to the function of the Internet, as so many protocols use DNS, either directly or indirectly. They are potential points of failure for the entire Internet. For this reason, there are 13 named root servers worldwide. There are no more root servers because a single DNS reply can only be 512 bytes long; while it is possible to fit 15 root servers in a datagram of this size, the variable size of DNS packets makes it prudent to only have 13 root servers.
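The 512-byte constraint quoted above can be checked with back-of-the-envelope arithmetic over the RFC 1035 wire format (the byte counts assume standard name compression; treat them as an estimate, not a spec):

```python
# Rough size of a root "priming" response (the answer to ". NS"), showing
# why 13 root server names plus glue fit in a 512-byte UDP DNS message.

HEADER = 12                     # fixed DNS header
QUESTION = 1 + 2 + 2            # root name "." (one null byte) + QTYPE + QCLASS

RR_FIXED = 1 + 2 + 2 + 4 + 2    # owner "." + TYPE + CLASS + TTL + RDLENGTH

# First NS rdata spells out "a.root-servers.net" in full:
# 1+1 ("a") + 1+12 ("root-servers") + 1+3 ("net") + 1 (terminator) = 20 bytes
first_ns = RR_FIXED + 20
# The other 12 NS rdatas compress to one label plus a 2-byte pointer each.
other_ns = 12 * (RR_FIXED + (1 + 1 + 2))

# Each glue A record: 2-byte compressed owner name + fixed fields + 4-byte IPv4.
glue_a = 13 * (2 + 2 + 2 + 4 + 2 + 4)

total = HEADER + QUESTION + first_ns + other_ns + glue_a
print(total)  # → 436, comfortably under the 512-byte UDP limit
```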
      • Thanks for the info. I should've known to go look at Wikipedia first.
      • by Tim the Gecko (745081) on Tuesday February 06, 2007 @07:24PM (#17914438)
        Although there are only 13 IP addresses some of them are used by multiple physical servers. Wikipedia again...

        the C, F, I, J, K and M servers now exist in multiple locations on different continents, using anycast announcements to provide a decentralized service. As a result most of the physical, rather than nominal, root servers are now outside the United States
        Last year the K server alone was present in 17 places. Examples are Delhi, Novosibirsk and Miami. Another poster above says the total for A through M is 130 servers, which is impressive!
  • by kestasjk (933987) * on Tuesday February 06, 2007 @06:53PM (#17914012) Homepage
    ... gets slashdotted, what an irony.
  • As in: I've fallen and ICANN't get up.
  • It looks like the letters F, I, and M were attacked and recovered, whereas G (US Department of Defense) and L (ICANN) did less well.
    Faster than a rolling 'O'
    Stronger than silent 'E'
    Able to leap capital 'T' in a single bound!
    It's a word, it's a plan...it's Letterman! [wikipedia.org] (majestic three-note fanfare)
