DNS Root Servers Attacked
liquidat and others wrote in with the news that the DNS root servers were attacked overnight. It looks like the F, I, and M servers felt the attack and recovered, whereas G (US Department of Defense) and L (ICANN) did less well. Some new botnet flexing its muscle, perhaps? AP coverage is here.
move along, nothing to care about (Score:5, Informative)
[RFC2870]
2.3 At any time, each server MUST be able to handle a load of
requests for root data which is three times the measured peak of
such requests on the most loaded server in then current normal
conditions. This is usually expressed in requests per second.
This is intended to ensure continued operation of root services
should two thirds of the servers be taken out of operation,
whether by intent, accident, or malice.
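The arithmetic behind that requirement is easy to check. The sketch below is just a sanity check of the RFC's stated intent, using the peak load P of the busiest server as an abstract unit; the 13-server count and the load model are assumptions drawn from context, not measured traffic:

```python
# Sanity check of RFC 2870 2.3: if each of 13 root servers can absorb
# 3x the peak load P of the busiest server, the survivors can carry the
# whole system's load even with two thirds of the servers knocked out.
# P is an abstract unit here, not a real requests-per-second figure.

SERVERS = 13
OVERPROVISION = 3                          # each server handles 3 * P

total_peak_load = SERVERS * 1              # upper bound: every server at P
survivors = SERVERS - (2 * SERVERS) // 3   # 13 - 8 = 5 servers left standing
surviving_capacity = survivors * OVERPROVISION

print(survivors, surviving_capacity, total_peak_load)  # 5 15 13
assert surviving_capacity >= total_peak_load           # 15P >= 13P: still up
```

So even under the worst case the RFC contemplates, the remaining five servers have capacity to spare.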
Media: tie attack to likely Windows botnets (Score:2, Informative)
"We made it way harder for guys to do exploits," said Mr. Gates. "The number [of exploits] will be way less because we've done some dramatic things [to improve security] in the code base. Apple hasn't done any of those things."
In another portion of the interview, he added, "Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machine."
See article: http://www.toptechnews.com/story.xhtml?story_id=4
Microsoft needs a public shaming for the sorry state of Windows security that allows millions of these zombie machines to exist. I don't blame Joe User, sorry. And spare me the holy wars about security, the claims that the user should just do x, y, and z and be as smart as you, etc.
Windows: Defective By Design
Re:More root servers? (Score:5, Informative)
The root DNS servers are essential to the functioning of the Internet, since so many protocols use DNS either directly or indirectly. They are potential points of failure for the entire Internet. For this reason, there are 13 named root servers worldwide. There are no more than 13 because a traditional DNS reply over UDP can be at most 512 bytes long; while it is possible to squeeze up to 15 root server records into a datagram of that size, the variable size of DNS packets makes it prudent to stop at 13.
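The 512-byte budget can be sketched in a few lines. The byte counts below are approximations of a name-compressed RFC 1035 priming response (header, one question for ".", n NS records, and n IPv4 glue records); treat them as a back-of-the-envelope illustration, not exact wire sizes:

```python
# Rough wire-format sizes (RFC 1035) for a root "priming" response,
# assuming classic name compression. Illustrative, not byte-exact.

HEADER = 12                       # fixed DNS header
QUESTION = 1 + 2 + 2              # root name (one zero byte) + QTYPE + QCLASS

# Per NS record: owner "." (1) + TYPE (2) + CLASS (2) + TTL (4) + RDLENGTH (2)
NS_FIXED = 1 + 2 + 2 + 4 + 2
FIRST_NS_RDATA = 1 + 1 + 1 + 12 + 1 + 3 + 1   # "a.root-servers.net" spelled out: 20 bytes
LATER_NS_RDATA = 1 + 1 + 2                    # one-letter label + compression pointer

# Per A (glue) record: compressed name pointer (2) + TYPE/CLASS/TTL/RDLENGTH (10) + IPv4 (4)
A_RECORD = 2 + 2 + 2 + 4 + 2 + 4

def priming_response_size(n: int) -> int:
    ns = (NS_FIXED + FIRST_NS_RDATA) + (n - 1) * (NS_FIXED + LATER_NS_RDATA)
    return HEADER + QUESTION + ns + n * A_RECORD

print(priming_response_size(13))   # 436 -- comfortably under 512
print(priming_response_size(15))   # 498 -- still fits, but with almost no slack
```

Which is roughly why 15 is the theoretical ceiling and 13 is the comfortable choice.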
Insightful? (Score:2, Informative)
Re:More root servers? (Score:4, Informative)
Re:so a lot of it was from South Korea.... (Score:2, Informative)
Re:Does Anybody Still Distrubute Hosts Files? (Score:3, Informative)
Many of them aren't redundant. (Score:5, Informative)
That's kind of the point here, actually. Several of the root servers do not have any redundancy. You can see the list at http://www.root-servers.org/ [root-servers.org]. In particular, the A, B, D, E, G, H, and L servers have only a single location apiece.
F, I, J, K, and M, on the other hand, are heavily redundant, with multiple geographic locations routed via Anycast, so a single client only "sees" the instance nearest to it. This makes them difficult to DDoS: a zombie in S. Korea querying the J server would be sending packets to an instance in Seoul, while one in California would reach the instance in Mountain View.
What's odd, looking at the list, is that anyone operating something so critical to the Internet's infrastructure wouldn't develop some geographic and systems redundancy. Unfortunately, I suspect the government agencies tasked with these responsibilities don't keep it at the top of their priority lists when allocating resources and funding.
F machines (Score:5, Informative)
http://www.isc.org/index.pl?/ops/f-root/sites.php [isc.org]
That's about 40 locations. Each of those has a couple of servers, a management box, and a couple of routers, so yeah, something like 200 machines total.
Re:Ban all Microsoft Users from the Internet... (Score:3, Informative)
Yes, it can be disabled by the user. The user must have Administrative access to disable it, so that might help limit it.
(Control Panel-->User Accounts-->Turn user account control on or off)
Visual Studio requires admin rights to run (OT) (Score:2, Informative)
It's more than just an IDE. I'd hazard a guess that it's for the debugger, so you can do things like trace calls into kernel functions, access another application's memory area, and use hardware watchpoints. Come to think of it, I wouldn't even know how you'd write a program to access the registers or memory of another process, even a child process. I did read an article on how DEBUG.COM worked, but that was a long time ago...
Re:uh oh! (Score:3, Informative)
From RFC 2606:
(Next time, try the webserver -- that's how I learned this.)
Not anymore (Score:5, Informative)
And the primary design feature that enabled that was removed during the rise of the ISPs.
The early internet was a NET. Redundant links everywhere. Routers all potentially knew the whole topology and could find a connection if it existed.
As the net went commercial, that growth caused a table explosion in the routers. So BGP replaced RIP, and things became less robust. Usable routes became a subset of all possible routes. Within the backbone there was still a lot of redundancy - but it wasn't quite up to the former "find a path if it exists" level.
Meanwhile, the typical host went from being something ad hoc connected to several neighbors to being something connected solely to a single ISP - typically by a single link. The big guys might have redundant paths into their ISP's Network Operations Center. But if something took out the NOC (and often there was only one - or only one of some critical component) you were hosed. Ditto if something corrupted their databases. Even with redundant links there would only be a few, perhaps going through several single points of failure - and even a fully redundant pair still allows a double failure to take you down. The little guys would typically have one line (say DSL) to one box. Cut the line or crash the box - or the typically two links from it to the NOC - and you're hosed.
(Perhaps you have a dialup-backup for your DSL. Did YOU configure it to come up automagically if your main link goes down? Is it on the same phone line with the DSL? If not, does it take a different path to the central office? Or is it right up the same cable bundle on the same poles next to the same road full of the same drunk drivers or in the same underground cable running past the same backhoe...)
So the internet evolved from a nuclear-strike-survivable net to a less-robust net rooting a bunch of trees. Oops!
(And that's just for routing the packets once you've GOT the IP number. Translating names to IP numbers is a whole separate can of worms: It's what the root servers are about - which is why there are so many of them, most of them are clusters, and some are clusters that are geographically diverse. You only need to hit ONE operational root server to get started on your translation - if your answer isn't cached somewhere between you and the root, and the list is small enough to keep handy on every machine that wants to do its own nameservice.)
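The bootstrap step that parenthetical describes (asking a root server where everything else is) starts with a tiny fixed-format "priming" query. Here's a minimal sketch of one in RFC 1035 wire format; the transaction ID is an arbitrary example value, and actually sending the packet to a root server address from your hints file is left out:

```python
import struct

def priming_query(txid: int = 0x1234) -> bytes:
    """Build the wire form of a priming query: QNAME ".", QTYPE NS, QCLASS IN."""
    # Header: id, flags (all zero: a plain non-recursive query),
    # QDCOUNT=1, ANCOUNT/NSCOUNT/ARCOUNT=0
    header = struct.pack("!HHHHHH", txid, 0x0000, 1, 0, 0, 0)
    # Question: the root name is a single zero byte; NS=2, IN=1
    question = b"\x00" + struct.pack("!HH", 2, 1)
    return header + question

pkt = priming_query()
print(len(pkt))   # 17 bytes: 12-byte header + 5-byte question
```

Sent over UDP port 53 to any one operational root server, the answer section of the reply carries the NS set for ".", which is exactly the small list the comment says every machine can keep handy.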
Re:Visual Studio requires admin rights to run (OT) (Score:2, Informative)
Re:Ban all Microsoft Users from the Internet... (Score:4, Informative)
Re:Ban all Microsoft Users from the Internet... (Score:4, Informative)
When Microsoft knew they were going to release XP Pro, they should have started pushing multi-user features in their developer kits. All authoring systems should have had an option to build for multi-user, and all installation kits should have been set up to do the same with a radio button. I suspect that Microsoft did not bother to do this, or they charged extra for it. As it stands, out of the maybe twenty large and small apps on my system that I paid for recently, only the big-ticket items like Mathcad and Photoshop installed and ran properly. Some open-source stuff ran pretty well too, but open-source programs tend to avoid the registry.
In the end I gave up trying to get everything to work. I tried running a few misbehaving apps with "Run as..." but you cannot drag and drop between different user areas in Windows due to their separate memory areas (the pointer is inaccessible). So Windows XP Pro turned out to be a waste of money. I feel like I paid extra to beta test Microsoft's software.