
Don't Forget That Worms Happen Everywhere
friday2k writes "SecurityFocus has a nice column on worms and their origin in 1988. It explains what everybody should never forget: we have dealt with *NIX worms (sadmind, li0n, ...) and they will come back again. Maybe then the MS fanatics will laugh and say: didn't we always tell you Open Source is insecure (too)? ..."
Microsoft products seem to be of very low quality. (Score:2)
The major issue is not whether Linux can have worms. The major issue is that Microsoft products seem to be of very low quality. Extremely poor security is only one aspect of that.
No Linux email programs or word-processing programs have the authority to take over the entire operating system. Microsoft products sometimes do.
Many of the security bugs in Microsoft products seem to come from sloppy programming. The open source world would have a difficult time being as sloppy.
The popular Linux programs give a general impression of quality, and of sincerely wanting to do a good job. Microsoft programs give the general impression (to me) that Microsoft wants to give as little as possible to the customer, so that the customer will feel motivated to upgrade.
Re:Microsoft products seem to be of very low quali (Score:2)
So what happens if the BSD TCP stack is found to have such an overflow error? That would automatically put at risk just about every system I can think of; who doesn't use BSD's stack today?
Re:Microsoft products seem to be of very low quali (Score:2, Informative)
Re:Microsoft products seem to be of very low quali (Score:2, Troll)
Ya right, run as root *and* run untrusted code. Sounds like a typical Windows user executing an email attachment to me. Informative my ass... more like typical M$ thinking.
This is why we create user accounts. This is why we run suspicious code in an unprivileged account in the first place. You gonna send the code with that VIM? How are you gonna hide the exploit? Geez... I'll bet you're one of those people accessing Slashdot through IE, right?
Difference (Score:3, Insightful)
An NT/2000 sysadmin is, in the best of all possible worlds, a secretary who reboots when the internet thingy stops hoogjamajigging.
Seriously, in tracking down a couple of thousand hosts on campus that had Code Red, I have never run into such righteous indignation over a simple lecture on basic systems maintenance like patching. Of course, many of these users/sysadmins were dumbasses who installed Win2K Server because they could, not because they had to. Three machines in one room were being used as everyday workstations and weren't offering services for any particular use by the office. Mind you, the services were still offered: hit the average Code Red machine with your web browser and you will see the default webpage.
Re:Difference (Score:2)
apt-get update
apt-get upgrade
Another great difference that should be accounted for is the ease of learning how to run Linux. Oh sure, it looks harder, but the information is available, and it's SO MUCH EASIER to really know what you are doing than it is to trust a particular vendor. Grief, it's hard to keep a single MS box running. The cloud of BS that MS keeps its users under is awful, and we should be nicer to those suffering there.
Re:Difference (Score:2)
And what is the controversy here? In the open source software world, the users and the admins and the developers are one and the same. The basic fact is that the users have a vested interest in the software. A Linux user is not a passive consumer. When something is broken, the users will look around for patches and bug reports. They might even debug the problem, fix it, and submit their own bug fix. Or if they're good at that particular application, they might write their own version and release it to the community.
If something breaks in MicroSoftie land, the user or admin is pretty much screwed. Even if they were qualified to troubleshoot the bug, their only meaningful course of action is to kick their machine, send some prayers or curses to Redmond, and wait for a fix that hopefully does not break everything else. The average MS user knows about the mouse, Excel, and Solitaire, maybe even nethood, shared drives and printers, after friggin' 10 years of using the crap.
A Linux user might know how to set up a POP or IMAP server, can tell you what an MX record is, knows how to fix routing problems, and how to compile a kernel. Most of that stuff can be picked up in a couple of years. It's not because Linux users are complete geeks, but because they are allowed to have a look under the hood, and can learn it inside out instead of outside in.
Re:Difference (Score:2, Insightful)
different cultures... (Score:5, Insightful)
There is no reason why all those home systems and corporate desktops should have IIS running in the first place. There is also no reason (generally) for a home linux system to be running, say, BIND or wu-ftpd.
So why does Microsoft encourage the installation of unnecessary software on its systems, and why doesn't it make it easier not to install those services in the first place?
It comes down to culture. Unix-like operating systems are minimalist and modular, because the development communities appreciate elegant code (not necessarily elegant interfaces).
Microsoft, by contrast, prizes a DWIM (Do What I Mean) approach, which encourages adding functionality 'just in case', as Microsoft seems to think that actually asking a user to install a component is a failure on their part.
In the long run, elegant, minimalistic code is easier to understand, and therefore easier to secure (examples are Sendmail vs. qmail, or BIND vs. djbdns).
Home-based BIND (Score:2)
1) I haven't had outages because my @home DNS servers have gone to lunch, and
2) I've gotten rid of a lot of junk after setting up some bogus entries for doubleclick.{net|com} and x11.com.
I agree that there's no reason for most home users to have a BIND system visible to the net at large, but there are some pretty good reasons for one if it can be located behind your firewall.
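For anyone curious, the bogus-entries trick amounts to declaring your own nameserver authoritative for the ad domains and answering every lookup with localhost. A rough sketch for BIND (the file names and paths here are invented; adapt them to your layout):

```
// named.conf: claim authority for the junk domains
zone "doubleclick.net" { type master; file "db.null"; };
zone "doubleclick.com" { type master; file "db.null"; };

; db.null: a zone file that answers everything with localhost
$TTL 86400
@   IN  SOA localhost. root.localhost. ( 1 3600 900 604800 86400 )
    IN  NS  localhost.
*   IN  A   127.0.0.1
```

The browser then connects to 127.0.0.1 for the ad images and fails instantly, which is where the speedup comes from.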
Re:Home-based BIND (Score:2)
Meanwhile, while you're downloading your cracking tools you might want to reread my comment. I know that it's possible to break past a masquerading firewall, but I doubt I can do much to stop someone with that much technical expertise anyway.
You might also want to look into modern package managers, especially <tt>apt-get update</tt>. It's not that hard to check for security patches once a week, or whenever I learn of a new release.
Re:different cultures... (Score:2)
It would have been interesting to hear you make that same statement back in 1992, when I first started working with Linux and having 16 megs of RAM to run X11 was considered a luxury.
You know Windows 2000 comes with a telnet server? It's installed, but not started by default.
Can you say the same about most Unix distributions? No.
Furthermore, Red Hat for the longest time went off and installed a whole load of services by default. My Solaris install at home has sendmail running by default. Do I need sendmail? No.
I think you'd like to believe what you are saying. But I really don't find a whole lot of evidence to support it as fact.
Sendmail? Elegant? Minimalistic? (Score:5, Funny)
Re:Sendmail? Elegant? Minimalistic? (Score:2)
Re:different cultures... (Score:2)
Re:different cultures... (Score:2)
Long before Microsoft entered the scene with NT, vendors such as Sun were selling UNIX servers and workstations. True, this mostly referred to hardware configurations rather than OS configurations, but that was simply reflective of the fact that they were hardware vendors rather than software vendors.
Re:Ahem... (Score:2)
Check out dnscache [cr.yp.to] which is part of the djbdns package.
Worms first spotted in 1988 (Score:2, Funny)
Who should we send the wormsign spotting bonus to?
Dammit, where are those carryalls??!?!?!
InigoMontoya(tm)
not exactly an MS fanatic, but... (Score:3, Informative)
Notice that the level of representation of MS products is quite low. Consider that the Open Source Community's conventional wisdom is that closed source leads to insecurity. I am risking the almighty flame when I say so, but here it is: Monoclonal OS prevalence is the issue, not open source versus closed source.
What I am saying is that the OS with the greatest market share attracts the hackers the most because they get the most "bang for the buck."
But two conclusions can be drawn about this observation, one good, one bad:
The good: the move towards an "OS ecosystem" of various flavors of OS is the healthiest thing for the Internet, because if something like Code Red were to reappear, only a minority slice of the pie chart of OS prevalence would succumb, as opposed to the majority slice. I use the biological analogies "monoclonal" and "ecosystem" because you can say the same thing about crop resistance to insect/bacterial/fungal/viral pests: the greater the genetic similarity of crops, the greater the risk of one solitary pest taking out all of the Midwest instead of one cornfield.
The bad: Microsoft, having the greatest exposure to exploits now, is getting the most experience with dealing with them, at the business, PR, and technical levels. The more you fight a war, the better you get at it; Microsoft will only get better and better, the general public will only grow more and more confident in their fight, and fewer and fewer exploits will be discovered. Other OSes haven't yet borne the brunt of the kind of hacker attention that fosters this sort of improvement, unfortunately for all of us who live in the ecosystem of the Internet.
Agreed with comments... (Score:2)
The worm that takes everyone offline will exploit multiple holes in multiple operating systems and network services. It may very well operate in a stealth mode, trying to stay under the radar for as long as possible instead of defacing web sites and leaving obvious back doors. It may make a coordinated search of the IP space as described in a recent article.
We are cursed to live in interesting times...
Re:Agreed with comments... (Score:2)
The sad thing is, these were fixed almost immediately in all the respective OSes, but it took quite a while for people to apply the patches.
44% applicable exploits, 25% of servers, not good (Score:2)
New Linux boxes hitting the net aren't arriving with known superuser vulnerabilities (except one in Samba, difficult to exploit, not installed by default, configured unusably by default even if installed, and you'd have to be a bean-head to expose SMB to the Internet anyway; I get SMB probes several times per hour per IP during the quiet periods); new Win2k boxes hitting the net are arriving with known superuser vulnerabilities.
You left off a qualifier: ``by Microsoft.'' Crackers will continue to find exploits, and one day, one of them will release the worm-to-end-all-worms for IIS. I favour one which installs Linux, copies across the existing services, and sets up shop as a P2P server for its children to download from. Wouldn't it be fun to see all of the penguins popping up on the screens in a Windows server farm? (-:
What happens when there isn't a patch ready? (Score:3, Interesting)
I'm waiting for the time when a worm comes out that exploits a vulnerability that hasn't even been 'discovered' yet.
All that has to happen is for a worm writer to be the first person to find a vulnerability. Then (assuming that this person is malicious) their worm would have a tremendous advantage: they would be guaranteed that every single server running that particular OS would be open to attack. If they took the time to write a really nasty worm (say it's set to replicate itself 10 times and then try to erase everything it can reach on the networks it has access to, except itself), this would quite assuredly bring a large proportion of the internet to a grinding halt.
And you know it's got to happen some day...
Re:What happens when there isn't a patch ready? (Score:2)
For instance:
Code Red looked specifically for default.ida, which invoked Index Server. So, shut down Index Server if you don't need it. If you do need it, rename or delete default.ida and hope and watch until a patch comes out.
Re:What happens when there isn't a patch ready? (Score:2)
The problem was with the Index Server ISAPI filter, and you had to either delete it or just remove the mapping.
There are many of us who didn't have problems with Code Red specifically because we had made these changes last year before there was a known problem, patch, exploit, etc.
Microsoft has also learned from that mistake, and supposedly IIS6 in XP doesn't install this crap by default.
My two cents... (Score:3, Interesting)
Do I have any numbers for this? Nope... I'll leave that for somebody else to dig up. I'm a BugTraq reader, and I'm amazed at the sheer number of serious IIS exploits that have recently been coming out. I haven't seen anything new in the past few weeks, which is good, but take a look at the sheer number of buffer overflows alone that have been found in IIS lately. I bet it's more than, or really close to, the total number of buffer overflows found in things like sendmail, bind, apache, and even telnetd in the same time span.
As a programmer I'm appalled here by IIS. Buffer overflows are old, but they keep coming back up. IIS is a new product, most likely written entirely in C++, which should make the string handling much simpler than in the C counterparts. These IIS holes keep coming due to either laziness, incompetence, or indifference on the part of the MS coders. These aren't obscure, either. You request a long URL and you overflow a buffer? C'mon here. The URL is coming from untrusted users -always-. Access point #1 into the system isn't even being looked at for possible holes... over and over.
One would think (read: hope) that MS has got a slew of people poring over all areas of IIS for possible buffer overflows right now. Maybe they'll actually fix some before they're found? Doubtful... given their track record of reactive security.
Justin Buist
Clueless Author (Score:2)
Excepting the Morris worm, before which nobody cared much about Internet security, all of these worms have one thing in common: the exploited holes were discovered months before the worm, and official patches for the affected packages were widely available.
This was true for the Morris worm as well. Both the sendmail and fingerd issues being exploited by the worm were fairly well known at the time of the exploit. If I recall correctly, part of the reason that Morris wrote the worm was his frustration over the continued presence of these security holes, and paradoxically, part of the reason that he released it prematurely was that one of the holes had suddenly gotten extra attention.
at the risk of being redundant (Score:3, Insightful)
...and wrong... (-: (Score:2)
Tony-A's answer was succinct, but I'd like to add that you're ignoring both the frequency and the quality of vulnerabilities on each system. More of the Unix holes are mere DoSes and/or extremely difficult to exploit than is the case for Windows, and when an exploitable hole is more than a DoS, it often either requires local access and/or only gives you the privs of the user running the service (e.g. `apache' or `nobody') rather than open slather.
Those are big differences and largely independent of administration.
NT4 SP6-not-A (Score:2)
Ah. I think I know what you're talking about.
It's worse than that (Score:2)
The big issue with Exchange is that it appears to have evolved, conceptually at least, from Microsoft's ancient single-user-OS mailer programs. As with most Microsoft software, when things go wrong, they go totally wrong (the wings fall off rather than the engines simply stopping).
PostFix (to pick a competing service that I use daily) is the exact opposite: it has been componentised almost to excess, no piece trusts another (to say nothing of the trust not accorded to information from the outside world), no piece runs with more privs than it needs, no piece does anything it doesn't need to, sharing is painfully minimalist, and finally it understands timesharing and user separation from the core outwards. Best of all, you don't need to lose these layers of safety to add something like calendaring to it (just add another delivery method).
When was the last time you heard of an exploitable root vulnerability - or even a read-everyone's-mail vulnerability - in PostFix?
One clear observation kills many a fine theory (Score:2)
If even one of them is professionally administered, your point is made. Inconvenient facts are the terror of grand and popular theories. (-:
Why use the past tense... (Score:2, Insightful)
I don't know about anyone else, but I'm still getting hundreds of CodeRed attacks every week.
I'm not paying to spread viruses (Score:2, Interesting)
The OS argument always seems to be about quality, but I'm also interested in the esoteric aspects of it: if you're gunna get rich off something, then it had better damn well work; if you do it out of the kindness of your heart and/or scientific curiosity and research, well
Get Real (Score:2, Interesting)
Yes of course we remember the *nix worms. Here's another thing to remember. *nix will never be the veritable screen door of security holes that M$ products are. I find "Whistler" to be aptly named.
I wonder what would happen if IT professionals were paid $1 per machine for each security update. Guess TCO with M$ products would go through the roof eh? One particular week this year would have netted me $600.
It is all about the Admins (Score:4, Redundant)
Sorry about the spelling, I really need to get a spell checker plugin for
It is all about not reading documentation (Score:2)
Home users get a PC with the promise of easy-to-use blah blah and a handful of killer apps. It doesn't matter much if it's Red Hat or MS: if you don't understand the security aspects of being on-line, you shouldn't be running a server.
This worm is pretty benign, no deleted system files or content, just a big fat backdoor. It's all over the media, but I'm really curious whether the average @home user got any real message out of this. Maybe they just know to download the patch because it's on CNet, and run IIS with one security patch. Ideally, the message should be to get ALL the patches if you're planning on running IIS, and to subscribe to MS's security list. From what I've read in the media, it's probably the former.
NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO! (Score:4, Informative)
"Sooner or later" is effectively a LIE because whether it's sooner or it's later makes a huge difference in securityville. You're also ignoring the ``quality'' of the intrusion (such as carte blanche versus mere DoS).
Me for later, much later. While I could do even better, I use Mandrake 8.0 for production work. It's a bit bleeding edge in some ways - and I pay for that - but it comes with two massive advantages over many Linux distros: it installs reasonably securely unless you tell it not to (warns you when you install world-visible services and if you choose a "high security" install even disables those), and it can automagically update itself. Debian users in particular have long had these comforts.
All Linuces have at least five huge additional advantages over Windows:
Yes, administration makes a big difference, but all OSes are a loooooong way from interchangeable when it comes to vulnerability.
Don't forget Morris! (Score:5, Funny)
Imagine Code Red in which almost all servers are NT/IIS and there is no web, no central authority, no "experts"...
It caused the Inet as it was to cease to function. People had to pull their boxes off-line to keep from getting repeatedly infected.
The confusion and panic that followed led to the creation of CERT and was the start of most of the big, early Inet security organizations that exist today.
<old codger>
You young whippersnappers don't know from worms. We used to create worms on punch cards and you had to mail them around to get infected! Those were the days!
</old codger>
I suddenly feel old and have to go lie down....
=tkk
Re:Don't forget Morris! (Score:2)
Actually there WAS a game in the mid-70's that reproduced itself on UNIVACs with tapes that were sent around; details here [fourmilab.ch].
Re:Don't forget Morris! (Score:2)
Evolving worms would be neat AI (Score:2, Insightful)
I realize it would take millions of generations before this happened, but once it did, it might become a very robust worm, and one that eats a lot of memory. All it would take is a few dedicated computers and some incredible Darwinian selection methods for it to occur.
If that happens... (Score:5, Funny)
That should make the point of the superiority of Linux worms over Windows worms and end all the FUD.
Don't for get that they are released under GPL (Score:2)
Re:Don't for get that they are released under GPL (Score:3, Funny)
It can happen (Score:5, Insightful)
RHN WAS a solution for that (Score:2)
However, a nightly apt-get against security.debian.org is a VERY good way to patch your system for holes. Debian is really good about releasing quick fixes to their packages.
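For the curious, that nightly run needs nothing fancier than the security archive in sources.list plus one cron entry (the paths and schedule below are just an illustration, using the Debian conventions of the day):

```
# /etc/apt/sources.list -- add the security archive
deb http://security.debian.org/ stable/updates main

# /etc/cron.d/apt-security -- fetch and apply fixes every night
0 4 * * * root apt-get update && apt-get -y upgrade
```

Unattended `-y` upgrades are a judgment call, but Debian's stable security fixes are conservative enough that many admins consider the trade worth it.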
Red Hat Network may or may not be good about keeping your system completely up to date. I don't know, because I am not willing to shell out a monthly fee for keeping my free system up to date.
Really, I don't think MOST people are willing to pay for this sadly necessary exercise in security. By charging for this functionality, Red Hat is reducing the security of a large portion of the installed Linux servers. It is simply going to create a bad rep for the whole Linux community when worms start to work their way around Linux servers using old vulnerabilities. Users with systems that automatically patch themselves will sleep fairly soundly (of course, there is a 24-hour window between patch runs; in the meantime, someone MIGHT have found an exploit and created a worm that utilizes it).
I realize they are in the money-making business. However, they are also representatives for Linux. I think they need to be gently prodded to either make Red Hat Network a one-time fee, or totally free.
Oh, and I DO know that patching alone is not enough. You also need to use secure services, and as few services as possible, with explicit firewall rules controlling who can access those services, plus a good overall security policy (most important of all).
Re:It can happen (Score:2)
Re:It can happen (Score:2)
Yeah, I have used them. Impressive auditing too, I must admit. But we are discussing home users, most of whom are not running Win2K Adv. Server.
Re:It can happen (Score:5, Insightful)
Also, Microsoft is supposed to be open to XP configuration changes by the hardware vendors. Does that extend to default security settings? If so, we can only hope that PC Magazine and the rest will rate new computers on how secure they are out-of-the-box. Are Dell, Compaq, Gateway, and the others listening? Is the computer press listening? If I know Dells come secure but Gateways ship Microsoft-default-wide-open, I'll recommend Dell to my friends and family. If I know Debian comes secure but RedHat installs wide open I'll recommend Debian. But only if I know, and I'll only know if the press does their job and tells me.
This is a social problem, not a technical problem, and it requires a social solution. That means that everyone in the society must play their part -- the companies, the press, and the consumers. If Microsoft won't be a good citizen, bad on them. But why should they be a good citizen if their enemies are not, and especially if their friends are not?
Quit FUDing Red Hat (Score:2)
First, I'll wager there are just as many or more Red Hat boxes with Apache run by someone who does not even know it's there. I know, because I ran one that way. The boogey men did not come and get me for the month or two I had it that way. Why? Because Red Hat 6.2 had far fewer holes, by rational design, than MS trash, which is driven by marketroids.
Second, they have tightened things up. 7.1 comes with a graphically configurable firewall, and bugs you about it on install. That's a big step from the "Everything" install of long ago. It may not be as tight as Debian, and really I must recommend Debian too, but it's not nice to FUD unless you are sure of what you say.
All of the Linux distros are doing good things for teaching their users security. It's in the design and philosophy of free and open software to teach users. If man pages, online help and Slashdot are not enough, you can always fall back to the stone-age dead-tree instructions.
There is a technical solution (Score:2)
While I agree that there is a social element to this problem, I think that there is definitely a technical solution: firewalls.
Personally, I would never attach a computer to the internet unless it was a firewall, or was protected by a firewall. It does not have to be a hardware solution (although that is preferable, and those black-box firewall devices are ideal for home use), PCs can run personal firewall code as well.
Being behind a firewall is no guarantee that you won't get 0wned, and is no substitute for secure-by-default operating systems, but it is an important part of securing your system.
Re:It can happen (Score:2)
About the best the OEMs are willing to do is bundle Norton Antivirus and maybe a software firewall.
Re:It can happen (Score:2)
I was quite happy to see both of these things, by the way; keep up the good work.
7.2 will be even better.
Um... doesn't this contradict your previous sentence? Or will 7.2 start -1 network services, and physically unplug your ethernet cable?
Code Red (Score:5, Funny)
WindowsWorm:Whitehouse.gov::LinuxWorm:?? (Score:2, Funny)
Talk about biting the hand that feeds you!
The real issue... (Score:2)
In the case of the internet mail worm, the function of the worm was based on unanticipated behaviors of both the worm code (the author had intended the worm to limit its speed of propagation) and the internet mail system (the author was exploiting a bug in the mail transfer agent). Clearly, this sort of situation, while a threat to security, is easily remedied once the exploit is known. The remedy can even be implemented with little or no effect on daily operations, since the erroneous behavior of the program will not have been used as part of any applications.
In the case of the various Outlook worms, however, the situation is reversed. The worms rely on explicit features of the Outlook suite for their functioning. These same features have been incorporated into all sorts of applications built upon the Outlook suite, which means that in order to disable the worm, many production applications must be modified or discarded.
This is a design issue, at its heart. There are some cultural effects involved (e.g. the MS assumption of a monoclonal computing environment leads to the expectation, and exploitation, of features that would not be reliably present in a heterogeneous environment), but the central problem is the explicit decision by Outlook program managers to include features that were inherently insecure. (Consider that, while Sun may have a monoclonal outlook similar to Microsoft's, Java was designed for both security and provision of a wide and reliable feature set.)
The question is not "can worms be written for systems other than Microsoft's?" -- to which the answer must always be 'yes', even if only because we can't rule out the possibility entirely -- but, rather, "is it easier or harder to write worms for Microsoft systems than for other systems?" The answer is, pretty clearly, that Microsoft's design decisions make worms far easier to implement on MS platforms than on other platforms.
except (Score:5, Insightful)
This ignores an important issue or two (Score:2)
1) A box should come with only the absolutely necessary web services running. Anything else should require the admin to turn the service on manually. This would prevent about 90% of all worm cracks.
2) The providers of a distro have a responsibility to ensure that security updates get to all people affected, not just those who subscribe to mailing lists. They have a responsibility to ensure that fixes are easy to get and easy to apply. Debian probably has the best security model in this regard, thanks to apt-get.
Microsoft fails on all fronts. They ship NT Server and Windows 2000 Server with IIS enabled by default. They do not push publicity out about worms that impact their systems; they make a low-key effort to acknowledge that they have a problem only when they have a fix.
Redhat has also been particularly poor in this regard in the past - more recent installs seem not to enable internet server software by default, and to include warnings when you enable things.
Not only is Microsoft software buggier and less secure than any other software, they also fail to enable their users when security fails. For this, the blame goes squarely on the shoulders of a giant that banks $1 billion per month while avoiding bad publicity instead of helping their users.
Re: (Score:2)
Not quite (Score:4, Insightful)
Your basic premise is correct that there are more people trying to break MS systems than Unix/Linux systems, but U/L will never be as vulnerable for a number of reasons:
1.) There are several flavors of Unix and dozens/hundreds of distributions of Linux, not to mention all the different version numbers of each of those. This would dramatically impede the spread of any worm. Almost every MS-based site has IIS 5.0, and it is this homogeneity that allows things like Code Red to spread so quickly and effectively.
2.) Unix/Linux systems in general are easier and safer to patch. Almost every MS patch requires a system restart and it is not at all unusual for the patch to break something else. I have never had a security update break anything on my Debian systems, nor have I ever had to restart the whole system. The service updated (such as the recent Horde/IMP updates) is restarted and the user doesn't even know, even if he/she is using the system at that moment (I know this because I did it as a test case here at work. Someone was reading their email on our IMP system while I upgraded the system. Yeah, a bit dangerous, but we're a small company and no one would have gotten in trouble. Regardless, she didn't even know anything had happened).
3.) Security holes are much more frequent on MS systems. We all have heard about the fact that the last known remote root exploit for Apache was over 3 1/2 years ago. There have been a few security patches since then, but nothing nearly so troublesome as Code Red. I read somewhere that there have been over 40 serious holes in IIS this year alone, although I don't remember where I read it and it may be apocryphal.
Bottom line is that while it may be true that if as many people who are attacking MS systems started attacking Unix/Linux systems we might see more issues on U/L, it is also true that Unix & Linux are better engineered from the start, easier to upgrade and more varied, all of which make them inherently much more secure than MS solutions.
Cheers...........
Re:except (Score:2)
HEY! (Score:3, Funny)
err...
The Point Is (Score:3, Informative)
I've got a question... (Score:2, Funny)
Okay, if worms appeared in 1988, then what the hell ate all the dead bodies for the thousands of years before that?
Blame the language (Score:3, Interesting)
If we were coding our network software in a secure ("safe") language (one without buffer-overflow "capabilities") such as Java or O'Caml (or even scripting languages like Python, to an extent), we would greatly reduce our security risk. Given that these languages also typically increase productivity, it seems like a clear win to me...
Microsoft realizes the damage C and C++ do to stability and security; they've recently hired a lot of famous programming-language folks to work on new language technologies. Microsoft knows that large projects written in languages without sophisticated modularity constructs (i.e. C, C++) tend to get out of hand quickly. They're working to fix this! They're even working on technologies to improve the stability of device drivers through language technologies (see the Vault project, for instance).
However, C has always been the UNIX platform's language. Will UNIX stay in the 60s as even Microsoft moves on? If so, I say it will be the "wormy" operating system family of the 21st century...
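To make the parent's buffer-overflow point concrete, here's a tiny sketch in Python (one of the "safe" languages mentioned); the buffer size and values are made up purely for illustration. In a memory-safe language, an out-of-bounds write is caught at runtime instead of silently clobbering adjacent memory, which is the raw material of classic stack-smashing exploits like the Morris worm's fingerd attack:

```python
# A fixed-size "buffer" of 8 slots.
buf = [0] * 8

try:
    # Write well past the end, the way an overflowing input would.
    buf[32] = 0x41
except IndexError as e:
    # The runtime rejects the write instead of corrupting memory.
    print("overflow rejected:", e)
```

In C the same write would simply scribble past the end of the array, and with attacker-controlled input that scribbling can overwrite a return address.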
*nix admins better than NT admins? (Score:3, Informative)
I most heartily disagree. Sure, there are *some* *nix admins that mop the floor with NT admins... but the opposite is also true.
I think we are all forgetting exactly what an "admin" is. An admin is *not* any JoeBlow@aol.com that stands up a web server! A system administrator is an IT professional who researches his work and prides himself on keeping his machines running smoothly.
If you think about it a little, I believe that you'll agree that the major cause of the whole Code Red problem is not the NT admins out there, but rather the JoeBlow@aol.com's who really don't know what they're doing. Ignorance, people... ignorance is our enemy! Not Bill Gates, not MS, not closed source! It's ignorance and apathy.
I'm a heretic, baby (Score:5, Insightful)
Nah, you are a spreader of faith (Score:2)
Re:I'm a heretic, baby (Score:2)
Re:I'm a heretic, baby (Score:2)
Re:I'm a heretic, baby (Score:4, Insightful)
Re:I'm a heretic, baby (Score:3, Insightful)
Change your password after, of course. Now if only there were an equivalent way to get people to use PGP...
Re:I'm a heretic, baby (Score:2)
And I think that redhat update lets me be a lot more lazy than any NT admin. 2 clicks, downloads and installs all the patches. Doesn't get much easier than that.
Re:I'm a heretic, baby (Score:2)
Indeed. NT Server asks to install IIS during its installation, and it's "yes" by default. Then, Index Server is a component of IIS, also installed by default (default choice: yes).
It was Index Server, not IIS, that was attacked by Code Red.
Re:I'm a heretic, baby (Score:2)
This is a futile argument. Linux is not inherently more secure than NT and NT is not inherently more secure than Linux. OOTB they both have to be considered insecure, maybe not today, but there's going to be a wu-ftpd, iis, bind, or heaven forbid, sshd exploit after release.
Listen up people, this is important and you will be tested on it at some point: A MACHINE IS ONLY AS SECURE AS ITS ADMIN IS VIGILANT! Your machines are not secure today. They can be compromised. Someone may not have discovered the vulnerability yet, but they will.
They ALL Suck (Score:3, Informative)
Windows (NT/2000) has some good security features in the kernel, the problem is that they are not properly used by the operating system as distributed by Microsoft. Locking things down would break too much stuff.
UNIX/Linux has an archaic security model that hasn't changed in decades.
Both operating systems suffer from being implemented in C, an unsafe language. It is possible to write secure code in C, but most people have neither the expertise nor time to do it correctly.
Re:Microsoft + Worm = MCSE ? (Score:2)
The use of ACLs on NT makes the security much more configurable than the simple user/group permissions on most variants of UNIX. Some Unices have ACLs, but that's hardly designed from the ground up is it?
Re:Microsoft + Worm = MCSE ? (Score:2)
The lead architect for Windows NT was Dave Cutler who was the lead architect on VMS, which had all the features you list for UNIX long before UNIX did.
Virtual memory, shared object libraries, system level ACLs all appeared on VMS many years before UNIX.
Also part of the Microsoft team was Butler Lampson who invented the security monitor, ACLs and much of the rest of the security infrastructure we take for granted.
Windows NT does not and never has shared code with DOS. The Windows GUI code and some of the libraries are shared from 95 on, but the code was developed from scratch for the purpose.
Networking and security are both relatively recent additions to UNIX. Until Sun wrote NFS UNIX did not have anything like the VMS cluster concept. And NFS sucked real bad until about five years ago. Until five years ago at least one major UNIX vendor was shipping a version of Sendmail that had major security holes in it that had been known for three years.
In short, until Windows NT and Linux showed up to give the complacent UNIX vendors some competition UNIX was a real sucky operating system, and an expensive one at that.
Re:Microsoft + Worm = MCSE ? (Score:2, Informative)
I once had an MCSE ask me, in all seriousness, why he couldn't type a fully-qualified hostname to choose a DNS server. It's a paper qualification; it implies no real skill or insight into the system's operation, or any sort of reasoning into consequences of limited design.
The Microsoft Certified Systems Engineer certification does not claim to certify any knowledge of planning, implementing, configuring, or supporting DNS.
It tests a limited and well defined check list of skills, most having to do with installation and configuration. Only with the Windows 2000 series did the tests begin to measure planning and design skills.
The Windows 2000 and XP/.NET required tests - and the skills measured by each - are listed here: http://www.microsoft.com/trainingandservices/default.asp?PageID=mcp&PageCall=requirements&SubSite=cert/mcse&AnnMenu=mcse
Re:Microsoft + Worm = MCSE ? (Score:2, Insightful)
This is limited to MCSE's only? No other subset of users can make this kind of error?
Therefore, I consider MS fanatics to be, for the most part, a self-limiting reaction
What is a MS Fanatic? Is that anything like a Linux fanatic? I don't see many people saying "Screw RedHat, screw FreeBSD, MICROSOFT RULES!". On the contrary, I see a LOT of OS bigotry from self-proclaimed *nix professionals, who naysay and poo-poo an operating system just because it comes from a particular vendor. A true professional evaluates the problem, and figures out what OS/software best fits the situation. There have been plenty of times that we've thrown out Solaris/SCO/Linux in favor of Windows, because Windows offered the best solution for what we were doing.
I think the more relevant question is with regard to the operating system's track record. With the exception of the recent blight of Red Hat 7.0, Linux has probably had far fewer documented bugs, and because of the UNIX user permissions model, the damage is minimal.
Your analysis is flawed. Willie Sutton robbed banks because that's where the money is. Microsoft OS's get so much focus because they're so widely used. The recent slew of RedHat hacks is due to the RedHat distro being the most popular. It follows that a popular OS is going to get attention. NT/2k also has a user permission system. I'm sure any professional who has worked with NT before would be aware of this. When the permissions are applied as documented and recommended by Microsoft, the damage is as minimal as on a Unix system.
Compare this to Windows. Bugs all over the place, some more serious than those in Linux, some less serious.
That's a highly astute observation there. Tell me, can a bug in Windows be as serious as a bug in Linux? I see an awful lot of exploits for Linux. Can you back up your claim of "bugs all over the place" for Windows with any kind of numbers, or are you just speaking from the heart? Linux certainly has a pretty good library of bugs and exploits.
Where most machines are running 9x/Me with *no* user/process security whatsoever, malicious code can run rampant
Actually, ALL Win9x/ME machines have no user process security. But those OS's weren't designed to have it. If you want user process security, use NT/Win2k. 9x/ME were designed as a consumer platform, not for business. Microsoft doesn't recommend using Win9x in the corporate environment.
NT/2000 is an improvement, but it's not designed into every aspect of the operating system's historical architecture.
Actually, it is. You're arguing from a point of ignorance. Try actually USING the operating system for a while, for something other than launching telnet. All processes in NT/Win2k run under the context of the user that spawned them.
Windows has been one patch to DOS 1.0 after another, and the final result is such a kludge and so many processes are running with full administrative privileges that the task of exploiting a bug remains trivial.
This is bullshit again. If you have so many processes launching under Administrator, I would suggest not having your services run under that account, and stop logging in as Administrator on your system. Do you log in as root on your Unix systems regularly? Best practices for both OS's say not to use root/Administrator unless something calls for special permission that superuser account has.
Running Windows 2000 on my desktop is farcical - half my software won't work properly if I don't give my user account admin privileges.
Bullshit again. Normal client software doesn't require Administrator access to run. Installing software on a Win2k/NT box requires superuser permissions, but HEY! That's a security feature, and Windows doesn't have that, right? Lazy people who don't want to configure their systems properly run their services under a superuser account, and we all know what THAT means. Even in a Linux world. I certainly don't need Administrator permission to launch Office, Explorer, or any other normal user process. Unless your system is SO badly configured, a user-started process CANNOT just run as Administrator simply because it wants to, unless it's a service which has been configured to run as Administrator (in which case it's your fault for doing so), or you're logged in as Administrator.
It amazes me how many allegedly Windows 2000 compatible programs decide that they're going to attempt to store temporary information in the system registry instead of the roving user registries.
Because software installed on a Windows system is system-wide. If you want to prevent someone from launching a particular application, use POLEDIT and edit their profile to stop them, or *GOSH* maybe change the NTFS permissions to prevent someone from accessing the executable? Don't tell me that you don't use chmod in the Unix world?!
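The chmod-style lockdown alluded to here can be sketched like this (a hypothetical illustration with a throwaway file, not anything from the thread; Python is used just to keep the demo self-contained):

```python
import os
import stat
import tempfile

# Create a stand-in for some executable we want to lock down.
fd, path = tempfile.mkstemp(suffix=".sh")
os.close(fd)

# Owner gets read/write/execute; group and others get nothing --
# the Unix-side equivalent of tightening NTFS permissions on an .exe.
os.chmod(path, stat.S_IRWXU)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o700

os.remove(path)
```

Any user outside the owning account now simply can't execute (or even read) the file, which is the same "stop them from launching it" effect as the POLEDIT/NTFS approach.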
The single system registry is dangerous, too. Imagine, in your *NIX
Of course, nobody would expect you to know that you could set permissions on individual Registry keys, and restrict
Contrast this to Linux or any other UNIX variant, the whole model and concept of which was designed with user and process security and isolation from the ground up.
Yeah, fancy that Microsoft wouldn't consider that. I guess the Internet Guest account can launch any damn process it wants, or any user on a Terminal Server can stop any other process, even if it doesn't belong to him. Not. IUSR_ cannot simply just add itself to the Domain Admins group, just because someone is using a directory traversal exploit(which wouldn't be a problem in itself if the admins simply INSTALLED THE DAMN PATCHES) because OH MY GOD! That process cannot be spawned by a non-Administrator account!
As a bonus, the added complexity of administering multiple accounts to the average user is a pain in the butt. They want point-and-drool, everything clean and simple and familiar.
Point-and-drool? Do you really hold your users in such low regard?
Actually, administering a NT/Win2k mixed domain is quite easy, and I use the command line a lot. But you're expecting regular everyday users (who probably just use a PC at home for email and pr0n surfing) to suddenly have the knowledge of a 20-year Unix engineer simply because you're in the building. There's no need for GUIs in Linux, no siree. Things like KDE and Gnome are simply figments of my imagination. Windows domains don't require a person to have multiple accounts. Microsoft has stressed from the beginning the "unified login", where one account is sufficient. Sounds like you really need the services of an MCSE.
The beauty of the complexity of Linux/UNIX versus Windows is that it weeds out the chaff who aren't capable of managing a box.
Complexity can come and bite itself in the ass. Is complexity always a good thing? We've chucked out Linux and Unix solutions in favor of Windows simply because it Didn't Work. Linux isn't the Wonder Platform that a lot of people try to make it out to be.
I'm sure the programmers and architects at M$ see the problems and comparisons I'm drawing. To design an operating system, you must love computers and take pride in a job well done, so I'm sure it pains them that they have to deal with such kludges day in and day out. I'm sure they'd dump the whole thing and fix it if they could, but the marketing guys won't let them.
I hope you're sending your resume to Microsoft right after reading this. Actually, I don't, since you haven't the first clue about Windows or its security model. Instead of the usual Windows-bashing, why not take a few minutes out of the day and actually LEARN the OS? It sounds like your workstation needs to be reconfigured anyway.
I've administered many Windows domains, both NT and Win2k, that are directly connected to the Internet, and have a large internal userbase. And I've never ONCE had any major security problems. Maybe I'm a "gifted" MCSE, or The One who will bring balance to the Force, but to me, none of your arguments hold water.
Re:Microsoft + Worm = MCSE ? (Score:2)
The interesting thing about the article is that it implies that unix worms are written by very smart people, unlike the script kiddies who target windows. Maybe this means it's a bit harder to write a unix worm?
I would think that writing one which would propagate despite the myriad configuration options in UNIX (which simply aren't available in Windows), while also finding a way for the malicious code to break out of the process's user rights and get root access, would substantially raise the bar for making one that is substantially destructive.
Re:Regardless (Score:3, Interesting)
From the article: "A worst case Warhol Worm is truly frightening, capable of doing many billions of dollars in real damage and disruption. Since it can achieve complete spread in well under an hour, and could begin doing damage immediately on infecting a machine, human mediated responses offer almost no hope of stopping it. "
Complete spread in under an hour! Total destruction of infected servers!
Whee!
Watch for one of these coming out with the next major IIS exploit.
Re:Regardless (Score:3, Interesting)
When IBM sprayed SF sidewalks with Linux graffiti (some is still there)
Re:Linux antivirus software (Score:2)
Generally, the UNIX biodiversity has helped prevent viruses from spreading, until "here! run this perl script!" catches on. Right now there aren't any non-proof-of-concept Linux viruses.
Any day now... (Score:3, Funny)
I can just see it:
Hi! How are you?
I send you this perl script that must be run as root in order to have your advice
See you later. Thanks
Re:Linux antivirus software (Score:2, Funny)
# cat wrightAntiVirus
#!/bin/sh
find "$1" "$2" "$3" -iname '*.exe' -o -iname '*.doc' -o -iname '*.xls' -ok rm {} \;
Pre-emptive move: Cascaded DDoS prevention (Score:2)
Default services (Score:2)
For example, Apple's Mac OS-X disables ALL remote services (apache, ftp, ssh daemon, AppleTalk sharing, etc.) by default during install. And it's not possible to turn them on during install either; you have to go into System Preferences (under an admin-enabled user after install is complete) to switch them on.
Mandrake Linux takes a similar approach (I'm sure other distros do this too, but Mandrake is the only one I've ever installed). While it is possible to enable certain remote services at install time, there is a screen during install which advises you which daemons you're allowing to be enabled and gives you an opportunity to turn them off. Not the best way, but it's an improvement over MS.
The idea with both of these is that you are explicitly telling the OS to open services, as opposed to IIS, which you implicitly tell Windows to run by accepting a default install. This lets an admin know exactly what services are running on a machine, rather than not even knowing IIS exists on it.
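The "know exactly what services are running" idea can be checked programmatically. Here's a rough Linux-only sketch (my own illustration, not from the article; it assumes /proc/net/tcp is available) that lists which local TCP ports are in the LISTEN state:

```python
def listening_ports(proc_file="/proc/net/tcp"):
    """Return the sorted local TCP ports currently in the LISTEN state."""
    ports = set()
    with open(proc_file) as f:
        next(f)  # skip the header row
        for line in f:
            fields = line.split()
            local_addr, state = fields[1], fields[3]
            if state == "0A":  # 0A is TCP_LISTEN in /proc/net/tcp
                # local_addr looks like "0100007F:0016" (hex IP:port)
                ports.add(int(local_addr.split(":")[1], 16))
    return sorted(ports)

if __name__ == "__main__":
    # An admin can eyeball this list against what they *meant* to enable.
    print(listening_ports())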
Re:Secure by Default (Score:2, Informative)
I use OS-X at work for networks research. I have a PowerBook G4 laptop w/ dual monitors (a regular monitor + the laptop screen), 500 MHz, 256 MB ram, 20 GB HD, 10/100 ethernet, 2 USB ports, 1 firewire port, 56K modem (which is thus far unused).
if you want to get a PowerBook, wait about a month. OS-X.1 is in beta and is expected in September. I work at a company Apple considers a "Premier Developer," hence we get pre-releases and betas and all the other good stuff, and X.1 delivers on what it promises. X.1 makes a ton of serious improvements over X.0.4, the current patch. They made a lot of improvements to the GUI, allowing the OS and programs running on it to be more responsive to user interaction, plus several other enhancements like DVD support (which I have not yet tried).
Re:Cmdr Taco? (Score:2, Informative)
Re:Cmdr Taco? (Score:2)
Re:At least with unix... (Score:2, Insightful)
Re:As long as nobody builds the perfect worm... (Score:2)
Re:Let's also not forget (Score:2, Informative)
I forget the user name, but it's equivalent to nobody on *nix. You have to go screw it up yourself before it runs as root.
If you're gonna spread FUD, at least get it right!
sadmind is Solaris (Score:2)
Re:Worms known before 1988 (Score:2)
Yeah, I've had multiple e-mails on the subject of "there were worms before the Morris worm" but what I'd intended to say (unfortunately not what I wrote) is that the Morris worm was the first Internet worm.
Mea culpa
Re:But But... -- But No Cigar (Score:2, Insightful)
This worm comes down to laziness, no more no less. I'm betting that, at the absolute most, between 5% and 10% of sites need things like
Re:But But... (Score:2)
Of course I use tarballs too.... But RPMs make the package management a little easier and avoid the --force command later....
Re:Duh. (Score:3, Insightful)