Security Through Obsolescence 322
dlur writes "This article and this article (both variations of the same article written by roblimo) delve past security through obscurity, into using old, out-of-date software to secure a site. Maybe it's not always in your best interest to snag the latest kernel? Perhaps think twice before jumping at the chance to buy MS's latest OS."
This is great! (Score:5, Funny)
Re:This is great! (Score:3, Funny)
Re:This is great! (Score:3, Funny)
As you glide in, guns blazing, your last thought, as your body is charred by dragon flame, is that you should have remembered that dragons (at least good ones) have thick armoured scales.
I wonder if there is something useful I can do with the metal lump that is left from the frame of your melted hang-glider. Perhaps I can set up an old AIX server to hand out some simple web pages...
Re:This is great! (Score:2)
- .50BMG
'Nuff said.
Re:This is great! (Score:3, Funny)
Silver
Just Obscurity, not Security (Score:5, Insightful)
Without the script kiddies, you still have to worry about serious crack attempts. If you use antique software, it is probably relatively easy for an attacker to do some research and find security vulnerabilities.
No it's not (Score:3, Insightful)
They've got the argument all wrong - it's not more secure because it's obscure - it's more secure because older software has been around longer, and the kiddies have already found the obvious bugs and they've been patched.
Would you run a 2.5 kernel on a computer where you worried about security? I'd hope not.
Re:No it's not (Score:5, Insightful)
You may want to rephrase that statement and maybe say "because older linux kernels have been around longer"
your man is an idiot, but this is what they mean: (Score:4, Interesting)
Mac OS 8.6 was *THE* standard before 9 and X. More stable, better for the environment, better for the economy, etc. etc.
There was a free upgrade available everywhere to get you from 8.5 to 8.6. Yet two years ago I ran 8.5 for a year and a half.
Why? DIDN'T need to upgrade. It gave me everything I needed, didn't crash out* (I had 1 or 2 problems with ProTools, but that was an anomaly), and I didn't need USB support.
My system was set up in such a way that everything, CDEV's, INIT's, and all extensions got along with each other and the only time I had to reboot was when I wanted to turn my computer off.
To extend this, if you have a setup that has had the HECK tested out of it, stands up to "attack" (whether that means a "hack" for a network box, or a heavy load for a server) and doesn't give you problems, why re-invent the wheel?
Re:Just Obscurity, not Security (Score:5, Interesting)
You lightly touched on one of the biggest vulnerabilities of any system: Consistency. If you can research an OS, you can find out how to break in.
What about a case where somebody builds their own OS and runs their apps on it? (I realize that is extremely unlikely, so use your imagination a bit...) How would a would-be hacker get into that? I'm sure it's possible, but without a model to work from, how would they know what to do?
My company used to run IIS. When we got hit with Nimda, I noticed that 'CMD.exe' was getting called a lot. What'd I do? I renamed CMD.exe and replaced it with Calc.exe. I had originally intended to write my own VB App that'd notify me if it was ever run. Never got around to it, though. Essentially, I hid a commonly known function of WinNT. Anybody breaking into the system would have to figure out what I did, since it's no longer the same type of server other people run.
It is for this reason I'm really interested in Linux as a server. If I were to get really deep down into the nitty gritty, I could make the OS so unfamiliar that only the most determined hacker would get in.
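The notifier the poster never got around to writing is only a few lines in any language. A sketch of the same trick in Python (the original plan was a VB app, and the log path here is a throwaway stand-in):

```python
import datetime
import os
import sys
import tempfile

def decoy_main(logfile):
    """Record an invocation of the decoy binary, then exit quietly.

    Stand-in for the VB notifier described above: installed in place of
    the renamed cmd.exe, every call leaves a timestamped record that an
    admin can review or alert on.
    """
    with open(logfile, "a") as f:
        f.write("%s invoked with args %r\n"
                % (datetime.datetime.now().isoformat(), sys.argv[1:]))

# demo against a throwaway log file
fd, log = tempfile.mkstemp(suffix=".log")
os.close(fd)
decoy_main(log)
print("invoked with args" in open(log).read())  # True
os.unlink(log)
```

The point isn't the logging itself but the mismatch it creates: an automated attack script that expects CMD.exe gets a recorded no-op instead.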
Re:Just Obscurity, not Security (Score:2)
That's absolutely right, and one of the huge advantages that Linux has over any MS product is that you can configure any options in or out at install time. Even post-install, the kernel and every type of service is yours to do with as you see fit.
Re:Just Obscurity, not Security (Score:3, Informative)
Re:Just Obscurity, not Security (Score:3, Informative)
Re:Just Obscurity, not Security (Score:3, Interesting)
The box survived three or four "security tests" from consulting firms with no issues. It was one of three NT machines in the company that survived, and the other two were actually OS/2 boxes running SAMBA.
Just goes to show...
-WS
Re:Just Obscurity, not Security (Score:3, Interesting)
You don't need the source code, you just need to have Windows. Understanding of Windows features leads to understanding of how to be annoying to other Windows users.
On the other hand, I would have no idea how to attack a Linux user. If I were to get familiar with Linux, I could start to cook up ideas.
Re:Just Obscurity, not Security (Score:2, Informative)
Re:Just Obscurity, not Security (Score:3, Interesting)
Re:Just Obscurity, not Security (Score:2, Interesting)
It's a little like locking your door. You won't stop the rare really skilled thief with a lockpick, but it'll deter all the guys who go around trying every doorknob.
Re:Just Obscurity, not Security (Score:2)
> \rm
>
> alias
>
You get the idea - it would annoy them, but not for long. You might rename the binaries, but you'd have to reengineer all your system scripts.
Fun though - might edit some rc files next time I see someone leave their terminal logged in. :)
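Hunting down every script that still calls a renamed binary is mostly mechanical. A rough sketch of the hunt (the rc fragment and the rename list are invented for illustration):

```python
import re

def find_command_refs(script_text, commands):
    """Return the renamed commands a script still refers to by old name."""
    refs = set()
    for cmd in commands:
        # match the command as a whole word, so 'rm' doesn't match 'rmdir'
        if re.search(r"\b%s\b" % re.escape(cmd), script_text):
            refs.add(cmd)
    return refs

# invented rc-script fragment and rename list, for illustration only
rc_fragment = "rm -f /var/run/*.pid\nmount -a\nifconfig lo0 up\n"
renamed = {"rm", "mount", "ls"}

print(sorted(find_command_refs(rc_fragment, renamed)))  # ['mount', 'rm']
```

Run something like this over /etc/rc* before renaming anything and you have the list of scripts that need reengineering.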
Re:Obscurity can offer Security (Score:2)
It seems to me that having the source code of an OS or a product might reveal potential exploits, but the most damaging attacks we've seen so far seem to come from exploitation of features designed by a company trying to be more enticing. I can understand the negativity towards your concern; however, I do not condone it.
You are right in that the more you know about a program deep down, the more capable you are of damaging it.
Re:ah, but "root" not required (Score:2)
Run an intrusion detection program [tripwire.com] from a physically remote computer. Such a program compares a snapshot of the system (stored on the remote computer) to the current system. A reinstallation will be detected and reported. In order to defeat this system, the intruder needs to physically compromise two machines at once. You can even set up intrusion detection from several remote machines to guarantee that physical access isn't a risk. Problem solved.
Frankly, I don't see how your "source modification and reinstallation" attack is a risk specific to open source systems. There are utilities that can accomplish the sort of things you're talking about without modification of source code, and if an attacker has physical access to a machine, they'll be able to get in regardless of what OS you're running.
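The comparison Tripwire does boils down to hashing files and diffing against a baseline kept out of the attacker's reach. A minimal sketch of the idea (this is not Tripwire's actual implementation, just the principle):

```python
import hashlib
import os
import tempfile

def snapshot(paths):
    """Hash each file; in a real deployment the result lives on a remote box."""
    return {p: hashlib.sha256(open(p, "rb").read()).hexdigest() for p in paths}

def detect_changes(baseline, paths):
    """Return the files whose current hash no longer matches the baseline."""
    current = snapshot(paths)
    return [p for p in paths if current.get(p) != baseline.get(p)]

# demo: a throwaway file stands in for a system binary
d = tempfile.mkdtemp()
target = os.path.join(d, "login")
open(target, "w").write("original binary")
baseline = snapshot([target])

open(target, "w").write("trojaned binary")   # simulated intrusion
print(detect_changes(baseline, [target]) == [target])  # True: change flagged
```

Because the baseline is stored elsewhere, a trojaned or reinstalled binary shows up as a mismatch no matter how carefully it mimics the original's behavior.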
Re:ah, but "root" not required (Score:2)
I think the point here is that management must be aware of the risks. And, the risks are the same with the OS code as they are with custom applications. Except that everyone has the generally available source code for the OS but not your key custom apps.
It is specific to open source simply because the source is what is changed. It provides the mechanism.
The mistake is to falsely assume that the risks are identical when they clearly are not. Physical access does not equalize the risk between having the source code and not having it. That is no more true than telling a support person they do not need the source code to fix a bug because you gave them unrestricted access to the hardware.
The code is essential to fix bugs in key custom applications. And, the code makes it much easier to modify the OS. Any hacker knows that for a fact.
Yes, open and closed systems can both be attacked. But, the risks are not the same. The methods are not the same. And, the management that must be employed also must take the differences into account.
As I suggested above, one way is to customize the OS and put the source code for that custom version in your vault.
In other words, take an open source OS, modify it and add the value of obscurity (for what it is worth) by controlling access to the code that was used to build the OS you run.
It is not one or the other as many like to paint it. You can have both. Take the base distro that benefits from open source, customize it enough to make it "different" in key ways and then place your source into a secured system thereby gaining from the obscurity you placed on your version.
That is one thing you can do with open source that you can not do with binaries. Oh sure, you can do some things with just the binaries. But, that is no fun, not easy and has significant limits.
Many who prefer open source do so because of how they can customize it. And, that is a primary benefit for many installations. The point being made here is that the process can enhance security as well. And, part of that enhancement is the obscurity you can place on your version.
I think people have to forget this idea that security can not be enhanced via obscurity. Clearly it can. But, that does not suggest that closed source systems are more or less secure at all.
What do you do with the source code for your key custom applications? Do you publish them? If your stuff is written under the GPL and you do distribute them, then perhaps you have to. But, if you do not distribute them, I doubt anyone publishes the code just "because". At least not if those are for key custom applications. People are not going to walk in off the street and advise you your code has a bug in it.
There is a difference between distributing something under the GPL and gaining the benefits because of doing that; and not publishing the source code for key applications. And, in my book an OS running on my key systems is a key "application".
So, the equation is between the source of your custom apps and the source of your OS not between open and closed source for operating systems. There is a big difference between open and closed source for an OS.
Re:ah, but "root" not required (Score:2)
My interpretation of your original argument is as follows: since code is available for open source operating systems, hackers can modify this code and install the modified code in systems they've accessed (physically or remotely) to gain information about the system and network. This is a classic trojan attack, and it's been implemented against a wide array of operating systems, closed and open source. Closed source utilities are typically modified via standard reverse engineering techniques. It is more tricky to modify a utility that you don't have the source to, but not significantly more tricky. Remember, modifications to an open source utility have to keep that utility working and compatible, which can be a nontrivial engineering problem. Plus, there are plenty of ready-to-install trojans out there for all sorts of operating systems. A good sysadmin will guard against trojan attacks by running an intrusion detector.
Also, are you arguing that open source has an advantage because a sysadmin deploying an open OS can gain some obscurity advantage by altering the system code and recompiling? This might be possible, but it would be stupid. The slight advantage gained in having a slightly different OS would be outweighed by the loss of the support of developers on the main code branch. Bugs that you introduce in modifying the OS don't get fixed, and fixes for existing bugs are no longer compatible with your modified OS.
I would argue that from a security point of view, the main difference in open and closed source comes from the development process. Many eyes, shallow bugs, and all that.
Re:ah, but "root" not required (Score:2)
Replacing logons, installing key loggers, etc, is not difficult to do on any open or closed source OS I am familiar with (various commercial unixes and MS NT family) given you have physical or administrator access to the machine.
How do you know when your closed source OS has been modified/replaced/etc without your approval?
HTH.
Re:...nothing to do with open/closed source, then (Score:2)
Think about it another way: does the open or closed source nature of software affect in any way the distribution or propagation of viruses?
Security if YOU own the source code (Score:2)
I understand that this is an expensive proposition, however this is what we do where I work.
This way any bugs/exploits can still be researched and fixed by the good guys, and the bad guys are just shooting in the dark.
Not that we intended to have all of our COTS (Commercial Off The Shelf) to go end of life, but you make do!
However when UK air traffic goes down for a few hours and the only developer who knows the product is in hawaii for two weeks on his honeymoon (yep. That was me.) you have a problem!
Re:Just Obscurity, not Security (Score:2)
So you think the 'script kiddies' can easily hack, say, VM/CMS or MVS? Just do some simple research and plow through the manuals? Ever read those manuals? It's like Sanskrit, not simple cook-book stuff. JCL makes Perl look like English.
These old systems can certainly be hacked, but not trivially. Go look on the net and you will find very little on hacking these types of systems. Then look for info on hacking Unix and especially Windows.
=brian
Re:Just Obscurity, not Security (Score:2)
On that note, Win9x isn't that bad, out-of-the-box and run by an end-user who isn't an idiot (read: doesn't install the "latest and greatest" spyware and trojans).
Think about it - a Win9x box, out-of-the-box, doesn't really run any services. As long as file/print sharing is turned off (and the firewall should be blocking these ports anyways), there's no IIS to 0wn via Code Red or Nimda, you don't have to install Outbreak Excess, and you can always install Nutscrape or Opera instead of IE.
Would I use such a box as a server? Never. But for a basic web-surfing and gaming box, why not?
The real point about security-through-obsolescence is that the crackers upon whom the skr1pt k1ddiez depend aren't actively looking for new 95, 98, or 98SE 'sploits, because it's no longer the cool thing to crack.
AIX old and obscure? (Score:2, Interesting)
Um, I don't know about you but last time I checked, AIX is far more capable than most UN*Xs out there at just about everything.
By no means is it "old" or "outdated."
Re:AIX old and obscure? (Score:2, Interesting)
Re:AIX old and obscure? (Score:2)
Cuz I thought AIX was the dot in dot com (or com dot, whatever)
Just introducing new problems (Score:4, Insightful)
Re:Just introducing new problems (Score:2)
On the other hand, depending on the product, the newer versions are the security patches, so ultimately you do end up upgrading by following this logic.
Best course? Somewhere in the middle. If you're interested in security, stay off the cutting edge, but don't run something so far back that it's been superseded by newer versions.
Re:Just introducing new problems (Score:2)
3rd parties will pick up the slack.
Nice points but... (Score:5, Insightful)
The main problem is that most vendors stop supporting old products. This creates a huge security threat. Just because no one knows about security holes don't mean they exist.
Sure, you've eliminated probably 99% of all script kiddie threats, and if that's the only threat you can identify then by all means this is a cute idea. However, as security administrator at my company I do my best to secure against any and all threats, which means I must presume that old versions of Solaris (for example) have gaping security holes that were never fixed, and therefore running the latest and greatest with all applied security patches and a rock-hard configuration is my best bet when it comes to security.
Roblimo's friend does have a point, though, regarding Macs. Old Macs are really the most secure systems out there, simply because they can't really do much. They weren't designed to be networked, and so there aren't any services to exploit.
--
Garett
Re:Nice points but... (Score:2)
err... doesn't mean they do not exist.
--
Garett
Macs WERE meant to be networked! (Score:2, Insightful)
You can run appletalk on ip.
Re:Macs WERE meant to be networked! (Score:2)
What I meant to say was that Macs weren't designed to run services like HTTP, SMTP, DNS, etc.
They were made for home users and that's it. Any networking capability they had was solely to either make home users' lives easier or to bring in some cash (like selling to schools, where networking is important).
--
Garett
NASA follows the same logic. (Score:2, Insightful)
Besides, when was the last time you moved a production box (i.e., a business-oriented box) to the -CURRENT tree of anything?
Re:NASA follows the same logic. (Score:4, Informative)
Re:NASA follows the same logic. (Score:2)
Re:NASA follows the same logic. (Score:2)
Re:NASA follows the same logic. (Score:2)
Debian (Score:3, Funny)
Isn't that how Debian works anyway?
Benefits to running older software (Score:3, Interesting)
The down side is that if a problem does emerge, there's not a lot you can do if the vendor stops maintaining it. However, for critical infrastructure like routers, vendors typically keep old releases alive for a long length of time and continue to release updates to the old branches.
I've been doing this for years (Score:2, Funny)
I owe it all to the fact that nobody knows enough about the software to hax0r into it and steal my personal data.
HA! (Score:2, Funny)
Understatement (Score:2, Funny)
Re:Understatement (Score:2)
Or maybe "Windows for Dummies"... but then, that's redundant! *ba-dum* Thank you, I'm here all week.
Gopher (Score:5, Funny)
Re:Gopher (Score:2)
Just don't use Internet Explorer to view your collection. [slashdot.org]
OMG! This is fantastic! (Score:2, Funny)
Re:OMG! This is fantastic! (Score:2)
No, man, throw it in your kitchen and make it a VaxBar [arizona.edu].
Old systems reduce the field of possible crackers (Score:2, Interesting)
Fort Knox; aka MS-DOS (Score:5, Interesting)
Re:Fort Knox; aka MS-DOS (Score:2, Informative)
Besides, exploiting a buffer overflow could allow the attacker to upload some code that would overwrite memory with the contents of some special packets. The attacker could even install another OS over the net this way :-)
Re:Fort Knox; aka MS-DOS (Score:5, Funny)
Cool... just what everyone needs... a single-user, single-tasking firewall.
Why not call it a brick-wall?
Re:Fort Knox; aka MS-DOS (Score:2)
Re:Fort Knox; aka MS-DOS (Score:2)
What do you think you're doing? (Score:2)
Sounds like the ATC's position... (Score:3, Informative)
Of course, a system like this is still subject to physical abuse, and an old system that is broken into pieces is just as bad as a new system that is the subject of a DoS....
Re: (Score:2)
Dammit, don't let the secret out! (Score:3, Funny)
Not to mention that the laptops we ship the DOS software on get stolen a lot less frequently, since our DOS software will run on 286s...
Old software. (Score:2)
Of course, the flip side would be that the whole OS is toast as soon as a vulnerability is found. Hell, Apple won't admit they even _made_ A/UX any more.
--saint
(Seriously. Try to find it on their site. You'll find Newton stuff first.)
What I Don't Get. (Score:2)
So what happens if there are a lot of webservers, etc., out there that run obsolete software for this very reason? Hackers don't exploit a particular OS, webserver, etc. just because it's new; they also do it because that particular flavor is popular.
Even if the software is old by today's standards, rest assured that as long as it's running on a lot of servers and PCs, it'll still get attacked.
On another note, I agree that when a particular OS or piece of software is out in the "wild" for a long time, it gets scoured for weaknesses and patched accordingly. Eventually the OS/software becomes robust and secure over time. In the end it's not so much a question of age as of whether it's strong and secure. And that's what matters most.
Hello? Sperry Rand? (Score:2)
A UNIVAC I? Mmmmmm, mercury delay line storage, 500 microsecond memory speed, and 5,600 tubes. What more could I ask for!
Ahh, I see. (Score:2)
hmm,
-Pete
MVS - S360 - S370 - S390 - ZSeries (Score:5, Interesting)
This article just goes to show that good security is hard, and is often an afterthought.
Re:MVS - S360 - S370 - S390 - ZSeries (Score:2)
Serendipitous advertisement (Score:2, Funny)
The guy with the C64 webserver was right (Score:2, Funny)
Previously discussed on slashdot back here [slashdot.org]
CP/M to foil Unix hackers (Score:2)
Re:CP/M to foil Unix hackers (Score:2, Insightful)
We used to have fire but the inventor died (Score:2)
This is the flip side of saying non-disclosure is more secure than disclosure. Obsolete means nobody knows about it; whether anyone gives a shit about it or not is a different question. If we had all sorts of PDP-11s or Link analog computers around here, I'm sure that eventually someone would break them just because they're there.
Flawed.. (Score:5, Interesting)
I believe there is a little bit of confusion in this article between obscurity in the sense of software not being widely used and obscurity in the sense of proprietary closed-source software. There is also the confusion of software _differences_, which the author of this article bungles together with software age. In any case, this article is seriously misguided. Let me explain:
There is an Object. It could be your physical hardware, your OS, or simply a version of a software package. Imagine two generic Objects, Object-A and Object-B, exact in every practical way. Now imagine an Exploit that works on Object-A (and a cracker has access to this object). It also works on Object-B (your object) because they are identical. Now imagine there is an Object-C. It is very similar to Objects A and B, but has a few slight differences. Now the Exploit will need to change to accommodate this. This is _security_. This is the same principle viruses (biological or computer) work on. The differences between objects make them secure. The less difference, the less secure.

Think of any *ix security measure. Passwords, for instance, are simply ~8-character differences (plus a login name) between one *ix and the next. Attempting to break a password by trial and error is impractical. Crackers rely on this principle of _similarity_ of systems to break passwords. They download a system's password file and use a "word file" to crack passwords. This word file is merely a list of commonly used passwords--again, the principle of similarity. Most *ix systems have a password file in a common format, and there are common passwords. Common system properties (/etc/passwd, etc.) + common user psychology turn what is a very secure method (passwords) into a very insecure one. One small admin change could make the difference between a system being cracked or not (such as moving daemons to a "strange" location or partition, etc.).
Software age has nothing to do with security. The article really ties many separate issues together, and it really is not a good idea to just use older software for security's sake.
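The word-file attack described above fits in a few lines. A sketch using SHA-256 as a stand-in for crypt(3), with made-up accounts and a tiny word list:

```python
import hashlib

def hash_pw(pw):
    # SHA-256 stands in for crypt(3) here, purely for illustration
    return hashlib.sha256(pw.encode()).hexdigest()

# a tiny "word file" of commonly used passwords
word_file = ["password", "letmein", "dragon", "qwerty"]

# made-up password file: one common password, one genuinely random one
shadow = {"alice": hash_pw("dragon"), "bob": hash_pw("Xq9!v2#pLm")}

cracked = {user: guess
           for user, h in shadow.items()
           for guess in word_file
           if hash_pw(guess) == h}

print(cracked)  # {'alice': 'dragon'}: the common password falls, the odd one survives
```

This is exactly the similarity principle: the attack costs nothing against accounts that look like everyone else's and gets nowhere against the one that doesn't.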
My new filing technique is unstoppable! (Score:4, Funny)
I have no network. My backups are stored on 5 1/4" floppies.
Not only can no one read these things, they'd need a truck convoy to haul them away. No way in hell they're sneaking past security with a motherfucking semi truck!
Well, maybe (Score:2)
Most people will know this, but I have to quote Jamie Zawinski [google.ca]: But as we all know, Linux is only free if your time has no value, and I find that my time is better spent doing things other than the endless moving-target-upgrade dance...
I could be wrong, but the knowledge and practical experience needed to try something like this looks to be of little worth to the people who'd want to do it.
Security through obscurity (Score:3, Insightful)
This is a good example of security through obscurity, particularly the MacOS example in the article. Obscurity is no basis for a security model, but a little obscurity thrown in on top of some real security can't hurt.
For example, a tech I know runs a MySQL server that shouldn't be exposed to the outside world. It's behind a firewall and the port is blocked, fine. It's also run on a non-standard port. Why? Because if somebody cracks the main network, they still have some work to do to get to find the MySQL server. That's time to discover the intrusion and fix the leak.
Summary: Security through obscurity: bad. Security + obscurity: good.
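For what it's worth, a renumbered port only holds up until someone runs a full sweep. A toy sketch of why (the port numbers and service map are hypothetical):

```python
def scan(host_ports, port_range):
    """Toy connect-sweep: a real scanner like nmap does this over TCP."""
    return {p: host_ports[p] for p in port_range if p in host_ports}

# hypothetical host: MySQL moved off the default 3306 to 13306
services = {22: "ssh", 13306: "mysql"}

found = scan(services, range(1, 65536))
print(found)  # {22: 'ssh', 13306: 'mysql'}: a full sweep turns it up anyway
```

A full TCP sweep takes minutes, so the move buys minutes, not secrecy, which is exactly why it only makes sense layered on top of the firewall and the blocked port.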
Re:Security through obscurity - BAD (Score:2)
There is no such thing as a "good" example of security through obscurity.
The biggest problem with security through obscurity is not that it doesn't provide security (although this is one of the problems), but that it provides a false sense of security.
if somebody cracks the main network, they still have some work to do to get to find the MySQL server. That's time to discover the intrusion and fix the leak.
This is a perfect example of the problem with it.
Your friend probably thinks that the "non-standard port" thing is pretty clever, and that it gives him time - he thinks that he's done something to secure his network, when in reality he hasn't; the system is just as vulnerable as it was before he moved the port, but he believes that it's more secure. This is hubris at its worst.
Incidentally, using old software is not necessarily obscurity - in general, older software has fewer features and fewer lines of code, and therefore fewer potential bugs. Fewer bugs means fewer potential security problems.
Security through Maturity (Score:3, Insightful)
This is not to say that OS and software companies do not try to thoroughly test their software. They do. But even in the largest, most sophisticated test lab, one cannot recreate all of the possible conditions that will be revealed when the software is released into the real world.
The reasons older (obsolete) software may be more secure are really twofold. First, the creeping featurism that haunts all software development adds features, which add chances for security holes and errors. I assert that the increased features, and especially the increased interfaces (user, programmatic and otherwise), increase the likelihood of security issues. The second issue is that older (obsolete) software is more mature. Please understand this carefully - older software that has been patched to the current patch level will be more secure than software that has not been patched.
I think equating obsolete software with security is quite a stretch. I do agree with the thought that mature software will have fewer security issues. Added to this, the fewer interfaces of older software give it a greater chance to be free from security issues.
-tpg.
Security through variety (Score:2)
What it does not stop is those who live off hand-me-downs. My experience with a Pentium 200 is that it's not much fun browsing the web with it.
The rule of affordance states that locks are meant to be picked.
We do exactly that! (Score:2)
Look at crypto. (Score:4, Informative)
This seems like a much better model for OS development than "let's hope no one remembers that old trick".
=brian
security through obsolescence (Score:2, Insightful)
Put it in the compiler? (Score:2)
Wonderful idea. Pseudo-rant ahead. (Score:2)
Oh, and occasionally development occurs only because of a serious exploit that requires immediate attention. Let's install BIND 8.0, hoping that the script kiddies will not observe this blatant error, oblivious to the fact that experienced (cr|h)ackers would perceive exploiting such an application or operating system as a trivial activity.
This concept is nothing more than an esoteric form of "security by obscurity." It disappoints me that the Slashdot editors would begin to advertise such a blatantly rhetorical and poor security practice.
There's this thing called the Internet (Score:2, Insightful)
It's a similar problem to that faced by music companies trying to copy-protect CDs -- all it takes is for ONE person to rip the protected CD, then it's out there.
Proves an old adage... (Score:2)
YMMV - Re:Windows 2000 (Score:5, Informative)
And this explains why my honeypot with a W2K Server default install is compromised in 15 minutes on average?
Let's see: currently Netcraft reports that IIS represents 27% of active sites against Apache at 63%, and IIS machines still, to my knowledge, represent 70+% of the defacements.
Gartner recently advised that due to the continued problems with IIS, organizations should stop deploying / replace IIS.
Re:YMMV - Re:Windows 2000 (Score:2)
Your argument makes no sense whatsoever.
I've deployed IIS on several different occasions and have had no problems so far. Maybe if you actually read the freakin manuals, you'll be able to have a decent server.
Re:YMMV - Re:Windows 2000 (Score:2)
Properly hardened (and Microsoft provides the tools to do it), a Win2k box is damn near impenetrable, certainly on par with a *nix box.
However, too many people throw up a Win2k box without bothering to lock it down. That is where the problem comes from. And if someone says that *nix out of the box is nearly impenetrable, they're wrong (OpenBSD being the only exception I know of). If there's a port open to the world, it can be hacked. A number of *nix servers have a bunch of stuff open by default, just like Win2k.
It's the admin's fault, not the OS's. Don't confuse the two.
Oh, and if someone doesn't believe me about the tools, look up iislockd on the Microsoft site. Great tool that will just about completely lock down IIS according to parameters you provide.
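On the "if a port is open to the world, it can be hacked" point: checking what you actually expose is one connect attempt per port. A minimal sketch (the throwaway listener here just simulates an exposed service):

```python
import socket

def is_open(host, port, timeout=0.5):
    """TCP connect check: success means something is listening there."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# stand up a throwaway listener to play the role of an exposed service
listener = socket.socket()
listener.bind(("127.0.0.1", 0))      # port 0: the OS picks a free port
listener.listen(1)
port = listener.getsockname()[1]

exposed = is_open("127.0.0.1", port)
print(exposed)  # True: this port is a target until it's closed or firewalled
listener.close()
```

Running a check like this against your own box from outside is the quickest way to find the "bunch of stuff open by default" before someone else does.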
Re:YMMV - Re:Windows 2000 (Score:2)
Firewalls and Win2K Security (Score:5, Interesting)
Firewalls are a valuable tool, but too often they are used as some form of magic security bullet. This leads to "crispy, chewy" environments - a hardened external-facing posture (crispy) that is wide open internally (chewy center goodness). Once an attacker (external attacker who manages to compromise an internal machine via allowed inbound traffic, disgruntled employee, social engineer, worm / trojan, etc) manages to get inside the firewall, the entire infrastructure is open to attack. This assumes that the firewall itself doesn't fail in an unexpected way, gets compromised, or otherwise falls victim to human error while editing firewall rules.
WinNT/9x fail because of both design and implementation flaws, not because of advancing technology. It might be closer to the truth to say that we have an advancement in the art of information security - advancement in understanding. Unfortunately, not everyone pays attention to these advancements.
WinNT and Win9x are entirely different from each other. WinNT presents a much more secure platform than Win9x. But it does have its own flaws. And some of these same design flaws are also found in Win2k.
We can see one of these design flaws by comparing Win2k to Unix. One of the advantages of Unix systems is their modular architecture. One can harden a Unix system by disabling and removing unneeded components. Win2k is much more difficult, as many components are not designed to be removed. On removal, one will discover surprising interdependencies among these and other components. Furthermore, the entire process of hardening and applying service packs and hotfixes must be redone if one adds new components or software to the system, as this process tends to undo previous hardening / patching. Within the Unix environment, components rarely contain unexpected interdependencies, are easily removed, and do not sneak back in during system patching or installation of new services.
While it IS possible to harden a Win2K machine, the process is often difficult and further complicated by the act of maintaining the host. Unix is far easier. And complexity endangers security. This alone puts considerable doubt on the supposed superior security stance of Win2K over Unix.
There are plenty of other issues to consider in "Unix vs. Windows" security comparisons. However, that would push this thread even further off-topic.
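The "components sneak back in during patching" problem is why hardened shops re-audit after every service pack. A toy audit sketch (the service names and baseline are hypothetical):

```python
def audit_services(running, allowlist):
    """Flag anything running that the hardening policy doesn't allow."""
    return sorted(set(running) - set(allowlist))

# hypothetical post-service-pack state vs. the hardened baseline
running = ["sshd", "syslogd", "portmap", "telnetd"]
allowlist = ["sshd", "syslogd"]

print(audit_services(running, allowlist))  # ['portmap', 'telnetd'] crept back in
```

The same diff-against-baseline habit works on either platform; the difference the comment describes is how much keeps reappearing in the diff after each round of patching.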
Excellent troll. :p (Score:2)
When an open source application is entering development or hasn't yet matured, you're absolutely correct. It is certainly easier to detect exploitable bugs in open source software; however, only so many of them exist. During the phase of intense scrutiny that many open source applications encounter as they mature, most of the exploitable code is eliminated by the developers.
In a closed source application, the bugs do exist, but are obscured temporarily by the intentionally inaccessible code. They are, at one point in time or another, located by an individual who dedicates the amount of time and resources required to discover the nearly enigmatic exploits.
Closed source applications delay the inevitable, whereas open source applications simply permit the inevitable to occur during their various phases of maturation.
Re:My problems with this article... (Score:2)
Well, if you run off a system like early Mac OS, where getting a command line is nigh-impossible, especially remotely, then you're in a pretty good spot, no?
Re:riiiight..... (Score:2)
Re:riiiight..... (Score:2)
Re:Does not apply for MS products (Score:2)
Amusing, eh?
Re:Does not apply for MS products (Score:2)
Maybe now that MS claims they are focusing on security, they will start SHIPPING new products locked down.
Re:Does not apply for MS products (Score:2)
MS has something called the "IIS Lockdown tool" that basically does what you described. I ran that during the whole Nimda thing and it basically turned off all the features of IIS.
Then I used 'URLScan' (also an MS tool, I think... hopefully I didn't miscredit it), which is a filter applied to the URL before it gets past IIS. You could filter things like 'CMD.EXE', and so on.
After I installed both those patches, I felt pretty comfortable with the webserver. It's been up for months now without a single fault. I don't know about the rest of you, but I'm surprised. Heh.