Justifying the Common Criteria Security Evaluation
lewko writes "Microsoft has just received a Common Criteria certification for Windows 2000 at Evaluation Assurance Level (EAL) 4. Security experts have been saying for years that the security of the Windows family of products is hopelessly inadequate. Now there is a rigorous government certification confirming this. What does it all mean? This paper suggests that Microsoft spent millions of dollars producing documentation that shows that Windows 2000 meets an inadequate set of requirements, and that you can have reasonably strong confidence that this is the case. Microsoft bashing aside, the process of evaluating a security product is relevant to anyone considering the deployment of technology into their environment." The EROS operating system he mentions looks interesting - of course, it also looked interesting three years ago.
What's secure? (Score:4, Interesting)
What did Linux get? (Score:2, Interesting)
Re:What did Linux get? (Score:5, Interesting)
One example that immediately comes to mind is that "ps" listings can't show other users' processes. Many of the C2 requirements are kind of like that.
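To make that concrete, here is a minimal Python sketch of the idea (it assumes a Linux-style /proc and is purely illustrative, not anything from the certification documents): a "restricted ps" that only lists processes owned by the calling user.

import os

# Minimal sketch: list only the calling user's processes, the behaviour
# described above for C2-style "restricted ps" listings.
my_uid = os.getuid()
for entry in os.listdir("/proc"):
    if not entry.isdigit():
        continue  # skip non-process entries
    try:
        if os.stat(f"/proc/{entry}").st_uid != my_uid:
            continue  # hide other users' processes
        with open(f"/proc/{entry}/comm") as f:
            print(entry, f.read().strip())
    except OSError:
        pass  # process exited while we were looking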
Re:What did Linux get? (Score:3, Interesting)
I don't consider ISO/IEC 15408 machines a burden, especially compared to the alternatives; most user-level programs may never even be able to tell they are there. As Linux is open source, it is trivial (well, insofar as kernel hacking is trivial; maybe 'possible' is a better word) to add the necessary options for ISO/IEC 15408. It seems easier to do this with the BSDs because they tend to be simpler in design.
Re:What did Linux get? (Score:2)
You are correct about the auditing claim. I suggest you read selected sections 4 through 12 of version 2.1 of the CCITA part 2, available at:
http://www.commoncriteria.org/cc/cc.html
Most pertinent of these, section 6 "User data protection", will qualify, upon investigation, the statement I have made regarding the necessity of ACLs, sudo, non-rooting boot params, and a non-transferable FS. (see: FDP_{ACC,ACF,SDI}, and FTP_FLS in S.10.2) Note that I specified these as sufficient, not necessary.
I do not understand the relevance of the remainder of your comments, except perhaps the BSD references, which are partly agreeable, but I suspect your perspective is biased. Of all the free software developers I have met, Theo de Raadt of OpenBSD fame is among the more finicky. Thankfully his skill and knowledge compensate more than adequately.
Re:What did Linux get? (Score:2, Informative)
I can't see how that would be required for C2 (CAPP in the CC). The old B2 (Structured Protection) was the first level that required covert channel analysis. Granted, that's a pretty obvious covert channel, and you might see it as a kind of quasi-legitimate IPC. In that case the B1 (LSPP) level would require it to follow its normal rules of compartments and levels.
Re:What did Linux get? (Score:5, Interesting)
I know some commercial Unixes are certified to C2 if you have it configured right. What about the Linuxes?
Glad you asked. Some people might look at the fact that Linux doesn't have an XYZ 'certification' as an indication that it is not secure enough to get one.
In reality, such certifications cost a lot of money, and small companies like RedHat simply can't afford it (they don't make enough money off release X.Y during its market life to justify such an operation).
What is interesting about this new Windows 2000 certification is that it's for a system that operates in a "safe" environment (i.e. not on the Internet) and Microsoft specifically asked, and paid, for grading at this level.
Now, you can interpret that as you want, but most of us probably understand it as "This is how secure Microsoft thinks Windows 2000 actually is". (Such gradings take a long time (a few years), and I doubt that Microsoft will have another go at a higher grading before the EOPL (end-of-product-life) of Windows 2000.)
Re:What did Linux get? (Score:5, Interesting)
In reality, such certifications cost a lot of money, and small companies like RedHat simply can't afford it (they don't make enough money off release X.Y during its market life to justify such an operation).
No, Linux would fail evaluation because it does not meet many of the important security requirements. In particular there is no system security guide that describes how to securely configure the O/S in a single place.
Documentation is a large part of the C2 criteria. Linux simply fails that test. You cannot get certification for a third party guide for good reason, the document has not been reviewed by the engineers who wrote the code.
It is interesting to note how the Fox News-style bias of Slashdot on the security topic gets more hysterical by the month. Could it be because analyst firms like Aberdeen are predicting that Linux will become the poster child for security [newsfactor.com], and no, they don't think it is more secure.
So Microsoft gets a security evaluation, and the Slashdot response is to publish the story three times to date, each time claiming that it is further proof that Microsoft's products are insecure. At what point do people ask whether the Slashdot editorial style has more to do with the commercial interests of their employer than an interest in honest journalism?
Re:What's secure? (Score:1, Interesting)
Re:What's secure? (Score:2)
You really start to notice the vulnerabilities if you install a fresh copy of win2k and have to patch it up. Takes about 30 minutes and 7-8 reboots. You are partially correct though: the actual win2k system itself has had very few vulnerabilities; most are due to the add-ons such as IE, OE, and IIS.
Re:What's secure? (Score:4, Informative)
IE is embedded into Explorer, NOT the OS (i.e. the kernel). You can easily run Windows with a different shell (why?).
OS != kernel (Score:2, Insightful)
IE is embedded into Explorer, NOT the OS (i.e. the kernel).
Grandparent said "OS" not "kernel". An operating system is more than a kernel.
You can easily run Windows with a different shell (why?).
Why? Easy. Explorer is a RAM hog compared to alternatives such as litestep.
Re:What's secure? (Score:1)
Win2k is a pretty young OS. It's bound to have patch requirements.
Re:What's secure? (Score:3, Insightful)
He never wrote that either. The OS is not the kernel, as Stallman would be more than happy to tell you. You yourself call Win2k an "OS", would you not agree that IE is integrated into Win2k?
can you tell me you can install a linux build from 3 years ago
As soon as you can find me a three-year old Linux distro STILL BEING SOLD AS NEW.
Microsoft could easily have patched their master disc and manufactured new Win2k Server CDs at any time during these three years since the initial release but they have not done so. They are still making and selling software that they know is defective without even a token attempt at fixing the most glaring security holes in their product. In my book, this not only borders on criminal negligence, it's a fucking full-scale invasion over said border.
Would you take kindly to Ford opening up an old warehouse and selling three-year old Explorers with three-year old Firestone tires labeled as "NEW FROM THE FACTORY"? No? Why not?
Win2k is a pretty young OS. It's bound to have patch requirements.
Three years is not young in the OS business (even if you take the time to read the years cited in the copyright notice when it boots). Considering the time and effort that Microsoft spent making it, they should have done a better job.
Re:What's secure? (Score:3, Informative)
Hrm, that's funny. I have win2kpro CDs here that are naked, or have SP1 already integrated, or have SP1 and 2 integrated. I can choose which CD to use, and I usually go for the latest one. This also is the case with win2kserver; the ones we have here have SP1 integrated. So you're wrong: buy Win2k (either version) and MS will have done what you are saying they haven't, and upgraded the base OS installed.
Re:What's secure? (Score:2)
OK, I'll admit my copies of Win2k Server are almost a year old now, but both were buck nekkid when I got them from the store. Hm, maybe the cheap, slimy bastards kept old copies on the shelves and sold them off? I wouldn't put it past them - they were bought in Microsoft's personnel shop in Redmond...
Are your copies regular retail copies or MSDN? If they're retail, I'll happily retract my statements and applaud Microsoft's efforts on this.
Re:What's secure? (Score:3, Informative)
What's more concerning than the need to install the security patches is the large number of known and unpatched vulnerabilities, which are still exploitable on most up-to-date Windows desktops.
I think you shouldn't need to reboot more than twice to install those patches, as the hotfixes can be combined using QCHAIN.EXE.
Re:What's secure? (Score:1)
Re:What's secure? (Score:2, Funny)
Re:What's secure? (Score:2, Informative)
Or, y'know, the version of Outlook that was spreading all those nasty worms.... it probably had some holes too.
Re:What's secure? (Score:2, Funny)
So is that why I get script monkeys flooding my webserver with crap like this?
146.83.216.249 - ... "GET /MSADC/root.exe?/c+dir HTTP/1.0" 404 1003
146.83.216.249 - ... "GET /c/winnt/system32/cmd.exe?/c+dir HTTP/1.0" 404 1003
Didn't need no millions-of-dollars report to convince me!
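If you want to quantify that kind of noise, a short Python sketch like the following will count the IIS worm probes shown in the log excerpt above (the log path and the exact patterns are assumptions, not from the original post):

import re

PROBE = re.compile(r"(root\.exe|cmd\.exe|default\.ida)", re.IGNORECASE)
hits = {}
with open("access.log") as log:          # path is an example
    for line in log:
        if PROBE.search(line):
            ip = line.split()[0]         # client IP is the first field
            hits[ip] = hits.get(ip, 0) + 1

for ip, count in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"{ip}\t{count} probe(s)")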
Re:What's secure? (Score:2)
So is that why I get script monkeys flooding my webserver with crap like this?
....yes. If apache has a hole that allows root access, is it the OS's fault?
The problem with your argument (Score:1)
Im at the karma cap... (Score:3, Insightful)
Re:Im at the karma cap... (Score:2)
Huh ? Like, use it to educate upper management in a civilized manner ??
Sometimes it's like trying to stone them to death with popcorn, but I believe sooner or later there will be enough reasons to "just say no to Microsoft". And when that time comes, they will need as correct information as possible to evaluate the possibilities.
Any Linux distros EAL4 or higher? (Score:4, Funny)
Well (Score:4, Interesting)
Re:Well (Score:3, Insightful)
Re:Well (Score:1)
I know there actually are additions to the NT 5.1 kernel (so new Native calls), but I'm wondering if it wouldn't be backward compatible with NT 5.
Re:Well (Score:4, Informative)
This seems to me longer than the time for which Windows 98 was allocated, but not for server releases. I heard or read somewhere that the lifecycle had been extended, but I could be mistaken. Either way, this gives it another 2-4 years of usage. I'm not sure whether that's useful or not. Product Lifecycles [microsoft.com]
Millions of Dollars on documentation?! (Score:1, Interesting)
Re:Millions of Dollars on documentation?! (Score:2, Informative)
EROS: The Extremely Reliable Operating System (Score:3, Insightful)
only secure when it's powered down (Score:2, Funny)
yeah, right. only when both systems are turned off
Re:only secure when it's powered down (Score:3, Insightful)
Re:EROS: The Extremely Reliable Operating System (Score:3, Interesting)
Thank you for saying that out loud... there is nothing more valuable than a sysadmin who knows his platform.
I've been hearing a lot of moft-is-not-secure 'proofs' lately... I'm just wondering: has anyone actually proven that the OS is structurally (i.e. by design) flawed?
A structural flaw, for example, would be that files have ACLs, but pipes don't. Or something of the sort... *not* that the default out-of-the-box configuration leaves a NULL ACL on \system32\cmd.exe (that is not a structural problem, it's configuration).
So long as someone doesn't show real facts when they claim to 'prove' something, it's FUD pure and simple AFAIConcerned.
Re:EROS: The Extremely Reliable Operating System (Score:1)
i thought it was a design flaw, like the kind you mentioned.
http://slashdot.org/article.pl?sid=02/08/06/182
so windows is flawed by design, in my opinion at least.
Re:EROS: The Extremely Reliable Operating System (Score:2, Interesting)
Despite your probable pleas to the contrary, you were not a regular user when you carried that out. Windows has ACLs on virtually everything in the OS (contrary to Linux, for example, with its very coarse-grained security), and the registry is no exception. The HKLM registry branch has only READ access for anyone but System and Administrators (in some cases also Power Users, which much like Administrators is not an account that you should regularly run under). The registry applications abide by these permissions quite simply because they have no way around them. I see two possible scenarios here: one, that you were in an account as Power User or Administrator, or two, that there is a complex fault that somehow bypassed the ACLs. I suspect the former as being dramatically more likely.
Having said that, you weren't actually trying to do that in a serious way, were you? (copying the tree from 98 to 2K) As a sidenote, virtually all Windows variants keep one or more backups of the registry tree, and choosing "last known good configuration" would have fixed it for you immediately.
Re:EROS: The Extremely Reliable Operating System (Score:2, Interesting)
This means that any program can screw my registry enough to leave the system unbootable. What's the point in running as normal user, then? Just try to rm -rf
On Linux, if I want to try a suspicious program I can create a new user account and try it there. If I want to be more paranoid I can chroot it and use strace to find out exactly what it's doing.
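For illustration, a minimal Python sketch of that "strace it first" approach (strace's -f/-o flags are standard; the program name and trace file are made-up examples):

import subprocess

# Run the suspect program under strace, following forks, and log all
# system calls to a file for later inspection.
result = subprocess.run(
    ["strace", "-f", "-o", "suspect.trace", "./suspect-program"],
    capture_output=True, text=True)
print("exit status:", result.returncode)

# Afterwards, look for writes the program has no business making.
with open("suspect.trace") as trace:
    for line in trace:
        if "open(" in line and "O_WRONLY" in line:
            print(line.rstrip())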
Now, if in Win2K it's possible to break the whole system as a normal user, where's all that security it's supposed to have?
Also, what registry tree? I've seen no detailed help files explaining every key of the Windows registry, what it's used for, and what would happen if it had too restrictive permissions. If those permissions are so badly set from the beginning it makes me think the reason is that many programs will break when they're unable to write to some places. If changing those ACLs would give me better security at the cost of breaking half of my programs, thanks, I don't want it. Linux works much better.
Theo de Raadt (Score:1)
Re:Theo de Raadt (Score:3, Interesting)
You mean have MS pay Theo and everyone connected with the OpenBSD project enough to persuade them that taking it proprietary and rebranding it Windows XX is A GOOD IDEA, right? Continuously checking Windows OS and applications for security fuckups is too big a job for one person, and probably too big a job for 1,000 persons.
Would the OpenBSD team sell out for $10 billion and the right to oversee future development?
Note that this would actually be an intelligent and cost-effective thing for MS to do, even if various code libraries have to be rewritten to avoid the use of GNU code of any sort, so we can take for granted that they'll never think of it for themselves.
While this is a lot more than MS paid for the rights to what later became MSDOS ($30K, IIRC), times have changed.
While this breaks compatibility with all MS applications, does anyone actually think anything less has the remotest chance of doing the job? Assuming the job is building a reasonably secure OS that can be made to work with a wide range of applications.
Umm... (Score:4, Informative)
Re:Umm... (Score:1)
Breaking "off". (Score:3, Funny)
The computer was off during the test.
Utter nonsence (Score:1, Interesting)
Oh, and if you use Visual SourceSafe, then you're covered. "Automated configuration management." Hogwash. This no more ensures you have a secure system than Suzi the Secretary checking to make sure you badged in the main door instead of surfing in behind Bob. Sure, it is tough to have a secure system without some kind of ConfigMan, but having one is not sufficient to ensure a secure system.
Oh, are all your tools identified (shades of ISO-9000!)? Golly, gee that's nice. So, we gonna check to see if all the old Lan Man code (which authenticates without credentials) is out of the current build? No? Oh, not a requirement.
What a load of tripe. I wonder how much they paid to have this cert. Probably more than an MCSE, and just as worthless.
There are no sufficient conditions in security (Score:5, Insightful)
There is nothing which *would* constitute a sufficient condition for security. You can't check any particular property, of the product or process, and say "Yup, it's secure." We should all know that by now. In general, the closest we come is to haul out a long list of known mistakes (the absence of which is a necessary but not sufficient condition) and hope not to find them.
It's also helpful to remember that the Common Criteria don't try to define a reasonable security certification. What they do provide is a list of things which might be interesting and ways of measuring those things. It's up to the "end user" to choose which things are important to them (define a protection profile).
Compare with the Orange Book (Score:3, Informative)
The sort-of-precursor to the CC, the DOD-5200.28-STD (Orange Book) [dynamoo.com] specified exactly who needed to be in the testing team. For "Division C" (Windows NT 4.0 is rated C2):
For higher security classifications, the qualifications of the testing team get higher. For Division A you need at least one individual with a bachelor's degree in Computer Science or the equivalent and at least two individuals with masters' degrees in Computer Science or equivalent. So, Safety Cap's point is well made - the method of testing and the personnel carrying it out are just as important as the technical criteria.
In case of slashdotting, full text, IANAKW, etc (Score:3, Informative)
Jonathan S. Shapiro, Ph.D.
Johns Hopkins University Information Security Institute
By now, you may have heard that Microsoft has received a Common Criteria certification for Windows 2000 (with service pack 3) at Evaluation Assurance Level (EAL) 4. Since a bunch of people know that I work on operating system security and on security assurance, I've received lots of notes asking "What does this mean?" On this page I will try to answer the question. For the impatient the answer is:
Security experts have been saying for years that the security of the Windows family of products is hopelessly inadequate. Now there is a rigorous government certification confirming this.
Since that's a pretty strong statement, bear with me while I try to explain it in plain English.
How a Security Purchase Should Work (In Abstract)
At the risk of telling you something you already know, here is how a purchaser ought to proceed when buying a security product:
Assess your needs. Determine what your requirements are.
Decide which product you are most confident will meet those needs.
Buy and deploy it.
Each of these is potentially an involved process, and most customers don't have the expertise to do them effectively. Even if you did, Microsoft (or any other vendor) isn't likely to let you examine their code and design documents in order to evaluate their product.
The purpose of the Common Criteria process is to develop standard packages of commonly found requirements (called Protection Profiles) and have a standard process of independent evaluation by which an expert evaluation team arrives at a level of confidence for some particular software product.
As a customer, this makes your life simpler, because you can compare your needs against existing requirements constructed by experts and then see how well the software you are buying meets those requirements. Security requirements are fairly hard to write down correctly, but if the resulting document is annotated properly they aren't all that hard to understand.
Obviously, if you don't know your needs (requirements) you don't stand much of a chance of getting them met. Likewise, if you don't know what requirements a software product was evaluated against, the evaluation result isn't terribly useful to you in practical terms.
How Common Criteria Works
From the customer perspective, a Common Criteria evaluation has two parts:
A standardized requirements specification called a Protection Profile that says what the system is supposed to do. Sometimes there will be more than one of these -- usually a general baseline protection profile and then some others describing additional, specialized requirements.
An evaluation rating. This is basically an investigation by well-trained experts to determine whether the system actually meets the requirements specified in the protection profile(s). The result of the evaluation is an "Evaluation Assurance Level" which can be between 1 and 7. This number expresses the degree of confidence that you can place in the system.
In order to understand the result of an evaluation, you need to know both the evaluation result, which will be a level between EAL1 and EAL7, and the protection profile (the requirements that were tested). Given two systems evaluated against the same protection profile, a higher EAL rating is a better rating provided the requirements meet your needs.
Knowing that a product has met an EAL4 evaluation -- or even an EAL7 evaluation -- tells you absolutely nothing useful. It means that you can have some amount of confidence that the product meets an unknown set of requirements. To give a contrived example, you might need a piece of software that always paints the screen black. I might build a piece of software that paints the screen red with very high reliability, and get it evaluated at EAL4. Obviously my software isn't going to solve your problem.
The Windows 2000 Evaluation
Microsoft sponsored an evaluation of Windows 2000 (with Service Pack 3 and one patch) against the Controlled Access Protection Profile (plus some enhancements) and obtained an EAL4 evaluation rating. This is most accurately written as "CAPP/EAL4".
Problem 1: The Protection Profile
The Controlled Access Protection Profile [ncsc.mil] (CAPP) standard document can be found at the Common Criteria website. Here is a description of the CAPP requirements taken from the document itself (from page 9):
The CAPP provides for a level of protection which is appropriate for an assumed non-hostile and well-managed user community requiring protection against threats of inadvertent or casual attempts to breach the system security. The profile is not intended to be applicable to circumstances in which protection is required against determined attempts by hostile and well funded attackers to breach system security. The CAPP does not fully address the threats posed by malicious system development or administrative personnel.
Translating that into colloquial English:
Don't hook this to the internet, don't run email, don't install software unless you can 100% trust the developer, and if anybody who works for you turns out to be out to get you, you are toast.
In fairness to Microsoft, CAPP is the most complete operating system protection profile that is presently standardized. This may be the best that Microsoft can do, but it is very important for you as a user to understand that these requirements are not good enough to make the system secure. It also needs to be acknowledged that commercial UNIX-based systems like Linux aren't any better (though they are more resistant to penetration).
Note that the "Don't install software" part means that you probably shouldn't install a word processor. On several occasions Microsoft has unintentionally shipped CDs with viruses on them. A CD with a virus qualifies as "malicious system development."
Problem 2: The Evaluation Assurance Level
Having described the requirements problem, I now need to describe the problem of the EAL4 evaluation assurance level that Windows 2000 received.
As I mentioned before, EAL levels run from 1 to 7. EAL1 basically means that the vendor showed up for the meeting. EAL7 means that key parts of the system have been rigorously verified in a mathematical way. EAL4 means that the design documents were reviewed using non-challenging criteria. This is sort of like having an accounting audit where the auditor checks that all of your paperwork is there and your business practice standards are appropriate, but never actually checks that any of your numbers are correct. An EAL4 evaluation is not required to examine the software at all.
An EAL4 rating means that you did a lot of paperwork related to the software process, but says absolutely nothing about the quality of the software itself. There are no quantifiable measurements made of the software, and essentially none of the code is inspected. Buying software with an EAL4 rating is kind of like buying a home without a home inspection, only more risky.
The Bottom Line for Windows 2000
In the case of the CAPP protection profile, there actually isn't much point to doing anything better than a low-confidence evaluation, because the requirements set itself is very weak. In effect, you would be saying "My results are inadequate, but the good news is that I've done a lot of work so that I can be really sure that the results are inadequate."
In the case of CAPP, an EAL4 evaluation tells you everything you need to know. It tells you that Microsoft spent millions of dollars producing documentation that shows that Windows 2000 meets an inadequate set of requirements, and that you can have reasonably strong confidence that this is the case.
Conclusion
Security isn't something that a large group can do well. It is something achieved by small groups of experts. Adding more programmers and more features makes things worse rather than better. Microsoft has been adding features demanded by their customers for a very long time.
It is possible to do much better. EROS [eros-os.org], a research operating system that we are working on here in the Systems Research Laboratory [jhu.edu] at Johns Hopkins University, should eventually achieve an EAL7 evaluation rating, and is expected to provide total defense against viruses and malicious code. It won't be compatible, because the most important security problems in Windows and UNIX are design problems rather than implementation problems. In fact, none of the viable research efforts toward secure operating systems are compatible with existing systems.
It remains to be seen whether EROS or one of the other attempts to build secure operating systems will prevail, but better solutions are coming.
Jonathan Shapiro is an Assistant Professor in the Department of Computer Science [jhu.edu] of Johns Hopkins University [jhu.edu]. He has been working on operating system security and assurance since 1991. His past research has yielded both formally verified security properties and dramatically improved performance results in secure operating systems. His current research focuses on tying these results together into a complete, usable system, and on evaluating and testing the correctness and reliability of the resulting system.
Dr. Shapiro is also a member of JHUISI [jhu.edu], the Hopkins Information Security Institute.
I was also interested in EROS... (Score:2)
Now that he's going with an MPL-style license, I guess he might be able to get more people interested. Unfortunately, like the GPL, there is only room for one product in that ecological niche at a time, and Linux is already there.
While capabilities are an interesting approach, I don't think this really has any bearing on the Microsoft certification, unless the intent of mentioning EROS was to make fun of the certification?
-- Terry
Auditing Win2k Security (Score:4, Informative)
The trouble I find is that I'm able to evaluate the level of diligence the IT staff at any given company has taken, I'm able to audit the level of (attempted) compliance to any documented security policy, and I'm even able to assess internal security configuration and controls.
Ultimately though, I'm signing off on audit opinions that ALWAYS say the same thing, and feeling a little sick about it. If we got sued, I could provide documentation proving that I diligently checked security and that, based on "accepted" business standards, the security was implemented at a reasonable level. Basically, I could cover my ass.
Is there anyone out there that has an audit program for Win2k that they would feel comfortable using to tell the auditors that they can rely on the numbers? Just curious.
Oh, BTW, the auditors could care less about Common Criteria, and even though they're thick as pudding about IT, they're still smart enough to bring in outside people when they need to rely on any computer's numbers.
Re:Auditing Win2k Security (Score:1)
In my case, I worry about the outside in, not really the inside out.
1) I run all machines through port scans, and ensure that none of them have the ability to run web/ftp/game servers. Knowing that the machines are not giving out information to the world through open ports makes me sleep a little better at night. (Something like the sketch after this list is the general idea.)
2) Scare the living crap out of my users, train them to hell, and constantly remind them of what they can and can't do, let them know about the new virus, the new hack, and how THEY can prevent this from happening to them. Change their passwords often, do not let them leave their computers logged in. And if they are bad, I have permission to strike them down, by removing internet surfing, or other "goodies". etc, etc.
It is not much of an "audit" but that's the way I do my workstation security around here.
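A rough Python sketch of that outside-in check from point 1 (the host list and ports are examples, not from the original post):

import socket

HOSTS = ["192.168.1.10", "192.168.1.11"]
PORTS = [21, 80, 443, 8080, 27015]   # ftp, web/proxy ports, a common game port

for host in HOSTS:
    for port in PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.5)
            if s.connect_ex((host, port)) == 0:   # 0 means the port answered
                print(f"WARNING: {host} is listening on port {port}")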
Security comes thru process not via a program (Score:2)
I don't know why you are feeling uncomfortable with your methodology. What you are doing is exactly what needs to be done. There is no one program or set of programs that can be run to assess the security level of any organization. The best that can be achieved is to take a snapshot in time of the currently known security exposures and then check to see whether there are defenses against the exposures. However, this doesn't guarantee that new exposures are covered. The only way one can have an assurance that future exposures will be covered is by examining the process that the organization has and the level to which the process is being followed.
Now, why exactly do you think that this only results in "adequate" security measures? Strike out Win2000 in your post above and replace it with Linux / Solaris / whatever you think is secure. What could you do when auditing installations of those operating systems that you aren't already doing for Win2000?
Re:Security comes thru process not via a program (Score:3, Insightful)
So, if I find shitty security (doesn't matter what OS) I report on it. If I'm satisfied, I report on it.
Problem I encounter is that for Win2K, I haven't found a good audit document (program) yet. So even if there is great Win2K security (which ALWAYS means it's bundled in with other security, and ALWAYS means they have a good security policy), I have a hard time proving it. Similarly, when I find bad Win2k security and am called on to prove it (proof in an audit opinion sense, not the same as trying to explain Active Directory to senior management), I have a hard time.
Re:Security comes thru process not via a program (Score:3, Informative)
Re:Auditing Win2k Security (Score:2)
Re:Auditing Win2k Security (Score:2)
I think for a cleanly wrapped-up tool, you need to look at the high-end market. There are some free tools that MS provides (MBSA, which is a GUI hfnetchk), and also tools in the resource kits, but no single point-and-click tool.
ostiguy
Other discussions (Score:5, Informative)
It was followed by a short lived, but lengthy discussion [worldtechtribune.com] with regular readers of worldtechtribune (including the editor-in-chief apparently) and some other newsforge readers.
You may or may not find some interesting thoughts, or just more (mis)information.
one basic reason why windows security sucks (Score:4, Interesting)
Firewall (Score:1)
What is ISA server?
(Other than a misleading name - Acceleration?)
Where do you draw the line? (Score:5, Insightful)
Where do you draw the line? Microsoft is stuck between a rock and a hard place here. On one hand, if they don't put in a firewall, people will complain that they have to buy additional software or hardware to secure the OS (which is true.) On the other hand, if Microsoft does add a firewall, Norton, Symantec, and 50 other "personal firewall" software makers would scream bloody murder: "Microsoft is leveraging their OS monopoly to put us out of business!"
I'd guess the crappy firewall built into XP is a sort of compromise. On one hand, you don't want millions of unsecured Windows boxes running around on the Internet. So Microsoft surreptitiously adds an incoming-packets-only firewall to XP. Sure, it's a crappy firewall, and it doesn't offer real protection. But it keeps the firewall software makers at bay, and it keeps Microsoft out of the Justice Dept. gray area.
Most sysadmins would buy a hardware firewall or dedicated NAT device with firewall anyway... so at least in corporate settings, that problem is solved. Really, it's going to be tough for Microsoft to add any decent programs to the OS at this point, since they've already been found guilty of illegally bundling Internet Explorer. I'd watch for more stuff to be attached to Office or offered as a free download instead.
Re:Where do you draw the line? (Score:1)
They can even steal BSD's if they wish!
Re:one basic reason why windows security sucks (Score:1)
1. Win2k has packet filtering built into TCP/IP.
2. ISA server is a proxy/firewall. You can always *buy* it.
3. Your suggestion of using XP as a server totally broke your credibility.
4. You can always download a free firewall if you're worried (Tiny Personal Firewall comes to mind).
5. What gives you the right to claim that you would never put a Win2k box straight on the net, but you would a *nix box? According to CERT, there were three times as many security holes in open source OSes this year as in MS. So, personally I'd be more leery of putting a Linux box right on the net than I would an MS box. Nuff said.
Re:one basic reason why windows security sucks (Score:1)
are you serious? the slapper thing required gcc on the server. what the hell is gcc doing on a server, anyways? also, it affected a tiny percentage of the servers out there, compared to every windows server when a windows hole arises.
taken as a whole, *nix, and even just linux, security is far better.
i wouldn't put just any box right on the net. but the difference is this: when you use windows, you roll the dice and hope that microsoft has fixed all the holes, and when a new one arises, they jump to it.
when you use an open source OS, you can be assured that fixes will come faster and better. i think the SSL hole in IE and konqueror point this out. a fix was out in less than a day for konqueror. m$ wouldn't even acknowledge the hole, then took forever getting around to fixing it.
it's not so much the OS, but who ya gonna put your faith in...
Re:one basic reason why windows security sucks (Score:2)
Second, it's not the slapper worm that worries me -- it's those constant buffer overflows in this daemon and that daemon, shoot, even the bulletproof BIND just had a vulnerability. I actually trust IIS; I for one always disabled all the script mappings (I didn't need anything beyond ASP support).
There are ways to properly secure a Windows box, you just need to know what you're doing. (There *are* ways of hotfixing the server w/o browsing to Windows update). And regardless of the OS, you need to keep up on your security updates.
Re:one basic reason why windows security sucks (Score:1, Insightful)
secondly, i'd hardly call bind "bulletproof", given its legendary legacy of remote root exploits. there are other free alternatives, and after this last discovered vulnerability, people are switching in droves.
unfortunately, iis too has a legendary legacy, prompting this whole stupid discussion.
i think the real issue here is when you have a homogeneous, ms-only setup (iis on xp, on x86), you're far more likely to be bitten, sooner, than on a heterogeneous system (some random bsd on some crazy big-endian cpu). no, i'm not invulnerable, just less susceptible to nuisance worms if i happen to miss an advisory.
also take into consideration that all those daemon vulnerabilities you may encounter probably affect a very small portion of the open source population. how many are remote? how many are root? how many are even installed by default? how many would work on big-endian platforms? taken to an extreme, how many would work on a headless netbsd dreamcast running all-chrooted daemons on a read-only filesystem?
i will agree, however, that if you take the proper precautions and vigilantly harden your machine, you probably won't have any problems regardless of the platform. however i also think that open source software makes that job much, much easier.
Re:one basic reason why windows security sucks (Score:3, Interesting)
and then later on...
"People always bitch at MS for bundling software into their OS, but there's no excuse to not include reasonable packet filter ability in the OS."
Well you've certainly proved one thing. People with certifications can often oversell themselves as experts when they really know very little about the products.
Psst... I share the bounty of a simple google search. [ntsecurity.net]
Re:one basic reason why windows security sucks (Score:2)
ostiguy
Re:one basic reason why windows security sucks (Score:2)
Psst... I share the bounty of a simple google search.
The little IPSec hack does indeed allow you to do firewall-type stuff (ooh, you can filter based on IP addresses!), but by no means can it be considered "a reasonable packet filter ability." It is not a stateful firewall. It's not even close to a stateful firewall because you can't filter based on various headers. FWIW, it won't filter based on device (although I've never seen a multi-homed Windows box). It won't filter broadcast or multicast traffic. Also, it won't filter based on the originating program on the local machine (something very popular in Windows host-based firewalls but which I've never used as I work primarily with Unix machines and network devices). I don't have a windows box handy at the moment so I can't verify this, but I don't think it will even log blocked packets, not to mention allow you to specify what to log.
The little IPSec hack basically sucks as a firewall. However, it's better than nothing - you can restrict all the MS networking stuff to originate only from within your organization, which means many fewer boxes are rooted by the irc kiddies (various political reasons why we can't implement such a policy at the organizational level, which is what I'd like to do). Not something to rely upon, but it does slow down the rate of compromises to a manageable rate.
Re:one basic reason why windows security sucks (Score:2)
Again, something the original poster claimed didn't exist.
Re:one basic reason why windows security sucks (Score:2)
(I don't do Windows, but these things are quite well-known, I guess.)
Re:one basic reason why windows security sucks (Score:2)
Re:one basic reason why windows security sucks (Score:3, Informative)
It's MCSE, and I don't think MCSA was a typo.
but there is a microsoft certification called MCSA (like MCSE but harder apparently...).
Re:one basic reason why windows security sucks (Score:2)
http://www.microsoft.com/traincert/mcp/
Shooting at the Wind (Score:4, Insightful)
Given that Microsoft constantly modifies shared portions of its Operating Systems via Service Packs, Windows Update, and while installing new applications...well, precisely how meaningful is any declaration of the security of a given Microsoft OS? Just tracking WHAT you have on a given Windows box is enough to make most sysadmins break out in hives.
If you have any software configuration that strays more than trivially from the one tested for security, then the certification isn't really relevant.
My Bathroom Door . . . (Score:4, Funny)
It protects me against threats of inadvertent or casual attempts to breach the system security, like people walking in while I'm, uhh, ya know.
Of course it does nothing when someone disables the lock or tries to kick the door in.
Re:My Bathroom Door . . . (Score:2)
A.NO_EVIL_ADM is the assumption that no one is trying to break the system.
A.COOP is the assumption that everyone using the system is working in harmony to support the aims of the system.
Huh? (Score:1)
Basically Windows 2k security is "certified" as secure as closing the bathroom door while, well you know, making a deposit. But not "certified" secure if someone, anyone, is *trying* to do something bad, other than make a deposit.
((Windows Security == closing door) + bathroom humor + on topic ==> funny)
Re:My Bathroom Door . . . (Score:4, Funny)
Masturbating?
For this kind of use you may want more security, like that provided by a combination of bedroom_door and blanket. This combination both prevents accidental security breaches (when bedroom_door is secured) and allows you to secure your assets when security is breached, by providing a camouflaging apparatus (blanket or similar) while you securely hide your data.
Re:My Bathroom Door . . . (Score:2)
I've noticed an interesting concept... (Score:4, Insightful)
Microsoft more than anything has pissed me off over their threat ads in certain areas. If you haven't heard them, I'd encourage people to find a way to hear them. They are shocking in their brazen "Stop being a criminal or we'll make you our woman and you'll like it." attitude.
Microsoft has been proven to be the sham it is, even by the government. When the US Government, the most incompetent bureaucracy in existence, says that you suck... man, you have to seriously do some soul-searching... if Gates even has one.
There are real, secure, systems out there. (Score:5, Informative)
Vendors hated NSA's old rating process. The standards were tough, NSA did the evaluations themselves, and you only got two tries to pass. After the first evaluation, NSA told you what was wrong. If you failed on the second try, that was it - you flunked. Worst case, NSA listed your product as "Class D - This class is reserved for those systems that have been evaluated but that fail to meet the requirements for a higher evaluation class."
Later, the process became much more "vendor friendly". Evaluations are performed by outside contractors, and vendors can submit their software over and over and over again until it passes. Microsoft used this process to push NT 4 through. It took years. The evaluation process is controlled by the vendor, and there are no public reports of failure.
The "common criteria" are rather weak, down near the bottom of the old NSA criteria. And the evaluation process is almost totally under vendor control, although it does have to be performed by an outside contractor acceptable to the Government.
There's better stuff out there. Currently, the most secure OS certified is the Wang XTS-300 [ncsc.mil]. This is certified to level B3 of the old Orange Book criteria, which is about four notches above the level Windows 2000 just reached. Various FBI and DoD systems use the Wang XTS-300, which runs on Wang-built Pentium II and III systems. Wang is gone, but the product has been taken over by Getronics [getronicsgov.com], which keeps a low profile.
Read the data sheet for the XTS-300. [getronicsgov.com] It's UNIX-like, but very different inside.
Coming soon, the XTS-400 [getronicsgov.com], which runs Linux apps.
These secure systems enforce a "mandatory security" model. Data has a security level, an integrity level, and a list of compartments to which it belongs. Movement downward in security level or upward in integrity level is prohibited, as is movement out of a security compartment or into an integrity compartment. This is very restrictive, but it's the only approach known to have any chance of really working.
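To see how restrictive those rules are, here is a minimal Python sketch of the label checks described above (the labels are illustrative, and integrity compartments are left out to keep it short):

from dataclasses import dataclass

@dataclass(frozen=True)
class Label:
    security: int              # e.g. 0 = unclassified ... 3 = top secret
    integrity: int             # higher = more trusted
    compartments: frozenset    # security compartments, e.g. {"crypto"}

def may_flow(src: Label, dst: Label) -> bool:
    """May data labelled src move into a container labelled dst?"""
    return (dst.security >= src.security and          # never down in security
            dst.integrity <= src.integrity and        # never up in integrity
            src.compartments <= dst.compartments)     # never out of a compartment

public = Label(0, 1, frozenset())
secret = Label(2, 1, frozenset({"crypto"}))
print(may_flow(public, secret))   # True: moving data upward is allowed
print(may_flow(secret, public))   # False: that would be a downgrade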
Re:There are real, secure, systems out there. (Score:2, Interesting)
(Hey, just went looking; looks as if Primos Revision 21.0.1DODC2A got to the C2 level, so maybe this is more similar than I think)
Re:There are real, secure, systems out there. (Score:3, Insightful)
Re:There are real, secure, systems out there. (Score:3, Informative)
Without hardware enforcement of the abstraction barrier, your user space code could jump into the kernel at spots right after privilege checks, or could manipulate the MMU and get raw access to every device and every memory location. This would make privilege escalation trivial.
As long as you have 2 (properly designed) rings supported by hardware, you can emulate as many rings as you want, but you'll pay a performance hit.
One important note is that all XBox code runs in ring 0 and in a single address space (unless a developer goes WAAYYY out of their way). This is for performance reasons, but if there's an exploitable buffer overflow in a game, it's more than a "root" exploit, it's a kernel exploit. (Yes, both Linux and WinXP allow superusers to modify the running kernel, so the distinction is moot in these cases.) This would allow for a software "mod chip".
Re:There are real, secure, systems out there. (Score:2, Insightful)
B1 systems have mandatory access control, and are a lot like the new LSPP profile in CC. B2 introduces covert channel control, which IMHO is overkill, mostly. (Not to mention practically unsolvable.)
Higher would be nice, of course, but I'd settle for an LSPP system with really good assurance!
Get the Govt. to Upgrade to Win2k (Score:5, Insightful)
The CC certification does not prove that Win2K is free from security-related bugs, nor does it realistically prove that Win2k is secure. All it does is prove that Win2k, in certain configurations, adheres to the requirements of an EAL4-rated protection profile.
EROS / EAL4 (Score:1)
On one hand it seems interesting that one can potentially have something that "can be built to do exactly what it should and no more", but with that comes the problem (headache perhaps?) of the reauthorization of every new executable/binary/process etc. that was not initially thought up during the install process. Now with persistent processes, what if one "allows" a program that is initially thought of as secure, and then it is discovered that it has a horrible bug that compromises the system? Does it stop the unwanted processes, or does it allow them because the permission is already set, with the idea in mind that if you think something is secure, it is?
Although a good idea, it can also stop one from doing some interesting things, for instance, using your web browser to look at pictures. You could easily use a picture editing program like Gimp to view them, or an image previewer, both of which are made to look at pictures, or your web browser, which is made to look at online information in general but is not necessarily used to preview pictures.
Now with EAL4, that is equal to Symantec Enterprise Firewall [symantec.com] (Which of course means crap if you know the flaws that are within the coding structure)
But it means EAL4 requires a more thorough design description, a subset of the implementation, and improved mechanisms and/or procedures that provide confidence that the TOE will not be tampered with during development or delivery.
That leaves the impression that as long as only the developers and the beta testers have it, it could be rated EAL to the highest power... even after all the flaws are discovered.
Moot point..
Of course I am probably not seeing the whole picture, and am totally wrong...
Forgive my ignorance... (Score:4, Interesting)
The above is an honest question, if you can't elaborate clearly, please don't even bother to reply.
Thank you.
Why you should care (Score:5, Informative)
In essence, like the author stated, many people are substituting Common Criteria certification for education about security issues. However, if the customer doesn't know what they want, or if they don't understand what Common Criteria does and DOES NOT check, then the customer still has no idea what they are getting. And like the author, I sometimes wonder if Common Criteria certification shortcuts the basic security background required to write an RFP and replaces it with a check box for an EAL.
In particular, if you work on or sell a security product and want to sell to government or the European Union, it must be Common Criteria certified. What the certification proves, however, is up to the interpretation of the person implementing the product.
Re:Forgive my ignorance... (Score:1)
In this case, the requirements were themselves inadequate for a net connected system, and MS did a half assed job of meeting them.
The news is not really about MS, but about a method of evaluating complex software products.
Re:Forgive my ignorance... (Score:2)
Common Criteria is a set of security standards sponsored by such kind organisations as the NSA and its cousins from UK, Germany, France, Netherlands and maybe more.
So pack the bags, kiss the cat goodbye and run...
Re:Forgive my ignorance... (Score:4, Informative)
The Common Criteria replace the old DoD "Orange Book" specifications.
The CC is a certification standard set up by the NSA, NIST, and some European counterparts. It has an ISO number, too. It can be applied to any computer system (an OS, a browser, a PCI card) as long as you can clearly define the system boundary. The criteria keep talking about the target of evaluation (TOE) instead of calling it an OS, although most commonly you hear about CC being applied to OSes.
When you submit something for CC evaluation, you submit a very specific system with very specific configurations. Anything outside this narrow set of configurations isn't certified. The CC primarily look at design and documentation, so things like buffer overflows don't enter into the equation. At the highest level (EAL 7), you need all kinds of (mathematical) demonstrations and proofs of sound design (probably mostly involving graph theory). At the lower levels, they require less rigorous proofs and demonstrations. Basically there are a bunch of feature lists in the criteria and you need to convince the certifier that you have the required features. Good admin/user documentation and configuration tools are a big part of the CC. If it's secure, but it's not well documented how to keep it secure, you can forget it.
It's expensive to submit a system for certification, so even if the SELinux documentation and config tools were up to par, it'd be unlikely anyone would pony up the cash to get it evaluated. In terms of software features, I think SELinux could conceivably be EAL 4 or quite possibly higher.
More risky? (Score:2)
He has obviously never bought anything from Fernwilter and Associates [piranhaclub.com].
Legitimate negative comments are not "bashing". (Score:3, Insightful)
From the Slashdot story: "Microsoft bashing aside..."
This kind of talk is nonsense! When someone says "Microsoft bashing", they are in effect apologizing for saying something negative about Microsoft. Apologizing is ridiculous. There are many negative things that can be honestly said about Microsoft. Apologizing by using the word "bashing" in the same paragraph as a legitimate complaint weakens the complaint, especially with people who are not technically knowledgeable.
In his November 15, 2002 Crypto-Gram newsletter [counterpane.com], Bruce Schneier says "A well-written analysis of the major security/privacy/stability concerns of Windows XP" about this article: Windows XP Shows the Direction Microsoft is Going. [hevanet.com]
(Bruce Schneier wrote major books about computer security: Applied Cryptography [amazon.com] and Secrets and Lies: Digital Security in a Networked World [amazon.com].)
The article contains only a small number of the legitimate complaints about Microsoft. I know because I wrote the article in my spare time, and there are many, many issues I have not had time to document.
Who kept Kevin Mitnick [kevinmitnick.com] in prison? Who allows Microsoft to be abusive? It's us. It is technically knowledgeable people who allow these abuses. We could be effective in our complaints. Instead, we accept a double standard in which illogical people are allowed to be illogical, but we must be completely logical or we would lose our jobs.
If you are sure of a problem, be effective in talking about it! Get your thoughts in order. Make your communication clear. Get the job done! Write an advisory letter to a government leader. Mention your ideas everywhere a lot of people are listening.
If you prevent Microsoft from being abusive, you are being charitable toward Microsoft. The company has a self-destructive side; preventing Microsoft from being abusive helps you and me personally, helps the world, and helps Microsoft. Remember, Microsoft's abusiveness causes all technically knowledgeable people to look bad to those who are not technically knowledgeable. Those with no technical knowledge are not qualified to sort out the details. We all suffer.
If you know better than the people around you, that makes you the leader! Don't accept foolishness. Don't accept implied criticism; make the speaker state his or her opinions openly. Don't accept the terms "nerd" or "geek". Those terms are used by illogical people to weaken the power of the people who are knowledgeable.
Common Criteria (Score:5, Interesting)
So I looked into it. At the time, it was called "Itsec", now it's "Common Criteria". It was run, in those days, by the electro-spooks, based in Cheltenham.
When I found what it was, I was absolutely ROFL.
I, the vendor, was expected to state the functionality of the product, what it was supposed to do, security-wise. They call this the TOE, "Target of Evaluation"
They, the evaluators, would check that it met that functionality, and give me a certificate if it did.
So far, so good. But what's the right functionality? In my case, what functionality should an antivirus have (rhetorical question, please don't tell me, except it isn't as simple as you might think).
So, I said to the people who ran the scheme, Suppose I define my functionality as "Comes in a blue box". Could I get an Itsec certification for that? The answer boiled down to "Yes, but that isn't a security issue". "Yes it is," I said.
Um. Who defines what is a security issue and what isn't? I was saying that the lack of a blue box, was a security issue. How do you say it isn't? Anyway, that's my TOE, please certify it. Well, it never got that far, that was just my way of telling them that their scheme was a joke.
So I went to a pal of mine who ran the security department at a university, suggested that he set up a certification scheme, and got the product certified under that instead. That made our marketing people happy, also our sales people. Customers had a certification to pin on the wall, everything was tickety-boo.
Except the government people, who knew they were being made monkeys out of, because I threw that "Comes in a blue box" thing at them at every conference and seminar I went to, and I heard that it started to seriously embarrass them, because people started asking questions about the value of their certifications. There's more in that thread - things did start to change, but the change didn't happen in the end.
Now, I'm not suggesting that the Microsoft certification says "Comes in a blue box." But until you've read the TOE, you don't actually know what security functions have been certified.
The only truly secure computer .... (Score:2)
As soon as you turn it on and plug it in to a network, or let someone log in and use it, all kinds of evil things can happen.
So, with the above being the most secure system, we have to make compromises. Take passwords/phrases for instance. We could specify a pass phrase of at least 60 characters with mixed case, numbers, and special characters. That might take a cracking program a little longer to break. But the odds that the casual user will remember it and not write it down someplace increases as the difficulty of the password increases.
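As a back-of-the-envelope check on that 60-character suggestion, a quick Python sketch (the 90-symbol alphabet is an assumption):

import math

alphabet = 26 + 26 + 10 + 28    # lower, upper, digits, ~28 specials
length = 60
bits = length * math.log2(alphabet)
print(f"~{bits:.0f} bits for a truly random {length}-character phrase")
# A phrase a user can actually remember has far less entropy than a
# random string, which is exactly the trade-off described above.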
Or, we could install smart card devices and require their usage, along with pass phrases and biometrics. But that increases costs and complexity. Not only do I need smart card readers and software at my desk, but also every system that I will use to VPN in with.
Or, we could remove all floppy disks and CD drives from our user's machines, and prevent them from downloading from the internet, but then we have to listen to them gripe all the time.
Or, we could remove Windows 2000 and use some as-yet-to-be-named totally secure, non-breakable software that provides 90% of the same functionality. But then the users would lose access to Outlook and Word and whine again because they don't want to learn something new.
Instead, we do the best with what we have, and move on. Fix the security leaks as they come up, and hope we get to them before the crackers do. Yes, I would love for MS to do a better security job, and I would also love to install Linux on the desktop. But since neither is going to happen anytime soon, we deal with it. (Although XP has finally made our CIO sit up and consider replacing MS.)
Re:The only truly secure computer .... (Score:2)
With the old Orange Book series at least (I haven't looked at the CC equivalents), as you increased in 'security' level, the emphasis shifted away from keeping people out toward being able to tell what they did.
The fact of the matter is that if your service can be used for legitimate purposes, it can be used for illegitimate purposes. Period.
Importance of Certification (Score:2, Insightful)
Actual MS Certification Test (Score:2)
Tester #1: OK, attempting to buffer overflow attack on the NetBEUI Protocol.
Tester #2: No response, good.
Tester #1: Attempting buffer overflow attack on the Messenger Service.
Tester #2: No response, good.
Tester #1: Attempting to ping the box.
Tester #2: No response, this thing is a rock.
Tester #1: Well I think it passes with flying colors then!
Tester #2: Yep, lets go to lunch.
MS Representative (wanders in after Techs have left): Hey, where did those guys go? I'd better turn this box on before they begin...
Re:Repost (Score:2)