Questioning Security Certifications
prostoalex writes "BusinessWeek questions the validity of security certifications in the modern world. They take a look at Federal Information Processing Standard and the certification process. Apparently 'the testing companies make money by certifying products, not catching problems' thus implying that the seal of approval might not mean a whole lot."
Re:So does this mean Oracle's unbreakable (Score:1)
I'm elated whenever I discover that a company I deal with (securely) is using OSS'd software - that way I can have a look (in my limited capacity) at the products they're using, and get that deep-down warm and fuzzy feeling.
Re:So does this mean Oracle's unbreakable (Score:1)
I've always taken online / electronic means of communication with a grain of salt. I prefer the 'less is more' approach to tendering information in any electronic environment. Stick to giving out personal details to the big companies who have more to lose, and are more likely to be able to remunerate me for my inconvenience if my information is leaked/stolen.
My IMAP/POP3 secure servers run an SSL certificate that I generated. So it's not trusted - who cares? I'm the issuer, and if somebody's using my (personal) servers for mail, they're putting their trust in me. As for a commercial application, that gets a bit hairy. If I don't have a certificate signed by a 'trusted authority', I'll be seen as unprofessional and possibly untrustworthy. But does a genuine, bona fide Verisign SSL cert guarantee my customers any more security than my up-to-date server(s) can provide? I don't see how. Almost does seem like a waste of $700+, doesn't it?
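(For what it's worth, being your own issuer is only a few lines of code. Below is a minimal sketch using Python's pyca/cryptography package; the hostname, filenames, and one-year validity are just placeholders, not anything from the post above.)

    from datetime import datetime, timedelta
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    # Generate a key pair and self-sign a certificate for the mail server.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "mail.example.com")])
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                      # issuer == subject: self-signed
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.utcnow())
        .not_valid_after(datetime.utcnow() + timedelta(days=365))
        .sign(key, hashes.SHA256())
    )

    # Write out PEM files for the IMAP/POP3 daemon to load.
    with open("key.pem", "wb") as f:
        f.write(key.private_bytes(serialization.Encoding.PEM,
                                  serialization.PrivateFormat.TraditionalOpenSSL,
                                  serialization.NoEncryption()))
    with open("cert.pem", "wb") as f:
        f.write(cert.public_bytes(serialization.Encoding.PEM))

Clients will of course throw the usual "untrusted issuer" warning, which is exactly the trade-off described above.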
Not Uncommon (Score:5, Insightful)
At this point, if you're not always questioning whether a service provider is taking you for a ride, then you're being taken for a ride.
Re:Not Uncommon (Score:2)
I'm not the 'conspiracy theorist' type, but sometimes you have to wonder a bit.
Re:Not Uncommon (Score:2)
"Bruce Schneier, a noted cryptographic expert and chief technology officer at Counterpane Internet Security in Cupertino, Calif., never considers certifications before buying a product. 'Primarily, certification is a marketing tool,' he says." (emphasis added).
Besides, now hackers will have another monoculture to attack!
Re:Not Uncommon (Score:2)
Re:Not Uncommon (Score:1)
On the other hand, ISO certification (e.g. ISO 9001) is a great example of an overstated certificate that certifies, in many cases, no more than that a company wanted some ISO marketing fodder. i.e., do something consistently, even consistently badly, and you're ISO-worthy. ISO certification, rather than being a process of improving organizational skills and management, becomes more of a zero-sum game: for every person who gains one, the marketing value of it decreases.
Re:Not Uncommon (Score:2, Redundant)
Re:Not Uncommon (Score:2)
(Whatever)
Haste and Waste (Score:2, Insightful)
NIMDA? (Score:3, Interesting)
Then why, about once an hour, does my Apache web server log clients trying to access DLLs? I am sure the IIS admins may not agree with that statement.
-Pete
Re:NIMDA? (Score:3, Insightful)
Re:NIMDA? (Score:1)
Certs make suits happy. (Score:1)
And a happy suit is a wonderful thing at a performance review.
Peer review (Score:5, Insightful)
Automated software cuts costs. That's why they use it. Human security testers are expensive, even though IMHO it might be a good way for the most talented script kiddies to make a buck during summer...
--The testing companies make money by certifying products, not catching problems.
Of course they do; they're _certification_ companies, not tech support for security problems. Their job is not to catch problems in your software for you. It is to tell whether a product is "secure" or not, according to tests. Which brings us to the point:
1) You can't predict the future. Tests run today can't reproduce new problems that will be discovered next year. So this "security certification" is short-lived at best.
2) There is a bias, both in the test suite used and in their conception of "security". They're human beings too, and to them "good enough" can mean a whole lot less (or more) than it does to you.
So what is the problem? The problem is that an app that passes their tests is instantly classified as "secure". So we have to:
- Expand the concept of "security" to acknowledge its subjectivity ("secure according to company X", not just "secure, period").
- Use peer review, which has proven good at detecting security flaws, and is quite inexpensive for free software projects.
Script kiddies won't do at all. (Score:3, Informative)
That's why analyzing crypto software is so friggin' expensive -- to do it right takes someone who knows a great deal about not only programming and info security but mathematics as well, and who has actual experience in the field. There are only so many people who can do it right (and I'm most certainly not among them); trying to get the job done properly using the average software engineer with 5 or 10 years of general (non-security-specific) experience won't work, much less a script kiddie of any variety.
Oxymoron (Score:4, Funny)
Whoa, there's a phrase you don't see too often.
Wouldn't they be talented hackers/crackers, if they actually knew their stuff?
Certifications of any Type (Score:1)
Reminds me of the days of the "Best of the Web" awards...
CodeTrap
Re:Certifications of any Type (Score:3, Interesting)
I work for the Canadian Common Criteria Scheme and it's my job to ensure that the Canadian labs follow the CC correctly and consistently in their evaluations. I found the article invaluable and disturbing (especially the Bruce Schneier quote), since we're obviously looking for ways to promote the CC, and the article highlights the concerns we need to address.
fips (Score:3, Insightful)
Re:fips (and Common Criteria) (Score:1)
A subset of the implementation representation comes in at EAL4, and the evaluators get to pick what code they look at (based on what they find in the high- and low-level design docs, the user and administrator guidance, etc.).
Not assurance, but better than nothing. (Score:2)
Really, the biggest problem with FIPS is the boundaries it is drawn at. Typically only some (crypto) modules are certified; there may be gaping holes in other modules. So your crypto might be bulletproof, but someone may still be able to hack the box and sniff off any data they want.
Charge em! (Score:3, Insightful)
This removes the conflict of interest, and in fact reverses it: the certifying authority *wants* to find problems so they can bill more hours, and the developers bust their butts to keep the cost of the certification down.
Jeff
Re:Charge em! (Score:1)
Companies are in business to maximize profits. How unethical they can be in that pursuit has been the subject of quite a few recent news stories. The profit made by a certification company is determined by a balance of cost and demand. Demand is directly proportional to how easy it is to pass the certification. Perceived value of having the certification is inversely proportional to how easy it is to pass the certification. Chart these factors (and others) and you should find that there is a target failure rate that will maximize profits. A company seeking to maximize profits will design their certification process to achieve this target. This almost always results in behavior that is not optimal for the common good.
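(A toy sketch of that chart, with completely made-up numbers; the only point is that the profit-maximizing pass rate lands somewhere in the middle, not at "catch everything.")

    # Toy model, all numbers invented: demand grows with the pass rate,
    # while the price the cert can command shrinks as it gets easier to pass.
    def yearly_profit(pass_rate, base_demand=1000, base_price=50_000, cost_per_eval=20_000):
        demand = base_demand * pass_rate                # easier cert -> more applicants
        price = base_price * (1.0 - 0.5 * pass_rate)    # easier cert -> less perceived value
        return demand * (price - cost_per_eval)

    rates = [r / 100 for r in range(1, 100)]
    best = max(rates, key=yearly_profit)
    print(f"profit-maximizing pass rate in this toy model: {best:.2f}")   # ~0.60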
Is this ethical? Depends on where you think the primary responsibilities of a company lie...shareholder/owner profit or societal good. But that's another slashdot post.
In other related news.... (Score:2)
Re:In other related news.... (Score:2)
-Lucas
Uses and Limitations of Skill Certifications (Score:1)
It is very possible, however, to spend years as a systems engineer and never think about designing a network or about how DHCP works. It is possible to work with VB for years and never use it to build an ActiveX component.
Narrow experience without a broad overview of the possibilities can lead to bad decisions. Certification is one way (there are many) to get that overview, but certification by itself does not equal expertise.
And is anybody really surprised by this? (Score:2, Funny)
Yeah Sure... (Score:3, Funny)
It can be a joke (Score:3, Insightful)
I can't count the number of times that a consultant has quizzed me about firewalls when they're pushing a certain product (usually because they get a kickback): "This is better because it's certified!"
The problem is that (on off-the-shelf products) the certification only applies to the default configuration - and if you change it (which is pretty much every time - each site has different needs) the device needs to be re-certified... The consultants never mention that part to the client.
The best way to know if a site is secure is to have an independent security audit done by someone qualified (and I don't mean a 'general auditor' - a company that specializes in perimeter security).
Hmm... (Score:1)
According to the Orange Book.. (Score:4, Interesting)
Now, before we all laugh and say "doesn't it show that the certifications are stupid?" consider this.. maybe the certification system does work, and all those other certified products are equally flaky. I've got a list of some TCSEC-certified systems here [dynamoo.com] and frankly it's a pretty unappealing set of OSes. If there were as many Unicos [cray.com] systems (rated B1) out there as there were Windows, I betcha they'd find holes in it soon enough. The fundamental problem with any popular OS is that there will be thousands of hackers and wannabees probing away at it. I don't think there are many people reverse engineering CA-ACF2 MVS [ca.com] in their bedrooms.
I think the motto should be: "Security Through Obscurity" - perhaps all those horrid proprietary OSes did have a point after all.
Re:According to the Orange Book.. (Score:1)
Read the Fine Print (Score:1)
Re:According to the Orange Book.. (Score:2)
Not everyone has the same machine, the same CPU, memory, network, etc.
Is it truly possible to have a secure system with so many variables (and lest we forget the keyboard-sniffer dongles and other tools that just record what's being typed)?
I don't think so.
Go back to the lesson from the book Cryptonomicon: you can only keep a secret for so long before someone else figures it out.
Re:According to the Orange Book.. (Score:1)
I haven't reviewed (and don't intend to review) the MS NT evaluation document, but I would bet IIS, Exchange, Outlook, IE, etc. are NOT part of the trusted computing environment.
In fact my recollection is that very few actual security exploits have come up in the last few years dealing with native NT code.
The point being, maybe the base OS is pretty secure - which is all the certification says.
Re:According to the Orange Book.. (Score:1)
Re:According to the Orange Book.. (Score:1)
There is no security panacea (Score:4, Insightful)
The same applies to those practices. In and of themselves, they do not guarantee that no incident will take place. But they'll hopefully minimize the impact and frequency of those incidents. The fact that the NSA or some other entity may be able to get past your security doesn't invalidate that security entirely; depending on the environment, it may be good enough.
Information security is really all about risk management. At the end of the day, are we managing our security to the point where the risk is less than the value of the information itself? Balance business need (or whatever needs you have, if you're not a business) against the cost of extra measures. When additional measures are too expensive for the value of what you're protecting, you're secure -- at least secure enough, anyway. If everyone followed security best practices, we'd have a lot fewer problems than we do.
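(The usual way to put rough numbers on that trade-off is annualized loss expectancy; a small sketch below, every figure invented purely for illustration.)

    # Rough risk-management arithmetic (illustrative numbers only):
    # annualized loss expectancy (ALE) = single loss expectancy * annual rate of occurrence.
    asset_value = 200_000            # what a breach of this system would roughly cost
    exposure_factor = 0.4            # fraction of that value lost per incident
    incidents_per_year = 0.5         # expected breach frequency without the new control

    sle = asset_value * exposure_factor          # single loss expectancy
    ale_before = sle * incidents_per_year        # expected yearly loss with no extra measures

    control_cost = 30_000                        # yearly cost of the proposed countermeasure
    incidents_after = 0.1                        # estimated frequency with the control in place
    ale_after = sle * incidents_after

    print("ALE before:", ale_before)                               # 40,000
    print("ALE after + control cost:", ale_after + control_cost)   # 8,000 + 30,000 = 38,000
    # The control is (barely) worth it here; if it cost more than the risk it removes, skip it.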
Re:There is no security panacea (Score:2)
I suppose that depends on whether you consider CYA to be a form of risk management. Unfortunately, many managers look at security as an expensive pain in the ass that does nothing but cause them problems. There is no glory in having a tight, secure system, but plenty of shit hits the fan if something goes wrong and a system is breached. For this reason, I think many managers are happy to take a security document from a consulting company, meet its recommendations, and then hide behind it.
I think Cowboyneal (Score:2, Funny)
Doubts (Score:1)
OK, I want to know who's going to test our company's product and what skillset they have. I want to evaluate the evaluators - how do we know they're really good? What process do they follow, or do they just load up something.c into emacs and prance around?
"Finally, the FIPS engineers put the product through its paces in a testing lab to make sure all the cryptographic elements perform as promised. "
What tests and processes do they use? Are they always the same? How do they learn from their mistakes? Is the process upgraded and reviewed regularly?
Re:Doubts (Score:2)
Not used to working with the government, I see.
Actually, I think the FIPS 140 process is a very good example of those concepts done right. Review the FIPS site [nist.gov].
Your question about tests is answered thoroughly there; perhaps you will want to start with the derived test requirements [nist.gov] section.
Who certifies the testing companies? (Score:2)
They really should have looked into the certification process for the testing companies. I doubt that Uncle Sam lets them in without some sort of compliance standards. I guess these standards will just have to include mandatory testing by engineers instead of software.
If not, then the only thing that software developers will have to spend time guarding against are the specific areas of vulnerability that the testing houses look for.
That kind of defeats the purpose, now, doesn't it?
-S
Re:Who certifies the testing companies? (Score:3, Informative)
From this article I get the impression that any Tom, Dick, or Harry can go out, 'perform testing' and give away FIPS certs for money.
This is not the case. FIPS 140-1/140-2 test labs must be approved by NIST through a formal accreditation program [nist.gov].
"unbreakable" oracle has 15 certs (Score:3, Interesting)
Oracle now holds 15 security evaluations. DB2 has none. SQL Server has only one.
If it was easy to "buy" these certifications, I'm sure that Microsoft SQL Server would have more than just one by now. (Granted, Oracle also has a bit of cash to throw around.)
What it amounts to... (Score:1)
It's kind of slashdot... (Score:2)
certifications fail (Score:1)
FIPS is only the start of the conflicts of interest (Score:1)
Namely, the Common Criteria simply specify that you must tell the certifying body what a system will do, and then it must do those things. It's kinda like the mess that is ISO 9000 that way. Worse yet, labs are paid not by the potential purchaser, but by the entity wishing to have something certified (same as FIPS). Certified products are then "acceptable" by all countries participating in the Common Criteria evaluation scheme, meaning that poor products that are certified have a higher likelihood of doing more damage than FIPS-certified products will.
More of your tax dollars hard at work...
Re:FIPS is only the start of the conflicts of inte (Score:1)
Where have we heard this before? (Score:2)
Sounds familiar. Oh yeah -- the US Patent Office makes its money off of "user fees", and it collects a bunch more when it issues a patent than when it denies one.
Hmm...
(The real answer may be in just charging for the application (either for certification or for patent, depending on whom we're talking about), and the cert is issued (or not) without extra fee. I say "may" be the answer because that approach might encourage agencies to trivially deny applications to boost re-application fee income.)
The value of a cert is its absence (Score:2, Insightful)
No certification can say a product is secure. A certification can only mean the product was tested and found compliant with standards. Security isn't an all or nothing characteristic. All other things being equal, a certified product is less likely to fail than one that was unable to pass the tests.
Hey, this is just like SSL Certificates! (Score:2)
FIPS 140-1 and open source crypto (Score:1)
As others have noted, FIPS 140-x validation is not a panacea; however, it does add some additional (and IMO useful) product review beyond what you'd get with standard internal QA plus public review (for open source crypto products). I think it would be great if some vendor or vendors stepped up and sponsored FIPS 140-x validation for OpenSSL and other popular open source crypto implementations.
FIPS 140 & security is apples & oranges (Score:1)
How it should work (Score:2)
Liability? (Score:1)
Does anyone know how these get started?? (Score:2)
The cheapest I.T. industry cert. I've seen is still the A+, and that is a 2 part exam that sets you back around $180 by the time you take both pieces. Either the testing centers make an absolute killing on these things, or else a certain percentage of the profit goes back to the test's creator.
I'm thinking if you even get a 4% or 5% cut of the profits on each attempted exam, you stand to make more money than you would by actually working in the industry in a job you got, partially by becoming "certified".
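(Back-of-the-envelope version of that claim; every number here is assumed, including the exam volume.)

    exam_fee = 180                # two A+ exam parts, roughly
    royalty = 0.05                # hypothetical 5% cut to the test's creator
    attempts_per_year = 250_000   # invented yearly exam volume

    print(f"${royalty * exam_fee * attempts_per_year:,.0f} per year")   # $2,250,000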
Duh. (Score:2)
Now, the emphasis is on COTS products, which have almost all been designed for the commercial (read: "less security-conscious") market. Concern has not been for security as much as for marketability, ease of use, and appeal to the public. The designers often do not keep security a high priority, if they consider it at all, until the later stages of development. And the people who really understand the nature of the threat in high-security environments had no input, insight, or awareness whatsoever of the internal workings of the products. The products are so varied and different from each other that a checklist of hard facts to verify (which is what certifications are) is no longer sufficient to catch all the possible risks.
That said, the other problem is that no other method has yet been developed. It's easy to say, "Just do a vulnerability assessment." How? How do you do that on a constant basis on something the size of a government network? How do you make sure that nothing slips through the cracks? And above all else...how do you keep most of the contractors from getting in the way of the few (or single) contractor who gets the job of checking everyone else's work? At least certifications are neutral, in that they have no capability to be used by any single contractor against all the others.
But that's just my frustration.
Software changes too fast (Score:3, Interesting)
Nevertheless the certification house did do a thorough check on us and did recommend a number of changes to our software. We didn't think any of them truly added security, but at least this way it was obvious that the cert company was doing their job.
The big problem is that we got that version of the software certified, taking about eight months and several employees' time. Now a few months later we come out with a new release! We can't get re-certified every time, even though they have a shortcut for recertifications. Keeping up with the short software release cycle would be way too expensive.
So we still have FIPS 140 certification listed as a feature of our product, but if a customer really wants that specific version, we have to sell him old software. As it turns out, no one does. All they really need is to be able to check the box that says we are certified, and then they're perfectly happy to take the latest software. The mere fact that we spent the time, effort and money to be certified is what really counts.
Wow, FIPS certified products actually being used? (Score:2)
I find it interesting that people are starting to specify AND demand FIPS 140. When we certified our first product, using the predecessor of FIPS 140, we only ever had one customer - the US treasury. Perhaps the chicken-and-egg situation is changing, because just enough companies have FIPS 140 certified products that customers might actually be able to buy one.
In reality, most customers will not use the actual certified product, for a couple of reasons. First, it is too expensive (and takes too long) to recertify the product for every minor version change. Second, the FIPS process only allows certain algorithms (FIPS algorithms, naturally) and certain cryptographic formats. If your product wants to support the widespread PKCS format (RSA pseudo-standard) instead of the government-preferred ANSI formats (in cases where there is a difference), those PKCS commands will have to be disabled in the FIPS version of the product.
Patents Office... (Score:2, Interesting)
They should pay them on the number of patents thrown out.
FIPS 140 - what's good and bad? (Score:2)
A lot of the FIPS philosophy came out of the military, and the testing labs impressed me with the breadth of their physical attacks. On the other hand, the military usually has very simple logical security requirements for a crypto-box: it should be inert until authorized users properly activate it, and at that point it can perform sensitive actions. Commercial cryptography designs, by contrast, usually have a set of functions that needs to be generally available. They also have a much smaller set of functions that need authorized users to control them.
When we put our product through the immediate predecessor of FIPS 140-1 certification, we were the first commercial product and ended up breaking a fair amount of new ground (somewhat painfully, as you might imagine). What we had to show was that the cryptographic commands available to non-privileged users were safe - because of the logical security design. Even early FIPS 140-1 processes did not really deal with these "always-on" functions very well.
Although it improved, especially with the 140-2 modifications, logical security is still the real weak point. Michael Bond's [cam.ac.uk] well-publicized attacks on the FIPS 140-1 level 4 certified IBM 4758 security module were all aimed at the "logical security" level. My favorite example of insecure-by-design is the PKCS #11 security module when it is used for server security.
The Cryptoki (PKCS #11) interface was designed for security tokens, and basically works a lot like the military devices. The token (smartcard, whatever) would be plugged into the client device, where it would remain inert until activated by the user password. Actually a pretty good design when used this way.
The problem is when the same design is used for a server, which is unfortunately common since several PKI vendors standardized on using PKCS #11 security modules. PKCS #11 authorization levels are all messed up for server use. There is no concept of "always-on" commands, or multiple levels of authorization. That means that any entity (server application) that wants to access the security subsystem must be an authorized user.
The result is that the clear password that enables the PKCS #11 module has to be put into the server application. Because of that clear password, an attacker no longer has to break into the PKCS #11 box or steal/forge authorized users' identities. They can gain authorized-user privileges merely by monitoring the communication lines between the application and the box, or by analyzing the object code of the application!
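(To make the clear-password point concrete, here is roughly what a server-side PKCS #11 login looks like, sketched with the PyKCS11 Python wrapper; the module path and PIN are placeholders, not from any particular vendor.)

    import PyKCS11

    pkcs11 = PyKCS11.PyKCS11Lib()
    pkcs11.load("/usr/lib/libvendor_pkcs11.so")   # placeholder path to the vendor's module

    slot = pkcs11.getSlotList(tokenPresent=True)[0]
    session = pkcs11.openSession(slot, PyKCS11.CKF_SERIAL_SESSION | PyKCS11.CKF_RW_SESSION)

    # The problem described above: a long-running server has to supply the user PIN itself,
    # so the PIN ends up in the config, the binary, or the source - and anything that can
    # read or replay it gets full authorized-user access to the module.
    session.login("1234")

    # From here on, every sign/decrypt call runs with those authorized-user privileges.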
You will find a number of FIPS 140 certified PKCS #11 [nist.gov] modules, which is actually no surprise given how well PKCS #11 matches the military origins of FIPS 140. This is a classic example of a certified subsystem that is quite secure for some uses (human insertion of a token and entry of a password), but quite insecure for others (server applications storing and using clear passwords). All the FIPS certification does in the case of PKCS #11 is tell you that the vendor has followed their design, not whether it will provide logical security in your system!
Know what certs are certifying. (Score:2)
Also, unless you have achieved a particular certification, I don't think that you're qualified to comment as to its real meaning. How do you have any clue as to what it means unless you know what you've learned from it, and how you've grown in order to achieve it? Even then, people tend to lie to themselves.
Certifications are not meaningless. But they certainly don't mean competence. Competence is NOT knowledge. For example... I recently worked with a man who was CISSP certified and had no clue. Someone posted a usenet article asking what a firewall deny, outgoing, to port 4000 was probably caused by (complete with destination IP, etc.)... he thought that it was a trojan (this was a "here's a question, go think about it for a while" type question, not a "GIVE ME AN ANSWER NOW!" question). Now, if you don't recognise port 4000, you could always reverse-DNS the IP and find that the target host was icq.mirabilis.com. This is just one example - so don't just say "he was lazy and didn't invest the time in figuring it out." The man also claimed to have written White Rabbit, played with Jefferson Airplane (on stage), invented robots that could climb stairs and learn rooms by name, and that he flies to Greece in the summer in a bomber (he says that it's his uncle's and that it costs ~2k in gas). Obviously, he was fired, but he had a CISSP (a fairly big security competence certification).
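(The lookup in question is a one-liner; a Python sketch below, with a placeholder address - it only resolves if the target has a PTR record, which, per the story above, the ICQ host did.)

    import socket

    # Reverse-DNS the destination address from the firewall log entry (placeholder IP).
    hostname, aliases, addresses = socket.gethostbyaddr("192.0.2.10")
    print(hostname)   # per the story above, this would have come back under mirabilis.com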
Now, if this post sounds nonsensical, please understand that I am currently a bit drunk and may not be making a lot of sense.
Last Post! (Score:1)
A novice of the temple once approached the Chief Priest with a question.
"Master, does Emacs have the Buddha nature?" the novice asked.
The Chief Priest had been in the temple for many years and could be relied upon to know these things. He thought for several minutes before replying.
"I don't see why not. It's got bloody well everything else."
With that, the Chief Priest went to lunch. The novice suddenly achieved enlightenment, several years later.
Commentary:
His Master is kind,
Answering his FAQ quickly,
With thought and sarcasm.
- this post brought to you by the Automated Last Post Generator...