Security

Questioning Security Certifications 103

prostoalex writes "BusinessWeek questions the validity of security certifications in the modern world. They take a look at the Federal Information Processing Standard and the certification process. Apparently 'the testing companies make money by certifying products, not catching problems,' thus implying that the seal of approval might not mean a whole lot."
  • Not Uncommon (Score:5, Insightful)

    by theduck ( 101668 ) <{moc.yugswen} {ta} {kcudeht}> on Thursday October 03, 2002 @09:07AM (#4380254)
    There are plenty of industries where following the money would make you think twice about the motivations of the seller of services. How about financial planners/brokers who make more cash by churning your investments? How about the auto mechanic who makes more cash by replacing your radiator when all it needed was an external cleaning (true personal experience here)? How about [fill in your own example here...everyone has one]?

    At this point, if you're not always questioning whether a service provider is taking you for a ride, then you're being taken for a ride.
    • On that note, and thinking about security again, has anyone else considered that if anti-virus software companies actually succeeded in stamping out the proliferation of viruses they might well put themselves out of business?

      I'm not the 'conspiracy theorist' type, but sometimes you have to wonder a bit.

    • Exactly. Here's the relevant quote from the article:

      Bruce Schneier, a noted cryptographic expert and chief technology officer at Counterpane Internet Security in Cupertino, Calif., never considers certifications before buying a product. "Primarily, certification is a marketing tool," he says.

      Besides, hackers will now have another monoculture to attack!

      • It is interesting that Schneier says that. One of the regular features in his newsletters exposes charlatan security companies that claim to provide security without revealing their methods. At least with a FIPS validation, you can be sure that the program attempts to do what it claims to do. Hopefully labs won't pass a module that is completely bogus, because they would lose their accreditation from NIST and be out of business. (Think Arthur Andersen.) That is a valuable assurance for someone who can't read code.
    • This sort of "certificates more valuable than what they supposedly represent" situation is common in educational certifications, and it's ridiculous because many people pursue "further education" (e.g., an MBA) not for additional knowledge, skills, and abilities, but purely for the end certificate. That is, instead of further education giving you the skills that allow you to excel, it gives you a certificate that allows you to excel. There are countless examples that follow this pattern. The tail is wagging the dog.

      Likewise, ISO certification (e.g., ISO 9001) is a great example of an overstated certificate that certifies, in many cases, no more than that a company wanted some ISO marketing fodder. Do something consistently, even consistently badly, and you're ISO-worthy. ISO certification, rather than being a process of improving organizational skills and management, becomes more of a zero-sum game: for every person who gains one, the marketing value of it decreases.
    • Take your conclusion to the logical extreme and you'll be right. All transactions (commercial, personal, etc.) should be filtered in self-defense.

      (Whatever)
  • Haste and Waste (Score:2, Insightful)

    by j_kenpo ( 571930 )
    Well, all things considered, most products that get security certifications probably need more QA than they actually receive. But most security flaws usually don't get caught until later, once a higher user base makes the product more of a target to attackers (with the exception of buffer overflows, which shouldn't be that hard to catch with a search for functions that do no range checking on input). Even with a good QA team, it's more likely that a group of 3,000 hackers will find security flaws in a production system than a group of 300 QA testers will, due to poor administration, default settings, whatever.
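The parent's aside about searching for functions that skip range checking can be done mechanically. A minimal sketch: the unsafe-function list, regex, and sample C snippet below are illustrative assumptions, not anything from the article.

```python
import re

# Flag calls to classic C functions that perform no bounds checking.
UNSAFE = re.compile(r'\b(strcpy|strcat|sprintf|gets|scanf)\s*\(')

def flag_unsafe_calls(source: str):
    """Return (line_number, line) pairs containing unchecked-input calls."""
    return [(n, line.strip())
            for n, line in enumerate(source.splitlines(), start=1)
            if UNSAFE.search(line)]

c_code = '''#include <string.h>
void greet(char *name) {
    char buf[16];
    strcpy(buf, name);   /* no range check: classic overflow */
}'''

for n, line in flag_unsafe_calls(c_code):
    print(n, line)
```

Real auditing tools do far more than pattern matching, but even this crude pass catches the overflow-prone call on line 4 of the sample.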
  • NIMDA? (Score:3, Interesting)

    by peterdaly ( 123554 ) <petedaly@@@ix...netcom...com> on Thursday October 03, 2002 @09:15AM (#4380281)
    Automated software is a good baseline approach, but it falls far short of cunning humans hammering away at systems.

    Then why, once an hour, does my Apache web server show clients trying to access DLLs in the log files? I am sure the IIS admins may not agree with that statement.

    -Pete
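The hourly DLL probes the parent describes are easy to tally from an access log. A minimal sketch, assuming ordinary Apache log lines; the signature list and sample entries are invented for illustration:

```python
# Count Nimda/Code Red-style probes in Apache access-log lines.
WORM_SIGNATURES = ('cmd.exe', 'root.exe', 'default.ida', 'Admin.dll')

def count_probes(log_lines):
    return sum(1 for line in log_lines
               if any(sig in line for sig in WORM_SIGNATURES))

sample = [
    '10.0.0.5 - - [03/Oct/2002:09:15:01] "GET /scripts/root.exe?/c+dir HTTP/1.0" 404 -',
    '10.0.0.7 - - [03/Oct/2002:09:15:09] "GET /index.html HTTP/1.0" 200 1234',
    '10.0.0.5 - - [03/Oct/2002:09:16:12] "GET /c/winnt/system32/cmd.exe?/c+dir HTTP/1.0" 404 -',
]
print(count_probes(sample))  # -> 2
```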
    • Re:NIMDA? (Score:3, Insightful)

      by cduffy ( 652 )
      If the goal is finding new security failures, the attacks by Nimda simply demonstrate the accuracy of the claim. This automated software is having no success in penetrating your system, whereas if it were a skilled and motivated human (rather than an automated system) hammering away at your system, they may well be able to find a new, effective attack. Similarly, once an IIS installation is appropriately patched, Nimda will have no more effect -- but a sufficiently skilled and motivated human might.
    • The script kiddies using automated scans barely qualify as cunning humans; they aren't the big threat.


  • And a happy suit is a wonderful thing at a performance review.
  • Peer review (Score:5, Insightful)

    by koh ( 124962 ) on Thursday October 03, 2002 @09:18AM (#4380300) Journal
    --Automated software is a good baseline approach, but it falls far short of cunning humans hammering away at systems.

    Automated software cuts costs. That's why they use it. Human security testers are expensive, even though IMHO it might be a good way for the most talented script kiddies to make a buck during summer...

    --The testing companies make money by certifying products, not catching problems.

    Of course they do; they're _certification_ companies, not tech support for security problems. Their job is not to catch problems in your software for you. It is to tell whether a product is "secure" or not, according to tests. Which brings us to the point:

    1) You can't predict the future. Tests run today can't reproduce new problems that will be discovered next year. So this "security certification" is short-term at best.

    2) There is a bias, both in the test suite used and in their conception of "security". They're human beings too, and to them "good enough" can mean a whole lot less (or more) than it does to you.

    So what is the problem? The problem is that an app that passes their tests is instantly classified as "secure". So we have to:

    - Expand the concept of "security" to give it a little more subjectivity ("secure according to company X", not just "secure", period).

    - Use peer review, which has proven good at detecting security flaws, and is quite inexpensive for free software projects.

    • To be useful for testing new crypto software, people need to be able to analyze source and algorithms and develop new attacks. Script kiddies, by definition, only carry out attacks built by others -- something an automated system can do just fine. Finding new vulnerabilities rather than working from a cookbook -- that doesn't take a script kiddie, it takes an expert (or five).

      That's why analyzing crypto software is so friggin' expensive -- to do it right takes someone who knows a great deal about not only programming and info security but mathematics as well, and who has actual experience in the field. There are only so many people who can do it right (and I'm most certainly not among them); trying to get the job done properly using the average software engineer with 5 or 10 years of general (non-security-specific) experience won't work, much less a script kiddie of any variety.
    • Oxymoron (Score:4, Funny)

      by mosschops ( 413617 ) on Thursday October 03, 2002 @09:41AM (#4380425)
      talented script kiddies

      Whoa, there's a phrase you don't see too often.

      Wouldn't they be talented hackers/crackers, if they actually know their stuff?
  • Would seem to be worth only the paper they're printed on. What security product, employee, or software package are you going to try first? Everything and everyone and their dog seems certified one way or another, but what does it all mean?

    Reminds me of the days of the "Best of the Web" awards...

    CodeTrap
    • I agree that a certification isn't worth much if you don't know what it means. I think that's a big advantage of the Common Criteria (CC). While it's a lot to digest, there are guides out there to help you through it.

      I work for the Canadian Common Criteria Scheme and it's my job to ensure that the Canadian labs follow the CC correctly and consistently in their evaluations. I found the article invaluable and disturbing (especially the Bruce Schneier quote), since we're obviously looking for ways to promote the CC, and the article highlights the concerns we need to address.
  • fips (Score:3, Insightful)

    by ciscoeng ( 411359 ) on Thursday October 03, 2002 @09:25AM (#4380342)
    Having been through a FIPS requirements meeting, I generally agree with Schneier and Kocher: it can easily become a marketing tool if not taken seriously. While FIPS requires, say, certain crypto algorithms (DES, DH, DSA, etc.), the physical boundary around the crypto hardware is pretty vague for level 1. Plus, as they mention in the article, you don't really know what method they use to test your product. Is it a monkey with a computer, a script, a Ph.D. mathematician?
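One concrete part of FIPS-style algorithm validation is the "known answer test": run a fixed input through the implementation and compare against a published vector. A minimal sketch, using the standard SHA-256 test vector for "abc" (the choice of hash here is my own illustration, not something the parent specifies):

```python
import hashlib

# Published SHA-256 digest of the three bytes "abc".
KNOWN_ANSWER = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

def known_answer_test():
    """Return True if the local implementation reproduces the known answer."""
    return hashlib.sha256(b"abc").hexdigest() == KNOWN_ANSWER

print(known_answer_test())  # -> True
```

A known-answer test only shows the algorithm computes correctly; it says nothing about key handling, side channels, or the rest of the module, which is part of the parent's complaint.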
  • Sure, if you have access to the source and the time/money/skill to do your own review/testing, you can do a lot better than FIPS. But what if you just want to buy a $5000 box? It would seem to me that at least you have _some_ assurance; it might not be perfect, but it is better than none.

    Really, the biggest problem with FIPS is the boundaries it is drawn at. Typically only some (crypto) modules are certified; there may be gaping holes in other modules. So your crypto might be bulletproof, but someone may still be able to hack the box and sniff off any data they want.
  • Charge em! (Score:3, Insightful)

    by jspayne ( 98716 ) <jeff@NOSPAm.paynesplace.com> on Thursday October 03, 2002 @09:35AM (#4380393) Homepage
    It isn't clear from the article how this company does it, but in my experience with safety certification, you get charged the hourly rate for the certification process regardless of whether you pass or fail. Just like having your car inspected...

    This removes the conflict of interest, and in fact reverses it: the certifying authority *wants* to find problems so they can bill more hours, and the developers bust their butts to keep the cost of the certification down.

    Jeff

    • True and not true.

      Companies are in business to maximize profits. How unethical they can be in that pursuit has been the subject of quite a few recent news stories. The profit made by a certification company is determined by a balance of cost and demand. Demand is directly proportional to how easy it is to pass the certification. Perceived value of having the certification is inversely proportional to how easy it is to pass the certification. Chart these factors (and others) and you should find that there is a target failure rate that will maximize profits. A company seeking to maximize profits will design their certification process to achieve this target. This almost always results in behavior that is not optimal for the common good.

      Is this ethical? Depends on where you think the primary responsibilities of a company lie...shareholder/owner profit or societal good. But that's another slashdot post. ;)
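The parent's "chart these factors and find the profit-maximizing failure rate" argument can be sketched with a toy model. The demand and price curves below are pure invention for illustration; only the shape of the argument comes from the comment:

```python
# Toy model: demand falls as the cert gets harder to pass, while the
# perceived value (and hence price) of the seal rises. Both curves are
# made up purely to illustrate that an interior optimum can exist.
def profit(failure_rate, base_demand=1000, base_price=10_000):
    demand = base_demand * (1 - failure_rate)   # harder test, fewer applicants
    price = base_price * (0.5 + failure_rate)   # harder test, pricier seal
    return demand * price

rates = [r / 20 for r in range(21)]             # failure rates 0.00 .. 1.00
best = max(rates, key=profit)
print(best)  # -> 0.25
```

Under these assumed curves the certifier's profit peaks at a 25% failure rate: strict enough to look valuable, lenient enough to keep customers coming, which is exactly the misalignment with the common good the parent describes.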
  • This just in -- many individuals who possess MCSE certificates are in fact not performing on a par with senior software engineers.
    • Of course not. The MCSE is a system administration certification. If you want to talk about senior software development, look to those that are MCSDs and have several years of experience.

      -Lucas
    • Certifications are useful for certifying breadth of knowledge -- not depth. There will never be a substitute for experience, nor should there be. And anyone who hires a potential employee based only on the fact that they passed a standardized test deserves whatever problems they get. Similarly, anyone who believes that a piece of paper with no experience makes them qualified is also deceiving themselves.

      It is very possible, however, to spend years as a systems engineer and never think about designing a network or about how DHCP works. It is possible to work with VB for years and never use it to build an ActiveX component.

      Narrow experience without a broad overview of the possibilites can lead to bad decisions. Certification is one way (there are many) to get that overview, but certification by itself does not equal expertise.
  • A certification that doesn't mean anything? I am shocked, really shocked! A government standard that is useless? Again, I am shocked!
  • by Rayonic ( 462789 ) on Thursday October 03, 2002 @09:40AM (#4380423) Homepage Journal
    And next you'll tell me that the "Nintendo Seal of Quality" was just a way to discriminate against third party game cartridges.
  • It can be a joke (Score:3, Insightful)

    by Anonymous Coward on Thursday October 03, 2002 @09:41AM (#4380426)
    The problem is that these certifications are often the only measure that PHB's use to decide if a product is 'safe' or not, and in this case, the certification is meaningless.

    I can't count the number of times that a consultant has quizzed me about firewalls - when they're pushing a certain product (usually because they get a kickback) "This is better because it's certified!"

    The problem is that (on off-the-shelf products) the certification only applies to the default configuration - and if you change it (which is pretty much every time - each site has different needs) the device needs to be re-certified... The consultants never mention that part to the client.

    The best way to know if a site is secure is to have an independent security audit done by someone qualified (and I don't mean a 'general auditor' -- a company that specializes in perimeter security).
  • This whole "certification process" sounds a lot like how the "Nintendo Seal of Quality" worked: when you get right down to it, it just made sure the product did not have any obvious bugs (or, in the case of FIPS, security holes) and that it worked -- which basically made it something that looks nice yet means nothing.
  • by Dynamoo ( 527749 ) on Thursday October 03, 2002 @09:43AM (#4380438) Homepage
    According to the Orange Book, the now-slightly-obsolete DoD certification, Windows NT 4.0 is secure enough to get a C2 Certification [ncsc.mil].

    Now, before we all laugh and say "doesn't it show that the certifications are stupid?" consider this.. maybe the certification system does work, and all those other certified products are equally flaky. I've got a list of some TCSEC-certified systems here [dynamoo.com] and frankly it's a pretty unappealing set of OSes. If there were as many Unicos [cray.com] systems (rated B1) out there as there were Windows, I betcha they'd find holes in it soon enough. The fundamental problem with any popular OS is that there will be thousands of hackers and wannabees probing away at it. I don't think there are many people reverse engineering CA-ACF2 MVS [ca.com] in their bedrooms.

    I think the motto should be: "Security Through Obscurity" -- perhaps all those horrid proprietary OSes did have a point after all.

    • I believe the Orange book certifies NT without networking.

    • Sure, NT was C2 ... if not connected to a network.
    • As someone just pointed out, NT 4 was C2 if not connected to a network, but this does raise the question of configuration.
      Everyone does not have the same machine, same CPU, memory, network, etc.
      Is it truly possible to have a secure system with so many variables (and let's not forget the keyboard-sniffer dongles and other tools that just record what's being typed)?
      I don't think so.
      Go back to the lesson from the book Cryptonomicon: you can only keep a secret for so long before someone else figures it out.
    • To launch a rare defense of Microsoft: all C2 certifies is the basic OS, plus maybe the few other components making up the "trusted computing base".

      I haven't reviewed (and don't intend to review) the MS NT evaluation document, but I would bet IIS, Exchange, Outlook, IE, etc. are NOT part of the trusted computing environment.

      In fact my recollection is that very few actual security exploits have come up in the last few years dealing with native NT code.

      Point being, is that maybe the base OS is pretty secure - which is all the certification says.

    • To throw an interesting spin on things, the person who certified NT4 as C2 compliant (sans networking, as has been pointed out) refused to certify Win2K as such.
      • I'm definitely not a Microsoft fan, but this is not true. SAIC was the organization that did the NT4 C2 evaluation, and they're in the process of doing a Win2K evaluation. See http://www.radium.ncsc.mil/tpep/epl/entries/TTAP-CSC-EPL-99-001.html for the NT evaluation, and http://niap.nist.gov/cc-scheme/InEvaluation.html for the Win2K ongoing effort. Incidentally, it was NT 3.5 that was certified without networking... NT 4.0 included networking in its evaluation. See http://www.radium.ncsc.mil/tpep/epl/entries/CSC-EPL-95-003.html
  • by El Volio ( 40489 ) on Thursday October 03, 2002 @09:45AM (#4380443) Homepage
    If you think there is, you're fooling yourself. That said, as long as that axiom is kept in mind, something is better than nothing. FIPS (or any other certification) may not be a guarantee, but it should be a good indicator that due diligence has been performed and the software meets widely-accepted best practices.

    The same applies to those practices. In and of themselves, they do not guarantee that no incident will take place. But they'll hopefully minimize the impact and frequency of those incidents. The fact that the NSA or some other entity may be able to get past your security doesn't invalidate that security entirely; depending on the environment, it may be good enough.

    Information security is really all about risk management. At the end of the day, are we managing our security to the point where the risk is less than the value of the information itself? Balance business need (or whatever needs you have, if you're not a business) against the cost of extra measures. When additional measures are too expensive for the value of what you're protecting, you're secure -- at least secure enough, anyway. If everyone followed security best practices, we'd have far fewer problems than we do.
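The risk-vs-cost balance described above is often put in annualized-loss-expectancy (ALE) terms. A back-of-the-envelope sketch; every figure below is made up for illustration:

```python
# Annualized loss expectancy: expected loss per incident times expected
# incidents per year. Compare against the annual cost of a control.
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    return single_loss_expectancy * annual_rate_of_occurrence

loss_per_breach = 50_000     # dollars per incident (assumed)
breaches_per_year = 0.4      # expected incidents/year without the control (assumed)
control_cost = 15_000        # annual cost of the proposed measure (assumed)

exposure = ale(loss_per_breach, breaches_per_year)
print(exposure)                  # -> 20000.0
print(control_cost < exposure)   # control cheaper than expected loss -> True
```

Here the $15,000 control is justified against a $20,000 annual exposure; flip the numbers and, per the parent's argument, you are already "secure enough" without it.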

    • Information security is really all about risk management.

      I suppose that depends on if you consider CYA to be a form of risk management. Unfortunately, many managers look at security as an expensive pain in the ass, that does nothing but cause them problems. There is no glory in having a tight secure system, but plenty of shit hits the fan if something goes wrong and a system is breached. For this reason, I think many managers are happy to take a security document from a consulting company, meet their recommendations, and then hide behind it.
  • should do the certification testing. I would buy any product that has the Cowboyneal(tm) seal of approval.
  • "LENGTHY PROCESS. Next, the FIPS engineers will study the basic code of the cryptographic module looking for underlying security flaws that could leave the module open to being compromised."

    OK, I want to know who's going to test our company's product and what skill set they have; I want to evaluate the evaluators. How do we know they're really good? What process do they follow, or do they load up something.c into emacs and prance around?

    "Finally, the FIPS engineers put the product through its paces in a testing lab to make sure all the cryptographic elements perform as promised. "

    What tests and process do they do? Is this always the same? How do they learn from their mistakes? Is the process upgraded and reviewed regularly?

    • What tests and process do they do? Is this always the same? How do they learn from their mistakes? Is the process upgraded and reviewed regularly?

      Not used to working with the government I see :-)

      Actually, I think the FIPS 140 process is a very good example of those concepts done right. Review the FIPS site [nist.gov].

      Your question about tests is answered thoroughly there; perhaps you will want to start with the derived test requirements [nist.gov] section.
  • From this article I get the impression that any Tom, Dick, or Harry can go out, 'perform testing' and give away FIPS certs for money.

    They really should have looked into the certification process for the testing companies. I doubt that Uncle Sam lets them in without some sort of compliance standards. I guess these standards will just have to include mandatory testing by engineers instead of software.

    If not, then the only thing that software developers will have to spend time guarding against are the specific areas of vulnerability that the testing houses look for.

    That kind of defeats the purpose, now, doesn't it?

    -S
  • by kirkb ( 158552 ) on Thursday October 03, 2002 @10:22AM (#4380661) Homepage
    At http://www.oracle.com/ip/deploy/database/oracle9i/ you will see:

    Oracle now holds 15 security evaluations. DB2 has none. SQL Server has only one.

    If it was easy to "buy" these certifications, I'm sure that Microsoft SQL server would have more than just one by now. (Granted, Oracle also has a bit of cash to throw around too).
  • ...is basically bribing some company to tell everyone your product is secure. This is fine when it comes to, say game reviews in magazines, or advertising, or whatever, but with something as serious as security issues, shouldn't there be some sort of "conflict-of-interest" legislation in place?
  • The purpose is to filter out most of the trolls, not to correct spelling mistakes or redundant stories.
  • What happens when certified software fails? Who has the liability? As someone else pointed out, one hidden problem with third-party certifications is that it's not their security problem when it fails, it's yours. It may be their reputation on the line, but you should be certifying your own security.
  • Worse yet, the new Common Criteria specifications, which most NATO nations are now using as a replacement for the Orange Book ratings (C1, C2, etc.), are just as borked and unobjective as FIPS.

    Namely, the Common Criteria simply specify that you must tell the certifying body what a system will do, and then it must do those things. It's kinda like the mess that is ISO 9000 that way. Worse yet, labs are paid not by the potential purchaser, but by the entity wishing to have something certified (same as FIPS). Certified products are then "acceptable" by all countries participating in the Common Criteria evaluation scheme, meaning that poor products that are certified have a higher likelihood of doing more damage than FIPS-certified products will.

    More of your tax dollars hard at work...
    • I don't see why forcing a developer to state what their product does, and then holding them to that is a bad thing. It's up to the consumer to decide whether those security functions fit their needs. It's not like a developer can create their own security functional requirements, state that their product implements them, and receive their certification. It's the responsibility of the labs and their associated certification body to ensure that certified products not only do what they say they do, but also that any explicitly stated security requirements make sense and that their evaluated configurations are useful (i.e., no NT certifications without network connectivity, no PIX certifications without NAT, etc.). So long as they uphold their end of the deal, "poor products" will be excluded. I think it's important to point out that just because a product isn't a silver bullet doesn't mean that it doesn't solve one security problem very well. Of course, integration opens another can of worms...
  • Apparently 'the testing companies make money by certifying products, not catching problems' thus implying that the seal of approval might not mean a whole lot."

    Sounds familiar. Oh yeah -- the US Patent Office makes its money off of "user fees", and it collects a bunch more when it issues a patent than when it denies one.

    Hmm...

    (The real answer may be in just charging for the application (either for certification or for patent, depending on whom we're talking about), and the cert is issued (or not) without extra fee. I say "may" be the answer because that approach might encourage agencies to trivially deny applications to boost re-application fee income.)

  • Would you buy a firewall product that wasn't ICSA certified? Would you buy a crypto card that wasn't FIPS 140-1 certified? No. Absence of a certification means that either the product has a serious security flaw that the test would find, or that the vendor simply doesn't care. Either of these is reason to drop the vendor from your list. An MCSE doesn't make someone a capable system administrator by any means, but would you ever hire anyone who didn't have an MCSE to administer MS servers?

    No certification can say a product is secure. A certification can only mean the product was tested and found compliant with standards. Security isn't an all or nothing characteristic. All other things being equal, a certified product is less likely to fail than one that was unable to pass the tests.

  • "Yes, this certificate is much more secure than a self-signed (in this case not certified) on the premise that the company or author in question gave us money to say so. Enjoy"
  • There's an open source angle to FIPS 140-x that's worth mentioning: The Network Security Services [mozilla.org] open source crypto implementation embedded in Mozilla and Mozilla-based products has been FIPS 140-1 validated, as was the original proprietary Netscape Security Services code from which the current open source NSS was derived. The validation efforts were sponsored by Netscape originally and by Sun (iPlanet) for the open source version; for more information see the list of FIPS 140-1 validated products [nist.gov] and look for certificates 247 and 248 (Sun) and 47 (Netscape).

    As others have noted, FIPS 140-x validation is not a panacea; however it does add some additional (and IMO useful) product review beyond what you'd get with standard internal QA plus public review (for open source crypto products). I think it would be great if some vendor or vendors stepped up and sponsored FIPS 140-x validation for OpenSSL and other popular open source crypto implementations.

  • The author of the Business Week article missed a key point. FIPS 140 is talking about encryption, not security. You can be FIPS 140 certified, and still have completely unpatched holes. All FIPS 140 tells you is that the crypto algorithms work correctly (including random number generation, etc). Common Criteria, as others in this discussion have noted, is much more relevant. While it's not perfect (the vendor gets to pick the security features they want to claim, and they get to pick who does the approval), at least it requires a search for security vulnerabilities, and an effort to make sure that the product is architecturally sound. Sounds to me like the people at InfoGard Labs sold Business Week on the article, and they didn't have anyone who understood what the certification is about.
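On the random-number-generation point: FIPS 140-1 actually specifies concrete statistical power-up tests for the RNG. A sketch of its "monobit" test, which requires that the count of ones in a 20,000-bit sample fall strictly between 9,654 and 10,346:

```python
# FIPS 140-1 monobit power-up test: a 20,000-bit RNG sample should be
# roughly balanced between zeros and ones.
def monobit_test(bits):
    """Return True if the ones-count is within the FIPS 140-1 bounds."""
    assert len(bits) == 20000
    ones = sum(bits)
    return 9654 < ones < 10346

# A perfectly balanced sample passes; a constant stream fails.
print(monobit_test([0, 1] * 10000))   # -> True
print(monobit_test([1] * 20000))      # -> False
```

Passing such tests shows the RNG isn't grossly broken, but as the parent says, it tells you nothing about unpatched holes elsewhere in the product.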
  • Private Company A pays the FIPS Certifier a flat rate to have A's product evaluated. If approved, it gets certified. If denied, Company A has to change things and pay to have it re-evaluated. This makes the FIPS Certifier want to fail more people, since companies then have to pay to have the product evaluated again and again until they get it right. Not too hard, so why does it not work that way? Am I missing something?
  • My company is involved in a standards project and we're torn between requiring self-certification or testing by third parties. I'm curious to hear from someone who has gone through something like the FIPS process -- is there financial liability on the part of the third-party certifying authorities? That would certainly make them more careful.

  • Seriously, there are so many certifications out there now, I'm starting to wonder if it'd be profitable for me to create my own cert. and get one of the testing centers to offer it?

    The cheapest I.T. industry cert. I've seen is still the A+, and that is a 2 part exam that sets you back around $180 by the time you take both pieces. Either the testing centers make an absolute killing on these things, or else a certain percentage of the profit goes back to the test's creator.

    I'm thinking if you even get a 4% or 5% cut of the profits on each attempted exam, you stand to make more money than you would by actually working in the industry in a job you got, partially by becoming "certified".
  • by Shoten ( 260439 )
    Certifications are a leftover from the world where everything the military used was custom-made. Back in that world, things were so rigidly specified ("mil-spec") and known to the end user that it made some sense. From the top to the bottom, people in the military or intelligence communities were involved in the design and keeping a sharp eye on the security. The frame of mind of the designers was "zero-threat," whereby they thought of everything they possibly could, and security was above nearly all other priorities. In this environment, it's easy to make a list of things to check for, and you can actually work that way.

    Now, the emphasis is on COTS products, which have almost all been designed for the commercial (read as, "less security-conscious") market. The concern has not been security so much as marketability, ease of use, and appeal to the public. The designers often do not keep security a high priority, if at all, until the later stages of development. And the people who really understand the nature of the threat in high-security environments had no input, insight, or awareness whatsoever of the internal workings of the products. The products are so varied and different from each other that a checklist of hard facts to verify (which is what certifications are) is no longer sufficient to catch all the possible risks.

    That said, the other problem is that no other method has yet been developed. It's easy to say, "Just do a vulnerability assessment." How? How do you do that on a constant basis on something the size of a government network? How do you make sure that nothing slips through the cracks? And above all else...how do you keep most of the contractors from getting in the way of the few (or single) contractor who gets the job of checking everyone else's work? At least certifications are neutral, in that they have no capability to be used by any single contractor against all the others.

    But that's just my frustration :)
  • by SiliconEntity ( 448450 ) on Thursday October 03, 2002 @02:21PM (#4382720)
    I was involved in getting our software package FIPS 140 certified, which is the major crypto security certification. I think there's some validity to the point that the certification house (which is sort of a gatekeeper to the actual certification) has something of a conflict of interest. We are paying them for the certification, while they are the ones who check the adequacy of our security measures. FIPS is supposed to check on their work, but that was largely a rubber stamp.

    Nevertheless the certification house did do a thorough check on us and did recommend a number of changes to our software. We didn't think any of them truly added security, but at least this way it was obvious that the cert company was doing their job.

    The big problem is that we got that version of the software certified, taking about eight months and several employees' time. Now a few months later we come out with a new release! We can't get re-certified every time, even though they have a shortcut for recertifications. Keeping up with the short software release cycle would be way too expensive.

    So we still have FIPS 140 certification listed as a feature of our product, but if a customer really wants that specific version, we have to sell him old software. As it turns out, no one does. All they really need is to be able to check the box that says we are certified, and then they're perfectly happy to take the latest software. The mere fact that we spent the time, effort and money to be certified is what really counts.
  • I've taken some hardware crypto products most of the way through a FIPS 140-1 level 3+ cycle, but never completed it because our customers were not willing to pay the extra money (my company had certified other products before that, and has certified some since I've left).

    I find it interesting that people are starting to specify AND demand FIPS 140. When we certified our first product, using the predecessor of FIPS 140, we only ever had one customer - the US treasury. Perhaps the chicken-and-egg situation is changing, because just enough companies have FIPS 140 certified products that customers might actually be able to buy one.

    In reality, most customers will not use the actual certified product, for a couple of reasons. First, it is too expensive (and takes too long) to recertify the product for every minor version change. Second, the FIPS process only allows certain algorithms (FIPS algorithms, naturally) and certain cryptographic formats. If your product wants to support the widespread PKCS format (RSA pseudo-standard) instead of the government-preferred ANSI formats (in cases where there is a difference), those PKCS commands will have to be disabled in the FIPS version of the product.
  • Patents Office... (Score:2, Interesting)

    by Julz ( 9310 )
    Sounds like the same problem with the US Patents Office.

    They should pay them on the number of patents thrown out.
  • From my experience, the FIPS 140 certificate does a good job of ensuring that products live up to their formal design specifications. The obvious question is how good were the design specifications? This is where things get interesting. To overgeneralize, I think FIPS 140 does a good job on tamper-resistant (and tamper-respondent) hardware design, and a poor job on logical security.

    A lot of the FIPS philosophy came out of the military, and the testing labs impressed me with the breadth of their physical attacks. On the other hand, the military usually has very simple logical-security requirements for a crypto-box: it should be inert until authorized users properly activate it, and at that point it can perform sensitive actions. Commercial cryptographic designs, by contrast, usually have a set of functions that needs to be generally available, and a much smaller set of functions that needs authorized users to control.

    When we put our product through the immediate predecessor to FIPS 140-1 certification, we were the first commercial product and ended up breaking a fair amount of new ground (somewhat painfully, as you might imagine). What we had to show was that the cryptographic commands that were available to non-privileged users were safe - because of the logical security design. Even early FIPS 140-1 processes did not really deal with these "always-on" functions very well.

    Although it has improved, especially with the 140-2 modifications, logical security is still the real weak point. Michael Bond's [cam.ac.uk] well-publicized attacks on the FIPS 140-1 level 4 certified IBM 4758 security module were all aimed at the "logical security" level. My favorite example of insecure-by-design is the PKCS #11 security module when it is used for server security.

    The Cryptoki (PKCS #11) interface was designed for security tokens, and basically works a lot like the military devices. The token (smartcard, whatever) would be plugged into the client device, where it would remain inert until activated by the user password. Actually a pretty good design when used this way.

    The problem is when the same design is used for a server, which is unfortunately common since several PKI vendors standardized on using PKCS #11 security modules. PKCS #11 authorization levels are all messed up for server use. There is no concept of "always-on" commands, or multiple levels of authorization. That means that any entity (server application) that wants to access the security subsystem must be an authorized user.

    The result is that the clear password that enables the PKCS #11 module has to be put into the server application. Because of that clear password, an attacker no longer has to break into the PKCS #11 box or steal/forge authorized users' identities. They can gain authorized-user privileges merely by monitoring the communication lines between the application and the box, or by analyzing the object code of the application!

    You will find a number of FIPS 140 certified PKCS #11 [nist.gov] modules, which is actually no surprise given how well PKCS #11 matches the military origins of FIPS 140. This is a classic example of a certified subsystem that is quite secure for some uses (human insertion of a token and entry of a password), but quite insecure for others (server applications storing and using clear passwords). All the FIPS certification does in the case of PKCS #11 is tell you that the vendor has followed their design, not that it will provide logical security in your system!
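    The clear-password problem described above can be illustrated with a toy sketch. This is not a real Cryptoki client - the variable name, the embedded source snippet, and the regex are all made up for illustration - but it shows why the certified hardware never has to be attacked at all: any server that logs in to a PKCS #11 module must hold the user PIN somewhere recoverable.

    ```python
    import re

    # Hypothetical server code that authenticates to a PKCS #11 module.
    # The "authorized user" PIN has to live somewhere in the clear --
    # source, config file, or process memory.
    SERVER_SOURCE = '''
    PKCS11_USER_PIN = "hunter2"        # clear password baked into the server
    session.login(PKCS11_USER_PIN)     # unlocks all private-key operations
    '''

    def recover_pins(source_text):
        """What an attacker does with the object code or a memory dump:
        grep for string literals near the login call."""
        return re.findall(r'PKCS11_USER_PIN\s*=\s*"([^"]+)"', source_text)

    # The tamper-resistant box is never touched; its FIPS certificate
    # says nothing about this failure mode.
    print(recover_pins(SERVER_SOURCE))  # → ['hunter2']
    ```

    The same extraction works against a compiled binary with `strings`, which is why "monitor the wire or read the object code" beats the certified module every time in this deployment.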
  • Product certifications are usually based on high-level process requirements - not details/features in products. This means that while the process may have been followed, all problems may not have been caught. The certification producer and the certified entity will market the certification to people, but people should really try to be aware of what it takes to be certified, and to understand the field, before interpreting the meaning of a cert. Usually certifications miss very important details in an attempt to be a broad catch-all, and it's difficult to understand what is missed without having been there. At the very least, people should be seeking commentary from people who HAVE been there but no longer have a vested interest (to taint an opinion).

    Also, unless you have achieved a particular certification, I don't think that you're qualified to comment as to its real meaning. How do you have any clue as to what it means unless you know what you've learned from it, and how you've grown in order to achieve it? Even then, people tend to lie to themselves.

    Certifications are not meaningless. But they certainly don't mean competence. Competence is NOT knowledge. For example... I recently worked with a man who was CISSP certified and had no clue. Someone posted a usenet article asking what a firewall deny, outgoing, to port 4000 was probably caused by (complete with destination IP, etc.)... he thought that it was a trojan (this was a "here's a question, go think about it for a while" type question, not a "GIVE ME AN ANSWER NOW!" question). Now, if you don't recognise port 4000, you could always reverse-DNS the IP and find that the target host was icq.mirabilis.com. This is just one example - so don't just say "he was lazy and didn't invest the time in figuring it out." The man also claimed to have written White Rabbit, played with Jefferson Airplane (on stage), invented robots that could climb stairs and learn rooms by name, and that he flies to Greece in the summer in a bomber (he says that it's his uncle's and that it costs ~2k in gas). Obviously, he was fired, but he had a CISSP (a fairly big security-competence certification).

    Now, if this post sounds nonsensical, please understand that I am currently a bit drunk and may not be making a lot of sense. :) Thanks.
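    The reverse-DNS sanity check described above takes only a few lines. A minimal Python sketch (using localhost as a stand-in, since the actual destination IP from the story isn't given):

    ```python
    import socket

    def reverse_dns(ip):
        """Return the PTR hostname for an address, or None if the lookup fails."""
        try:
            return socket.gethostbyaddr(ip)[0]
        except OSError:  # socket.herror and socket.gaierror are OSError subclasses
            return None

    # A competent responder would have resolved the destination IP from the
    # firewall log before guessing "trojan": port 4000 plus a hostname under
    # mirabilis.com points straight at ICQ.
    print(reverse_dns("127.0.0.1"))
    ```

    Same idea as `nslookup` or `dig -x` from the shell; the point is that the check costs seconds.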
  • A novice of the temple once approached the Chief Priest with a
    question.
    "Master, does Emacs have the Buddha nature?" the novice asked.
    The Chief Priest had been in the temple for many years and could be
    relied upon to know these things. He thought for several minutes before
    replying.
    "I don't see why not. It's got bloody well everything else."
    With that, the Chief Priest went to lunch. The novice suddenly
    achieved enlightenment, several years later.

    Commentary:

    His Master is kind,
    Answering his FAQ quickly,
    With thought and sarcasm.

    - this post brought to you by the Automated Last Post Generator...
