Should Auditors Be Liable For Certifications?

dasButcher writes "Enterprises and mid-size businesses rely on auditors and service providers to certify their systems as compliant with security regulations and standards such as PCI-DSS or SOX. But, as Larry Walsh speculates, a lawsuit filed by a bank against an auditor/managed service provider could change that. The bank wants to hold the auditor liable for a breach at its credit card processor because the auditor certified the processor as PCI compliant. If the bank wins, it could change the standards and liabilities of auditors and service providers in the delivery of security services."
This discussion has been archived. No new comments can be posted.

  • by fractoid ( 1076465 ) on Thursday June 04, 2009 @04:07AM (#28206881) Homepage
    TFA makes a very good point:

    What will be interesting about this lawsuit is how the court assigns responsibility for a breach at a certified business. Audits, by their very nature, are point-in-time or snapshot checks. They cannot account for the dynamic variables of business and IT operations that may weaken security over the long-haul.

    If they win this lawsuit, they're setting a dangerous precedent - anyone who at any stage has certified a system as secure becomes responsible for its ongoing security, and can potentially be held liable for stupid user errors by users of that system.

    • by Renraku ( 518261 ) on Thursday June 04, 2009 @04:14AM (#28206911) Homepage

      Inspectors of things like elevators are not held responsible if what they inspected checked out at the time of inspection and failed later. For example, you could sign off on the construction of a bridge or the installation of an elevator because everything looked good; but if the bridge company doesn't maintain the bridge properly, or the elevator company fails to do the same, the inspector is not held liable, even though the structure was certified as good.

      Auditing a network should be the same way. Of course, an auditor should NOT be held responsible for undiscovered bugs or holes in software. Instead, their job should focus on general security. It would be like a bridge inspector trying to certify a bridge while the gravitational constant of the universe was in a state of flux. How do you guarantee that steel is the best material, or that the iron won't suddenly turn liquid at room temperature? That about sums up the state of software development and bug discovery.

      • by Anonymous Coward on Thursday June 04, 2009 @04:26AM (#28206975)
        But if the bank could demonstrate that it followed every step without failing any part of the certified process, then the blame would be on the certification authority. If the bridge of your example was built using an illegally low quality of concrete and falls, then the inspector who allowed that concrete to be used should be liable for the collapse.
        • by Smidge204 ( 605297 ) on Thursday June 04, 2009 @05:43AM (#28207247) Journal

          So in other words, if the bank can demonstrate that the cert authority didn't do its job properly, the cert auth can be held liable?

          Sounds about right to me.

          I'd like to see the certs creep up the line of development. Software used for high security applications should be certified at the developer level, and the installation and implementation of that software should be certified at the implementation level.

          To continue the bridge analogy: The contractor needs to be licensed and insured, just as the inspector needs to make sure the materials and methods used are up to spec. Are developers held responsible for the quality of their products?
          =Smidge=

          • Re: (Score:3, Insightful)

            by Anonymous Coward

            Sounds like you're assuming that being PCI compliant is in fact the same thing as being 100% secure, which is retarded. They were supposed to make sure the servers were PCI compliant... that is all.

            • Re: (Score:3, Interesting)

              by mindstrm ( 20013 )

              PCI covers more than just servers: it covers physical security, staff identification, physical access to paperwork, disposal, data retention, and lots of corporate policies.

            • Re: (Score:3, Interesting)

              by ??? ( 35971 )

              And they failed to do that.

              They knew the processor had previously failed an audit because of storage of unencrypted PANs and non-compliant firewalls.

              They provided an audit report that said "fully compliant" with CISP.

              In the aftermath of the breach, it was discovered that the processor still had non-compliant firewalls and was still storing unencrypted PANs.

              It appears that Savvis did not do their job. This will not be the big question at the trial, though.

              Merrick was not in contractual privity with Savvis.
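To make the "unencrypted PANs" finding above concrete: a primary account number sitting in plain text is exactly the kind of thing a scan can flag, because card numbers satisfy the Luhn checksum. Here's a minimal sketch; the regex, the function names, and the scan-a-string approach are my own illustration, not actual PCI assessment tooling:

```python
import re

PAN_RE = re.compile(r"\b\d{13,16}\b")  # candidate digit runs at card-number lengths

def luhn_valid(number: str) -> bool:
    """Luhn checksum: separates real card numbers from random digit runs."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_unencrypted_pans(text: str) -> list[str]:
    """Flag digit runs that look like PANs stored in the clear."""
    return [m for m in PAN_RE.findall(text) if luhn_valid(m)]

# 4111111111111111 is the well-known Luhn-valid Visa test number.
print(find_unencrypted_pans("order=42, card=4111111111111111"))
```

A real assessment would scan files, databases, and backups rather than a string, but the point stands: this class of non-compliance is mechanically detectable, which is what makes a "fully compliant" report hard to defend after the fact.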

          • Re: (Score:3, Interesting)

            Comment removed based on user account deletion
            • Re: (Score:3, Informative)

              by Nikker ( 749551 )
               PCI compliance is mostly about network security and infrastructure, such as ensuring networks that service secured endpoints are isolated from networks that aren't. The auditor is really only there to attempt to mitigate and isolate known security issues that most shops don't bother to take seriously. By starting this buck-passing, all you are really doing is ushering in a new age of insurance that you will need to carry to cover the possible fraud that can take place, rather than working with the banks to
              • You really don't seem to grasp how this works in other industries.

                (Disclosure: My only assumption here is that the scope of the auditor's responsibility is clearly defined. If it isn't, then the auditor is just asking for a lawsuit.)

                If it can be demonstrated that the cause of failure was outside the scope of the auditor's contract, then he would not be held liable. For example, the auditor would probably not be responsible for the locks on the doors to the data center - so if someone breaks in and steals th

                • Re: (Score:3, Insightful)

                  by Zerth ( 26112 )

                  PCI is just troweling mortar on a crumbled foundation. Sure, it covers all the really boneheaded stuff, like using decent authentication and applying patches, but there is no part of it that says "don't use badly made (but it's expensive, it must be good!) software on a fundamentally broken OS".

          • Re: (Score:3, Funny)

            by JumpDrive ( 1437895 )
            Are developers held responsible for the quality of their products?

            Yes, Microsoft developers are held responsible for the quality of their products. Can't you tell?
        You are correct: the inspector would be liable, not only for the failure of the bridge, but for failing in his duties, since the concrete used was inferior... The big difference is that it's not hard to test concrete; that is one aspect of bridge building... In IT it's completely different: it's not tangible (well, in some cases like physical security it is), and an inspector or auditor could have done his job perfectly fine and things still went wrong, because IT rules and policies, and configurations, and patches, and thin

      • by ArsenneLupin ( 766289 ) on Thursday June 04, 2009 @04:44AM (#28207037)

        How do you guarantee that steel is the best material or that the iron won't suddenly turn liquid at room temperature?

        A better analogy would be: how do you guarantee carbon steel doesn't turn brittle in icy waters [encyclopedia.com], or how do you guarantee that the wind doesn't induce fatal vibrations matching the resonant frequency of the bridge [wikipedia.org]?

        Indeed, bugs do exist at the time of inspection, they are just not (yet) known. No change of laws of physics is required, only discovery of yet unknown (or underestimated) effects.

        • by Ihlosi ( 895663 ) on Thursday June 04, 2009 @05:26AM (#28207203)
          how do you guarantee that the wind doesn't induce fatal vibrations matching the resonant frequency of the bridge.

          Quote from the linked page:

          "In the case of the Tacoma Narrows Bridge, there was no resonance."

          That bridge came down due to a profoundly nonlinear positive feedback effect (the deformation caused by the wind increased the area of attack, which led to more deformation, and so on), not due to the bridge resonating.

          • Damn, Wikipedia sucks balls.

            Some moron gets it into his head that the Tacoma Narrows bridge failed due to 'aeroelastic flutter' not resonance. The definition of 'aeroelastic flutter' begins with the description:

            Flutter is a self-feeding and potentially destructive vibration where aerodynamic forces on an object couple with a structure's natural mode of vibration to produce rapid periodic motion. Flutter can occur in any object within a strong fluid flow, under the conditions that a positive feedback occu

        • Re: (Score:2, Informative)

          by mysidia ( 191772 )

          Except that those two specific conditions, and in theory how to prevent them, are well known.

          The unknown bugs software has are new cases entirely that cannot be examined a priori like a bridge's aerodynamics can.

          • Those conditions are well-known now. Nobody at the time realized why "Galloping Gertie" was so lively, and nobody had good evidence that it might destroy itself. A bridge inspector, using techniques and knowledge available at the time, would have certified that bridge.

      • Ditto for MOT tests [wikipedia.org] in the UK (from what I've heard from my sister-in-law in the US, the Americans don't have a similar "road-worthiness test"). The MOT says your car is safe to go on the road, doesn't have emissions that are too high, etc., but it is also a one-off test and makes no guarantee of ongoing quality. Just because the garage checked the car over on Monday and thought it was okay doesn't mean it is okay on Friday, after the driver has run it over large curbs or high-spe

        • Re: (Score:3, Informative)

          from what I've heard from my sister-in-law in the US the Americans don't have a similar "road-worthiness test"

          It's up to the individual states, but most states have them. Here in Virginia, I have to get my car safety inspected once a year (and carry an inspection sticker on my windshield) and emissions tested once every two years (or they won't let me renew the car's registration).

        • Comparing mechanical devices like a car, which have parts that wear down, to a network that is not susceptible to the same pressures is not completely fair. If my mechanic certifies that my car passes the state safety inspection (which we do have in the US) on Monday, and I suffer a catastrophic failure of one of the inspected parts on Friday, then I might have a case. In six months, I probably don't.

          I see inspecting/certifying a network as being a little different. If I certify that your network meets a

          • But it's not like a change to the network's configuration is terribly difficult to cause. One employee downloading and running malware on their desktop constitutes a change in its configuration.
            • You are correct that malware running on the network is a serious threat. The point I was trying to make is that if an auditor certifies that your network is protected from various types of malware attacks, then they could be held liable if you are hacked in this manner.

              I will admit that this is a very gray area, but if you offer your services as a network auditor, then expect to be held liable for failing to anticipate common threats. You should not just be auditing a static network at a single point in time,

    • "If they win this lawsuit, they're setting a dangerous precedent"

      Audits are performed so the company can demonstrate due diligence should something go wrong. If the auditors themselves cannot show due diligence in their own actions, then they deserve to be hammered.
    • by noundi ( 1044080 ) on Thursday June 04, 2009 @04:45AM (#28207041)
      I highly doubt that's even the case. The bank would probably have to prove that the breach could have taken place at the time of auditing, not only after, for reasons anyone can imagine. If they manage to do so, the suit should be perfectly valid.
    • by Tom ( 822 ) on Thursday June 04, 2009 @04:52AM (#28207067) Homepage Journal

      If they win this lawsuit, they're setting a dangerous precedent - anyone who at any stage has certified a system as secure becomes responsible for its ongoing security, and can potentially be held liable for stupid user errors by users of that system.

      Contrary to the precedent that no matter how much you fuck up, and no matter how blatantly false your audit report is, you're not responsible for anything, including not finding problems that are there when your whole job justification is that you're there to find these problems?

      Stop worrying about the poor little techie. We're talking about commercial enterprises here. The immediate effect will be that auditing companies take out insurance to cover this risk, and the price of audits goes up a little. However, the secondary effect will be that audits do, in fact, improve, because the premiums on your insurance depend on how often you fuck up and the insurance company has to pay for it.

    • by Rogerborg ( 306625 ) on Thursday June 04, 2009 @04:53AM (#28207071) Homepage

      If they win this lawsuit, they're setting a dangerous precedent

      How so? The principle seems clear enough that any audit, in any industry, is only a snapshot; why would you think a court would change that principle in this case?

      The article indicates that the system wasn't CISP compliant at the time of the breach, but presumably Merrick can only prevail if they can show that the non-compliance that allowed the breach was also in place at the time of the audit. Do you think otherwise? If so, what leads you to the conclusion that the sky is about to fall?

    • Re: (Score:3, Insightful)

      by asdf7890 ( 1518587 )

      If they win this lawsuit, they're setting a dangerous precedent - anyone who at any stage has certified a system as secure becomes responsible for its ongoing security, and can potentially be held liable for stupid user errors by users of that system.

      IMO it depends on where the fault lies.

      If the fault that allowed the problem is a property of the system that an auditor or penetration tester could be reasonably expected to have picked up on (such as password complexity and cycling rules not being present or not being correctly enforced) then maybe the case is valid.

      If on the other hand the problem is outside the system that was audited (i.e. the breach was due to a user having stored/transmitted a copy of their credentials insecurely, or due to users/adm

    • by Z00L00K ( 682162 )

      This is an interesting case to have.

      • If the auditor certifies a system according to current regulations and the system later fails, is that the fault of the auditor or of the regulations?
      • System changes can render the certification invalid and then the system has to be revalidated.
      • New threats and hacking methods appear all the time, so even current regulations may be outdated.
      • You shall never certify your own system, always bring an outside certification agency. Then it's up to you to take action and responsibil
    • Re: (Score:2, Informative)

      "Audits, by their very nature, are point-in-time or snapshot checks."

      8 years military service here. Security was 24/7 plus when I was in uniform. There was no "snapshot" of security, because everyone was trained from day one to understand that a moment in time is meaningless.

      I have always laughed at the concept of "security" in most of the civilian world. Seldom have I been in any civil institution where real security measures were in place, and enforced - be that physical or electronic. Oh, there ARE p

      • Re: (Score:3, Interesting)

        by Opportunist ( 166417 )

        Security is a 24/7 process. Audits are snapshots thereof.

        There are quite a few companies that dread and fear their 9001 or (even more) 27001 renewals because they are "so much work". Yes they are, if you're not sticking to the certification requirements (which you technically have to, after all that's what the sheet of paper that you get certifies).

        Every time a company moans about "certification work", I question their certification-worthiness.

        • We seem to have a dysfunctional definition of the word "audit". When the IRS audits an individual or a company, they look at the overall picture of earnings for the year, or for multiple years. The intent is to itemize EVERYTHING, and to ensure that everything is accounted for. The IRS has a functioning definition of the word "audit": there is little chance of hiding or obscuring anything that happened in the period of time being audited.

          A security audit should serve much the same purpose. Whether

          • Should I be employed to audit your security, I'll NOT show up at your jobsite for a day, then send you some half-assed report, stating that your recorded procedures look good. That is nothing more than rubber stamping what your management has already decided to implement.

            Then don't quit your day job and hope to be employed as an auditor. The sad truth is that companies don't want security. They want certificates. And they will go with whoever gives it to them with the least amount of work necessary.

    • If they win this lawsuit, they're setting a dangerous precedent - anyone who at any stage has certified a system as secure becomes responsible for its ongoing security

      No; to win, they will presumably have to prove that the processor's systems weren't compliant at the time of the audit. All the TFA says is that the later investigation showed non-compliance; it gives no indication as to the nature of this problem.

      Say I inspect your security, claiming to be an expert, and a few weeks later you have a breach. If, after the inspection, someone reset a password to something lame and/or left it on a post-it, then don't blame me. If, however, it turns out that your wireless router does

    • by D3 ( 31029 )

      First, it is way too easy to hide information from the PCI assessors. BTW, they are NOT auditors, they are assessors, there is a big difference. But it is too easy to hide stuff because to really dig into a complex system for every last detail is already cost prohibitive.

      Which brings me to my second point: if liability gets pushed to the assessors (or to SOX auditors, who are real auditors), then the cost of being assessed/audited is going to skyrocket, because they will just pass the cost of liability righ

    • The thing that really needs to be taken into account is that being certified does not guarantee 100% security. The auditor should not be held responsible for that; all they do is check to see if you are compliant with a standard. If they want 100% secure, they should unplug it, put it in a safe, and then drop it into the Marianas Trench.
    • Not to mention, since when does "conforms to a particular security standard" equal "impregnable"?
  • by siloko ( 1133863 ) on Thursday June 04, 2009 @04:07AM (#28206883)
    Well much as I like people to be held responsible for the quality of their work I think it is a bit much to expect technology certification experts to be held responsible for the dufus who puts his username and password on a PostIt stuck to his monitor . . .
    • Where's the problem? That dufus broke code and cert standards. If he did, that is...

      Bottom line: If there's a line in the audited security standards that reads "writing down your password is forbidden", chop the dufus' head for breaking code. If it's not, chop the auditor's for being the dufus.

  • Kind of. (Score:5, Informative)

    by Renraku ( 518261 ) on Thursday June 04, 2009 @04:08AM (#28206889) Homepage

    If an inspector inspects and then signs off on an elevator, and the elevator subsequently fails catastrophically for some reason the inspector should have caught, the inspector can be held liable, unless they can show that the inspection was somehow tampered with. Perhaps the safety interlocks were just for show and didn't have any real parts inside them.

    Auditors should be held to the same standard, and given the same rights to defend themselves.

    I don't want to sound harsh, but considering people pay auditors to do a job, if the job isn't done right, they need to suffer the consequences.

    • Re:Kind of. (Score:5, Insightful)

      by wirelessbuzzers ( 552513 ) on Thursday June 04, 2009 @04:20AM (#28206941)

      I agree, but it's hard to say what standard auditors should be held to. Often, computer security audits are just surface level checks: they check your design docs and your testing methodology. And this is fine, but you get what you pay for. If a bug slips through your tests, or worse if you don't actually implement your design docs or tests, the auditors obviously shouldn't be liable. On the other hand, if there's a flaw that the auditors "should" have caught, and they don't, they should be liable at least to some degree.

      The difficulty is that full, in-depth code audits are very, very hard. Consider the Linux kernel or OpenSSL: even after 16 years of "many eyes" treatment by engineers and security researchers across the world, serious bugs keep showing up. As a result, the fact that the auditor missed something doesn't mean much, and it's not clear that a court will be able to decide whether the auditor "should" have caught it.

      I wonder if the same problem is present in other industries.

      • Re: (Score:3, Interesting)

        by Opportunist ( 166417 )

        Oh tell me about it.

        I've done more than one security test for companies that boast 27001 certs, only to succeed with the most basic systems of social engineering or inside jobs. More often than not I get paid to shut my mouth rather than talk about how stellar the security of the company is.

        The general reactions are quite different, too. Some companies are genuinely interested in security and they're quite happy when you find a loophole in their process. Most, though, just want a signed paper and get rid of

    • Re:Kind of. (Score:5, Interesting)

      by Rosco P. Coltrane ( 209368 ) on Thursday June 04, 2009 @04:23AM (#28206961)

      You're correct that, if an elevator cable is frayed and the auditor missed it, he should be sued. However, audits aren't a way for businesses to shift the blame onto the auditor: they're a way for honest businesses to confirm that everybody (employees and contractors) and everything is in order at a certain point in time. If the auditor finds something that isn't right, his job is to inform his client, and perhaps propose remedies, but that's all. It's the business' job to implement the remedies. What I mean is, audits are a tool *for the client* to help do things right, that's all.

      For instance, I once subcontracted at a company that used all manner of cracked software. A day or two before the IT audit was due, the manager would go around telling employees to uninstall anything shady and put away copied CDs. The auditor would come, say everything was good, and the day after, all the cracked software was reinstalled. Is this the auditor's fault? The problem in this case is that the company needed the audit to be this-or-that-certified in order to work for a certain customer. They didn't see the audit as a tool to help them do business better, but as an annoyance that could prevent them from doing IT on the cheap.

      • by olman ( 127310 )

        You're correct that, if an elevator cable is frayed and the auditor missed it, he should be sued. However, audits aren't a way for businesses to shift the blame onto the auditor: they're a way for honest businesses to confirm that everybody (employees and contractors) and everything is in order at a certain point in time. If the auditor finds something that isn't right, his job is to inform his client, and perhaps propose remedies, but that's all. It's the business' job to implement the remedies. What I mean is, audits are a tool *for the client* to help do things right, that's all.

        Nb. The following pertains to European regulations. Yes, it's the continent not on the same map page as the US.

        On the subject of auditing machines and devices: now the demand is suddenly that the auditor/inspector must have personally checked every single critical component making up the elevator? And their installation?

        Right. Maybe Superman with his X-ray vision could give a decent go-ahead on an installation with a superficial examination and a check of the paperwork. Ordinary people obviously can't.

        However much you'd love to shift t

      • by Aladrin ( 926209 )

        It occurs to me that the loss of productivity for those few days around every audit is probably costly enough to just pay for all the stolen software. This is a really, really bonehead move.

      • My experience is pretty much the same: audits are done when a company needs a certain cert to get an (often public) contract, or to comply with legal requirements. Nothing else. No company I have been involved with did audits "on their own" because they wanted to better themselves. Audits, and the certs that come with them, are usually seen as a necessary evil, not something desirable.

    • Re:Kind of. (Score:5, Interesting)

      by Xest ( 935314 ) on Thursday June 04, 2009 @04:41AM (#28207029)

      The problem is that auditors only check something at that point in time. They can't check that things are correct on an ongoing basis and they can't help it if what they're checking against isn't foolproof.

      I used to support IT in schools, and was sent on a PAT testing (http://www.pat-testing.info/) course so that I could PAT test equipment in schools. One thing that was made clear on the course was that if we are not willing to do PAT testing, we do not have to, even if our employer tells us to. Why? Because if you sign off a piece of electrical equipment as safe and someone injures themselves a day later because it wasn't safe, you could be liable. That sounds fair enough at first read-through, but what if it really was safe when tested, and something happened between the test and the incident that made it unsafe? How can you as a tester foresee that? I actually refused to do PAT testing because of this; I simply was not willing to sign myself as liable for something I could not control.

      Furthermore, many auditors (security auditors, for example) can check to ensure a company is complying with security policies, but what if those policies are flawed and a breach occurs because of that? The auditor was paid to ensure policies were followed, and it is the company that is paying for that who is at fault, IMO, if the policy wasn't enough. Say an IT security policy states that all security patches should be applied immediately. That's great, and a security auditor could check that, but what if there's then a breach using a vulnerability for which there was no patch? Is it the auditor's fault?

      To me it's the company's fault again. The real problem is this: companies don't want to spend time and money on things they see no instant benefit from, such as following security policies and procedures. They do the bare minimum they can, and comply only with the policies and procedures they have to, knowing full well that these are the bare minimum and insufficient for real security and good practice. There's always more that can be done; letting companies shift the blame just means they'll struggle to find auditors.

      Auditors do what auditors are supposed to do. If auditors do their job wrong then sure, they should be liable, but I do not see how you can make them liable for something outside their remit. If you pay someone for a full security audit, that's one thing. If, however, you pay them to ensure you're BS7799 compliant, and you don't do anything over and above that but suffer a breach because there are things you can do over and above BS7799, then it's your company's fault.

      The answer has to come down to the auditor's role: if the auditor has audited what he's supposed to, he should not be at fault. It is only when the auditor has accepted an audit, signed it off, and that audit was found to be at fault that he should be liable. In the example of the lift you state, though, there is no way we can know if the auditor was at fault. If he tested it and it really was safe, how could he be at fault if, say, overnight a minor earthquake occurred, making the lift unsafe? What if, because of the nature of it, he can't prove that it wasn't like that when he tested it? Should he be jailed for manslaughter? When he did nothing wrong at all, should he even have to suffer having his name dragged through the mud, possibly being suspended from work or losing his job in the process, until he's finally found not guilty, even though his life is wrecked anyway?

      Companies should be held liable by default; if a company gets screwed by a bad auditor, it should be on the company to prove the audit itself was faulty. In other words, let's stick to innocent until proven guilty. If a company feels the auditor is guilty, let them prove it, not vice versa.

      • by Renraku ( 518261 )

        Correct. Audits and inspections are always point-in-time snapshots of the state of whatever is being audited or inspected, and they should be treated accordingly. Auditors and inspectors cannot be held accountable for things found out after the inspection, like the steel really having shitty 60-year durability specs, or BIND being a buggy piece of shit.

        All of that being said, there's no reason an inspector should sign off on a system with open shares and no firewall or a bridge with eroded foundations.
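A point-in-time check of the "open shares and no firewall" sort is easy to sketch, and the sketch also shows why it is only a snapshot: the answer is valid only for the instant the connect succeeds or fails. The port list and function names here are illustrative, not drawn from any real audit standard:

```python
import socket

# Hypothetical "boneheaded stuff" checklist: services that should never be
# reachable from outside if a firewall is doing its job.
RISKY_PORTS = {23: "telnet", 139: "netbios", 445: "smb (open shares)"}

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """True if a TCP connect succeeds right now; a snapshot, nothing more."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def snapshot_findings(host: str) -> list[str]:
    """List the risky services reachable at the moment the check runs."""
    return [f"{name} reachable on port {port}"
            for port, name in RISKY_PORTS.items() if port_open(host, port)]
```

Run it on Monday and the result says nothing about Friday, which is the whole "snapshot" argument in a few lines of socket code.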

        • by Xest ( 935314 )

          "All of that being said, there's no reason an inspector should sign off on a system with open shares and no firewall or a bridge with eroded foundations."

          Well, as I say, I guess that depends if he's being paid to do a general security audit, or if he's being paid to ensure the company adheres to a specific standard, and the standard in question doesn't specify that he should check that. I think as I say, the issue is, a lot of standards are quite weak, although maybe not to that extreme, some companies seem

          • That doesn't cut it with me. A company losing my personal details to criminals and saying "yeah, but we followed standard X" isn't good enough if standard X wasn't truly sufficient to protect my details and more could reasonably have been done.

            Don't blame the company, blame the government for not requiring a better standard.

            • by Xest ( 935314 )

              But isn't that just the same problem I mention? The idea you can get away with something as long as it's in the rules even if the results are extremely damaging as per the MP expenses scandal in the UK?

              Should we really have to expect the government to legislate everything? Isn't it better that companies are running scared over this sort of thing, such that when they can't be bothered to do the important things properly - like security and something goes wrong, that they're held liable?

              I fear if we let compa

              • I'd just fear that the fallout of a "company is liable for everything" formula would be that small companies would have to bear the brunt of it while large corps managed to weasel out of it.

                A company has to store certain personal data about customers; it's a basic necessity of business. You have a list of customers, you have your suppliers, and you have to store what they owe you and what they paid. Most of it is even required by law, so you can be taxed accordingly. So the mantra "don't store data and you're safe

                • by Xest ( 935314 )

                  I'm not convinced they do. Certainly there's absolutely no reason they need to store your credit card details for example unless you explicitly ask them to for your convenience, yet some do store them.

                  There's also little reason stuff that needs to be stored for archive reasons can't be shipped off to an offline system. At very least, a backend providing data to the front end should limit the amount of data that can be pulled so that if suddenly the web server requests massive volumes of customer data (i.e.
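                  A rough sketch of the kind of volume limit described above (the class name, thresholds and sliding window are all invented for illustration; a real control would alert rather than silently refuse):

```python
import time
from collections import deque

class RecordFetchGuard:
    """Hypothetical guard: refuse to serve the front end more than
    max_records customer records within a sliding window of seconds."""

    def __init__(self, max_records=500, window=60.0):
        self.max_records = max_records
        self.window = window
        self._fetches = deque()  # (timestamp, record_count) pairs

    def allow(self, count, now=None):
        now = time.monotonic() if now is None else now
        # Drop fetch events that have aged out of the sliding window.
        while self._fetches and now - self._fetches[0][0] > self.window:
            self._fetches.popleft()
        recent = sum(c for _, c in self._fetches)
        if recent + count > self.max_records:
            return False  # anomalous bulk pull: deny (and, in practice, alert)
        self._fetches.append((now, count))
        return True

guard = RecordFetchGuard(max_records=500, window=60.0)
guard.allow(100, now=0.0)    # normal traffic passes (returns True)
guard.allow(1000, now=1.0)   # a sudden bulk dump is refused (returns False)
```

                  In a real deployment the thresholds would come from observing normal traffic, and the deny branch would page someone; those details are omitted here.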

      • Re:Kind of. (Score:5, Informative)

        by Tom ( 822 ) on Thursday June 04, 2009 @06:12AM (#28207385) Homepage Journal

        The problem is that auditors only check something at that point in time. They can't check that things are correct on an ongoing basis and they can't help it if what they're checking against isn't foolproof.

        The elevator guy has the same problem and yet it works in real life.

        That is because in any real life situation, tests are indeed done repeatedly, such as every quarter, every month - or if they are really important, every day or every event. No plane in the western world takes off without the pilot and co-pilot having run through a standardized checklist first.

        "But things can change" is a pretty bad excuse. Like the elevator (where wear and tear change the physics constantly), your system has to be resilient enough to withstand normal changes (e.g. wear and tear, different weights, etc.) at least until the next check. Unauthorized changes have to be hard to make unintentionally (that's why there's no "cut the cable" button inside the elevator).

        It really isn't that hard. It works in thousands of areas, many of which are non-trivial and technically complex (e.g. airplanes). But for some reason, we think it's impossible to do in auditing and software?

        • by Xest ( 935314 )

          I think you're largely missing my point in the context of your responses.

          In real life, the situation with elevators and planes works precisely because the companies are held responsible when something goes wrong, as I stated.

          But this isn't what's being suggested. What's being suggested is that a company should be able to pay for infrequent checks on something that needs to be checked frequently and blame the person who performs those infrequent checks because the infrequent checking led to an unacceptable out

          • by Tom ( 822 )

            "I advocate the model used in the airline industry, where things that are critical and important should be checked regularly and that companies should not be able to cut costs by avoiding infrequent checks whilst avoiding responsibility for this."

            Ok, I'm with you on that.

            "You're right, it can work in software and auditing, but not on the cheap as most companies are trying to get away with. It's going to cost just like it does in the airline business to have their own set of technicians doing the checks round the clock, they have to accept that."

            Absolutely. It doesn't even have to be horribly expensive. One less trip to a nice cozy beachside "meeting" for upper management would probably save enough to pay for the entire thing.

        • "Things can change" is actually a pretty good excuse when it comes to IT security. Because they do change, and quickly they do. Here it's reverse, it's usually not the system that wears and tears, it's the standard that becomes a joke when pitted against ever changing and evolving threats.

          • by Tom ( 822 )

            "Things can change" is actually a pretty good excuse when it comes to IT security. Because they do change, and quickly they do.

            That's vastly overestimated. The majority of security problems are not the 0day exploits. The majority of security problems are old bugs, outdated software, bad procedures and users not trained in or unaware of security issues.

            • By and large true, what I meant is something different. Most security certs don't take into account things like buffer overflows and generally malformed user data. For example, you usually don't see any procedure for things like how to handle data files because they were not considered a threat when the certification standards were assembled.
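              A purely illustrative sketch of the kind of malformed-data handling such certification standards tended to leave unspecified (the field names and length limits below are invented, not taken from PCI-DSS or any real standard):

```python
MAX_NAME_LEN = 64
MAX_PAN_LEN = 19  # longest standard card number length

def validate_record(fields):
    """Reject malformed input before it reaches any parser or storage.
    Field names and limits here are invented for illustration."""
    if not isinstance(fields, dict):
        raise ValueError("record must be a mapping")
    name = fields.get("name", "")
    pan = fields.get("pan", "")
    if not isinstance(name, str) or not (0 < len(name) <= MAX_NAME_LEN):
        raise ValueError("bad name field")
    if not (isinstance(pan, str) and pan.isdigit()
            and 12 <= len(pan) <= MAX_PAN_LEN):
        raise ValueError("bad card number field")
    return {"name": name, "pan": pan}

validate_record({"name": "Alice", "pan": "4111111111111111"})  # accepted
```

              The point is not these particular checks but that a standard written before malformed data was considered a threat has no line item requiring anything like them.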

  • But there'll be an indemnity or escape clause in their contract with the processor.
  • Costs... (Score:4, Insightful)

    by Bert64 ( 520050 ) <bertNO@SPAMslashdot.firenzee.com> on Thursday June 04, 2009 @04:15AM (#28206919) Homepage

    All it will do is make future certifications ten times slower, more invasive and more expensive... This bank is shooting itself in the foot, because it will have to get certified again in the future and will be expected to pay a hefty premium.

    Besides, the auditor merely certifies that a particular defined system complies with a given spec at a point in time... They don't assert that the setup is secure, merely that it complies with the letter of the standard, and most of these standards are poorly written with loopholes big enough to drive a truck through.

    Not to mention that there are ongoing changes, such as patching and updates to signature files etc, do you need to recertify every time a minor change is made? A minor change could introduce vulnerabilities, for instance a security update could introduce new features and bring with it new exploitable issues while it also fixes an older issue.

    How widely do you define the scope? Ideally you would include absolutely everything associated with the system: every workstation used for admin purposes, every inch of cabling, etc. This would make the scope very large and costly to deal with.

    And how about the age old question of human error? No matter how secure a system is, an error (or intentional attack) by the legitimate users could break things in all manner of ways.

    • Not to mention that there are ongoing changes, such as patching and updates to signature files etc, do you need to recertify every time a minor change is made? A minor change could introduce vulnerabilities, for instance a security update could introduce new features and bring with it new exploitable issues while it also fixes an older issue.

      That's a problem with people relying on EALs [wikipedia.org] to assure their hardware and software in high security environments. If Oracle 9 got EAL5 accreditation, you can't just tak

    • by bradley13 ( 1118935 ) on Thursday June 04, 2009 @05:21AM (#28207177) Homepage

      The question is: does a certification have a value, or not?

      Consider an example in a different area: accounting. At the end of the year, a public corporation must have its accounts certified by an auditor. The audit essentially states that the accounts are an accurate reflection of the company's financial state - that the accountants haven't "disappeared" a few million dollars into their private accounts, or whatever.

      If the accounts turn out to be fraudulent, the auditors have failed - and it is entirely correct to sue them.

      Back to IT certifications: if the audit missed something, then it is entirely appropriate to sue the auditors. If the security breach was not due to problems the auditors should have caught (inside job, violation of established procedures, etc.), then the auditors should not be liable.

      Consider what happens if you do not hold the auditors liable: a very current example from the financial world. The ratings agencies said that derivatives based on sub-prime mortgages were top-quality, low risk investments. Screwing up a rating costs them nothing, so they gave in to political pressure and rated these derivatives too high. Had they been liable for the consequences of their ratings, they would have done a better job. At least, one would like to think so - sadly, there is no way to go back and test this hypothesis...

      • Actually, a sensible standard would consider inside jobs and make them harder. Of course, if your IT administrator is the culprit you're out of luck, but nobody else should have access to data that isn't meant for him to see.

        The bigger threat is that attack vectors change, and established standards of 3+ years ago couldn't take them into consideration.

    • Will any certification companies even work with them now? It is the right of any business to refuse service to any other business, correct? If they cannot find a company to certify them as PCI compliant, then they will lose their right to process cards, right?

  • by getuid() ( 1305889 ) on Thursday June 04, 2009 @04:21AM (#28206953)

    Should the auditor be liable for mis-certification? Or for the (correctly) certified system not withstanding attacks?

    I think people should try *very* hard to distinguish between the two scenarios:

    1) An auditor certifies a system as XY-compliant as of [insert date here]. However, it can be demonstrated that the system was *not* XY-compliant at that date.

    2) An auditor certifies a system as XY-compliant as of [insert date here]. However, at a later date, the system breaks for some reason. It can be proven that the system was XY-compliant, but for some reason (stupid user interaction?) is not anymore. Or, even better: it can be proven that the system *still* is XY-compliant, but the XY-standard is unfit to defend [insert attack here].

    I think in case (1) the auditor should be held liable, since he obviously certified something that didn't meet the promised standards. However, in case (2), the auditor is not to blame. If the system breaks despite the certification, then it's not the auditor's fault; that's just how things work, and making a scapegoat out of the auditor is not going to do anybody any good. Even worse, if the system fails to meet standard XY because of a stupid user (or admin, for that matter) interaction *after* the certification, then there's no way an auditor could have prevented that. It's either the user/admin's fault for interfering with a certified system, or the standard's fault for not defining what a user/admin is allowed to do with the system without interfering with its certified qualities.

    • Re: (Score:3, Insightful)

      by Tom ( 822 )

      "I think people should *very* hard try to distinguish between the two scenarios:"

      I think people should try harder to understand auditing.

      Static audits are a thing of the past. Every audit and compliance procedure AD 2009 includes not only checks of the current system state, but in fact puts more of a focus on changes. More precisely: change management. In a properly certified and audited system, it ought to be impossible to change the system in a compliant way into a non-compliant state. Either your changes are part of the proper change procedures, or they are not. If they are not, the

      • And any audit that (in 2009) gets signed off without covering change management should never have been signed off in the first place, so again the auditor is clearly at fault.

        (I'm asking out of curiosity, not to troll you :-)

        Maybe I'm mistaken, but isn't *any* auditing a check of the state? Even a check of a process (for example an audit checking the change strategy) in fact checks the *state* of the rules to be followed when applying a change. Doesn't it?

        Now: what's the job of an auditor? Is he (a) to certify that a certain system/process/whatever meets a given standard, or (b) to certify that a system/process/whatever *is* something? (Think: is "unbreakable"...)

        I always t

        • by Tom ( 822 )

          "Maybe I'm mistaken, but isn't *any* auditing a check of the state? Even a check of a process (for example an audit checking the change strategy) in fact checks the *state* of the rules to be followed when applying a change. Doesn't it?"

          Yes, and no.

          What usually happens is that you define your target state, and check if the current state matches. If not, you add remediation plans to make it match or reach the objective in different ways.

          So while you do check your process at time X, the actual control says something like "A change management process tracks, documents and authorizes all changes to the system." while a second control says something like "The system is checked every X days/weeks/months to verify that no unauthorized changes hav
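          The target-state check described above can be sketched roughly like this (the controls, values and function names are hypothetical, for illustration only):

```python
# Hypothetical compliance check: compare the actual system state against
# a defined target state and emit a remediation item for each mismatch.
# The controls and thresholds below are invented for illustration.

TARGET_STATE = {
    "firewall_enabled": True,
    "open_shares": 0,
    "days_since_last_review": lambda v: v is not None and v <= 90,
}

def check_compliance(actual):
    remediation = []
    for control, expected in TARGET_STATE.items():
        value = actual.get(control)
        # A control is either a literal expected value or a predicate.
        ok = expected(value) if callable(expected) else value == expected
        if not ok:
            remediation.append(f"remediate {control}: found {value!r}")
    return remediation

state = {"firewall_enabled": True, "open_shares": 3,
         "days_since_last_review": 120}
for item in check_compliance(state):
    print(item)  # flags open_shares and days_since_last_review
```

          The periodic-verification control then amounts to running such a check every X days and feeding the remediation list back into the change process.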

  • Wow (Score:5, Insightful)

    by Peregr1n ( 904456 ) <ian.a.ferguson@gmail.com> on Thursday June 04, 2009 @04:27AM (#28206981) Homepage
    The big banks really are intent on shooting themselves in the foot. If they hold the auditor liable for security breaches, nobody else will be willing to offer certification services for PCI-DSS. And considering that it's the banks who desperately want everyone to be PCI-DSS compliant (does anybody other than the banks get any benefit from it? Really?), that is particularly stupid.
    It's hard enough achieving compliance as it is - whenever we get near to completing the questionnaire, they change all the questions!
    • by umghhh ( 965931 )
      Maybe security really should cost more. I don't mean throwing money at the problem in the hope that it goes away, but any reasonable security system will cost money: for audits, for implementation of findings, etc. Maybe exactly that was the problem. TFA does not say exactly why they are suing; maybe they know the auditors failed to inform them about something, or maybe they just want to pass on the costs of the security breach and its repair.
  • by hugetoon ( 766694 ) on Thursday June 04, 2009 @05:11AM (#28207135)

    After conducting an audit of a Merchant and a PSP (payment service provider), a QSA (qualified security assessor) issues a ROC (report on compliance with PCI-DSS) that is submitted to the issuers (VISA, Mastercard, Amex, JCB and Discover).

    Then the issuers certify the auditee.

    An individual cannot be a QSA on his own; he has to work in an organization that is qualified as well. Among other things, a QSA organization has to provision a HUGE amount of cash in case it is found liable for having unduly declared an auditee compliant.

    When a breach occurs, there is an investigation, and it may be found that the ROC was not accurate at the time of the audit; in such a case the QSA organization and the QSA individual are in trouble.

    BTW a certification is only for one year.

    Now, the case is not about PCI-DSS but the "Cardholder Information Security Program" (CISP), and the breach happened in 2005.
    Therefore I think the outcome would not have much impact on the PCI program, where liabilities are well defined.

  • Seriously. What else are certificates good for? If it's just "drop some money so we send you a guy that hands you a cert", what does the certificate mean? I mean, besides "we had enough money to buy it"?

    Certificates are worthless if they don't certify anything but having enough money to have an auditor squat at your company for a few days. And if auditors are not liable for the validity of a cert, that's basically all they really prove. Why else should the auditor really audit a company and not just hang ou

    • Re: (Score:3, Interesting)

      by Aladrin ( 926209 )

      Exactly. A certificate -certifies- something. If it doesn't, it's not a certificate.

      The real question here is: What should happen to the certifier if their certificate proves false.

      I don't think this is a government question. If there's nothing in the contract about this scenario, then you paid for -nothing-. And if there is, you already know the solution to the problem... It's right in the contract.

      • If the cert was issued when the requirements were not met, the certifier should be liable for it. Simple as that.

        I've been in more audits than I really wanted to stomach. Usually, they're a weak joke. The reason for this is very, very simple: Auditors want to audit other companies too, and companies talk with each other and "recommend" auditors based on their experience.

        Question for 500: What will get you recommended? A thorough audit, painstakingly dragging out every piece of possible dirt and repeating the

  • Plaintiff: "Your Honor, we're suing defendant because they certified our credit card system as being PCI compliant, yet it was breached by hackers."

    Judge to Defendant: "Is this true?"

    Defendant: "Yes Your Honor, however being PCI compliant does not guarantee you will never be breached by hackers."

    Judge: "Case dismissed!"

    • You beat me to it! The auditor only says your setup meets a specification. It does NOT guarantee that implementing the specification will keep you safe from all harm.

  • by luftmatraze ( 1567915 ) on Thursday June 04, 2009 @06:07AM (#28207359)
    I am working in a large firm. Quite often new projects require technical audits upon realisation, as well as "Life Cycle" audits for existing systems involved with billing etc. One point needs to be clear: audits are not cheap! These guys are paid between 1500-2000 per man-day, and presently this is done essentially without ANY liability as to the quality of their work. What needs to be established in this case is:

    1. Technical audits provide a snapshot of a system "at a particular point in time". Did these holes exist at the time of the audit, or were there changes afterwards which could have affected the audit results?

    2. Audit scope. This is really important! If the audit scope didn't include, for instance, the visibility of the systems from outside of the firewall, then the perspective of the auditors was limited and therefore the audit itself is not complete. I have seen companies that are ISO 27001 certified... however... the audit scope covered only a particular part of the company. This enables the company to suggest 27001 certification when in fact that may not fully be the case.

    Most likely the outcome of such a case would be an increase in costs to cover liability (insurance or something of the like) on the part of the auditor. However, it may well also be an increase in the quality and transparency (clearer scope, limitations etc.) of technical audit work. Both of these are positive outcomes!
  • As a 3rd party, an auditor's remarks and certifications can give a representation inducing someone to wrongfully contract. There is an established principle in the tort of negligence that allows an injured claimant to sue a non-contractual 3rd party for this type of misstatement. We call this the Hedley Byrne principle (google it).

    It's great, provided you can establish a duty via a special relationship of reliance.

    It's bad for contracting parties however, if you imply that all contracts should have this principl

  • by DeanFox ( 729620 ) * <spam.myname@[ ]il.com ['gma' in gap]> on Thursday June 04, 2009 @07:50AM (#28208051)

    A Notary Public can be held responsible, but an auditing firm isn't? I would have thought they were already held liable. If they're not, what a great job! Like a Notary Public that can stamp, validate and vouch for anything without cause for concern. It's probably because the Notary is a person while the auditors are corporations. Corporations are just like people, absent accountability or morals. Corporations are like sociopaths. And as they're running the show, corporations are like sociopaths in an anarchy.

    -[d]-
  • It will just raise prices, since the auditors will have to take out insurance. Of course, what are banks to do? They hire someone to ensure they are compliant, then get screwed because they were compromised through something that was covered by the compliance requirements!

    Security is an imperfect world. I'm sure something will come up where a company was compliant but still got compromised and attempts to sue the auditor. Then again, I've met more than a few auditors who had no business being in the security business!

  • PCI in itself doesn't guarantee 100% safety.

    It says so right on their common myths page;
    https://www.pcisecuritystandards.org/pdfs/pciscc_ten_common_myths.pdf [pcisecuritystandards.org]
    Quote "Successful completion of a system scan or assesssment for PCI is but a snapshot in time.
    Security exploits are non-stop and get stronger every day, which is why PCI compliance efforts
    must be a continuous process of assessment and remediation to ensure safety of cardholder
    data."

    It's a very good PDF to read. Now if the auditor said they were PCI

  • I am an IT auditor working for a company that you would call if you wanted to be certified.

    Certification means that there is a work (audit) programme that states control objectives. The auditor follows this programme very closely and then, if the issues are within some zone of tolerance (which may be zero), writes a statement that company XYZ is compliant with this and that.

    What it does NOT mean is:
    a) a certified company will follow its practice after certification (they may just hav

    • by mkettler ( 6309 )

      I'd agree. Auditors are not a be-all and end-all. It is not the auditor's job to detect all unknowns about your company; they go on what you provide them, and the amount of digging they do beyond that is modest. Deep digging is the job of your staff, not auditors.

      However, I do think there should be liability for items clearly in-scope.

      For example, if I hire an auditor to audit me for PCI-DSS, I expect they will at least review all of the basic requirements of that standard. If an auditor blatantly neglects to in

  • by Klync ( 152475 )

    I'm surprised nobody mentioned this yet: adherence to PCI-DSS does not necessarily guarantee that your system cannot be cracked or broken into. PCI-DSS provides a set of guidelines - created by the banks and cc companies themselves - which must be met in order to be considered safe enough to be allowed to process transactions. Now, if the auditor was negligent or deceptive in certifying the system as compliant, this seems like a no-brainer lawsuit. However, it is entirely possible that the system *was* comp
