Security

Following Best Coding Practices Doesn't Always Mean Better Security 61

Posted by samzenpus
from the keeping-your-guard-up dept.
wiredmikey writes "While some best practices such as software security training are effective in getting developers to write secure code, following best practices does not necessarily lead to better security, WhiteHat Security has found. Software security controls and best practices had some impact on the actual security of organizations, but not as much as one would expect, WhiteHat Security said in its Website Security Statistics Report. The report correlated vulnerability data from tens of thousands of Websites with software development lifecycle (SDLC) activity data obtained via a survey. But there is good news — as organizations introduced best practices in secure software development, the average number of serious vulnerabilities found per Website declined dramatically over the past two years. 'Organizations need to understand how different parts of the SDLC affect how vulnerabilities are introduced during software development,' said Jeremiah Grossman, co-founder and CTO of WhiteHat. Interestingly, of all the Websites tested in the study, 86 percent had at least one serious vulnerability exposed to attack every single day in 2012, and on average, resolving vulnerabilities took 193 days from the time an organization was first notified of the issue."
This discussion has been archived. No new comments can be posted.

  • Following best architectural design practices doesn't mean your final building will be more secure.
    • No, no, no. Car analogy, puhlease!
    • Or to cut past all the damned metaphors: just because you teach your programmers not to have Bobby Drop Tables [xkcd.com]-sized screw-ups doesn't mean there aren't a bazillion other ways they can screw up.
    • by chrismcb (983081) on Friday May 03, 2013 @07:07AM (#43618597) Homepage
      Of course it does, and TFA even says so. Of course going to "software security training" isn't exactly a "best practice" either. Just because someone goes to a training session doesn't mean they'll learn anything.
      Following best practices in code will most definitely make your code more secure. But more secure is very relative. You can still have plenty of vulnerabilities, just fewer than before.
      Best practices or no, we all make mistakes, and plenty of them. It only takes one mistake to leave a hole. Whether it is a simple bug, or a design flaw. And the bad guys only need to find one flaw. So yes Best Practices help. But it doesn't mean your code is going to be Fort Knox.
      • Now, Slashdot just took its title from the submitted article, so I'm not going to pin this one on the "editors". But TFA does not appear to provide any evidence to actually support the title. It implies that security doesn't improve as much as you'd expect, which in itself contradicts the claim; but what few examples are given are really of companies IGNORING best practices.

        Admittedly the submitted article is just a summary of a larger report, so it's possible the report itself provides some sort of support

      • I agree, and the reasons for making mistakes have a lot to do with deadlines.
        Best practice or not, more time spent on designing, implementing and testing can avoid a lot of mistakes. More often, a lack of communication and of understanding of the code has a lot to do with testing (due to tight deadlines). Both the technical testers (running static analysis reports) and the QA testers (testing business cases) lacked knowledge of the system, so they had no choice but to assume some things were working as expected.

  • by bickerdyke (670000) on Friday May 03, 2013 @05:53AM (#43618349)

    Asking software companies whether they require their developers to adhere to "best practice" won't lead to any useful numbers at all.

    Or does anyone think anyone would admit to using only second-best programming standards?

    Let alone the question of which programming techniques count as "best practice".

    • I don't know what 'best practices' are, but I can tell you, if it takes 193 days to fix a vulnerability, on average, from the time you are notified about it, then you are not following best practices.

      Or, if you are this guy, [google.com] and claim "bugs are not a big deal," then you are not following best practices.

      It's not clear what the author of the present study believes are best practices, but it's clear there are a lot of people who are doing things horribly wrong.
      • by plover (150551)

        You completely misread his essay. Yegge did not try to claim that "bugs are not a big deal" to him. He only pointed that out as a possible viewpoint of a "liberal" programmer.

        However, his thesis that programmers are either "conservatives" or "liberals" is completely wrong-headed. People are either engineers (tests, proof, knowledge) or not. Static typing doesn't enter into it. The crap he was yammering about seemed written to justify his choices and his existence, not to provide useful knowledge.

        • We can quote from what he wrote:

          "Bugs are not a big deal.......Bugs are not a big deal!...... (This belief really may be the key dividing philosophy between Conservative and Liberal philosophies.).....I am a hardcore software liberal, bordering on (but not quite) being a liberal extremist."

          He definitely puts himself on one side of the spectrum. As you said though, the whole essay could be renamed "Stuff I like and stuff I don't like." Because that's essentially what it is.

  • by Anonymous Coward on Friday May 03, 2013 @06:05AM (#43618375)

    If secure coding could be described by a few simple rules, coders would have already been replaced by programs.

  • Best practices (Score:5, Insightful)

    by philip.paradis (2580427) on Friday May 03, 2013 @06:08AM (#43618395)

    "Best practices" is such a wonderful term; its sheer flexibility permits the person invoking it to assure his audience that he meant exactly what he said, even when he didn't say much of substance.

    Now if you'll excuse me, I'm late for a game of bullshit bingo [bullshitbingo.net].

    • by SirGarlon (845873)
      "Best practices" is a synonym for groupthink [wikipedia.org].
    • by Anonymous Coward

      Are you sure you didn't mean Agile bingo [bullshitbingo.net]?
      Gotta keep it on topic!

    • by Anonymous Coward

      The best practice is to understand the options available and to choose the one that best fits your particular circumstances.

      But when developers ask "what is best practice" they want somebody else to make a choice that they can blindly apply without thinking. This is not the "best practice".

  • No surprise (Score:4, Insightful)

    by gweihir (88907) on Friday May 03, 2013 @06:11AM (#43618399)

    Secure code needs a holistic view. The usual component architecture, which works pretty well for reliability and correctness but not so well for performance, fails utterly for security. What is needed is a really competent secure-software expert who can do everything from architecture down to coding and is part of the team right from the beginning. This person must also have the means to enforce security. Unfortunately, people who can do this job are very rare, and most projects do not have that role anyway.

    • Re: No surprise (Score:5, Insightful)

      by mattpalmer1086 (707360) on Friday May 03, 2013 @08:00AM (#43618803)

      Been there, done that, made redundant.

      This was at a software house selling payment processing middleware that had to be PA-DSS compliant. Achieved compliance, role made redundant.

      They clearly made a risk reward calculation and decided the benefit of securing the product was outweighed by the cost of slowing development. Particularly as everyone else's security also sucked and there was no particular liability for them if a breach occurred. It's a classic externality.

      I'm also on the steering committee for an initiative trying to improve software security and resilience. They also figured out that the market was failing here, and only legislation for software liability or some other mechanism to correct the externality had any chance of improving the situation. But the cure might be worse than the disease.

      • by gweihir (88907)

        Been there, done that, made redundant.

        This was at a software house selling payment processing middleware that had to be PA-DSS compliant. Achieved compliance, role made redundant.

        They clearly made a risk reward calculation and decided the benefit of securing the product was outweighed by the cost of slowing development. Particularly as everyone else's security also sucked and there was no particular liability for them if a breach occurred. It's a classic externality.

        Indeed. No liability always means no due diligence or care.

        I'm also on the steering committee for an initiative trying to improve software security and resilience. They also figured out that the market was failing here, and only legislation for software liability or some other mechanism to correct the externality had any chance of improving the situation. But the cure might be worse than the disease.

        I agree there as well. They will likely regulate how it has to be done (making it useless), instead of establishing liability. In other engineering fields it is quite simple: if your people were adequately qualified and careful, and there was a competent independent review (if things were experimental) or best practices were followed (if a standard problem), then it was an accident. Otherwise, negligence is found and the company pays. The negligence may

  • by cyborg_zx (893396) on Friday May 03, 2013 @06:30AM (#43618445)

    Since that's unlikely to change, good ol' social engineering is still going to be the primary tool in any would-be assailant's toolbox for breaking security.

    • Reminds me of a question I was asked in an interview: "What do you consider the most serious threat to the integrity of data?"

      I looked at the person who asked the question and answered, "People on the inside." I then elaborated that the majority of data breaches and leaks come from inside an organization; once you allow someone access to that data, you have a potential security issue.

      That wasn't the answer they wanted.

  • Just a thought (Score:2, Interesting)

    by Anonymous Coward

    Isn't that why they're called Best Practices and not Perfect Practices?

  • Well.. 'Best coding practices' is all in the eye of the beholder.. what one calls best practice might look awful to another.. there really is no 'best coding practices'..
    • Re: (Score:3, Interesting)

      Well.. 'Best coding practices' is all in the eye of the beholder.. what one calls best practice might look awful to another.. there really is no 'best coding practices'..

      For overall coding, you're right - it's all in the eye of the beholder. For secure coding, one simple rule (which is unfortunately much harder to follow than it should be) will avoid 99% of the problems:

      DON'T EXECUTE CODE WRITTEN BY YOUR USERS!

      What makes it so damn hard is the temptation (if not active encouragement by your platform) to "stringly type" all your data, combined with the temptation (if not active encouragement by your platform) to build up executable code by pasting strings together.
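The parent's rule ("don't execute code written by your users") can be sketched in C++, the thread's own language. This is a hypothetical illustration, not code from the article; the `is_safe_host` and `build_ping_unsafely` names are mine. The point: pasting raw input into a command string executes whatever the user typed, while a strict allowlist check rejects anything outside the expected shape before the string is ever built.

```cpp
#include <cctype>
#include <string>

// Hypothetical allowlist check: accept only digits and dots, i.e. something
// shaped like a bare IPv4 address. Spaces, ';', '&&' etc. are all rejected,
// so shell metacharacters never reach the command string.
bool is_safe_host(const std::string& s) {
    if (s.empty() || s.size() > 15) return false;
    for (char c : s) {
        if (!std::isdigit(static_cast<unsigned char>(c)) && c != '.') return false;
    }
    return true;
}

// BAD: if this string were handed to system(), a "host" of
// "8.8.8.8; rm -rf /" would run both commands.
std::string build_ping_unsafely(const std::string& host) {
    return "ping " + host;
}
```

The unsafe builder is only there to show the failure mode; real code would validate first and avoid the shell entirely.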

    • This is why I write all of my code on one line. It's clean and simple. Others brag about 1000 lines of code; I brag about 1 line with 20k columns. Got an exception? Yeah, it's on line one. Got a compile error? Whatdoyouknow, it's on line one also. I just need a better text editor that can handle columns better...
  • by Anonymous Coward

    is more important than having managers that insist that engineers follow every guideline from the MS Press book, or whatever.

    For example, one of the guidelines is always "do not use sprintf". But sprintf is perfectly safe in cases like this:


    std::string myfunc( int i ) {
        char buffer[80];
        sprintf( buffer, "Your number=%d", i );
        return buffer;
    }

    So what we sometimes see is a lot of mindless replacements of perfectly good function calls with slower, more difficult-to-read counterparts, where the process of substitution can itself introduce new bugs.
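For the sprintf example above specifically, a minimal sketch of the usual middle ground: keep the readable call but bound it with snprintf, so the guideline is satisfied without the mindless rewrite (a generic illustration, not code from the report):

```cpp
#include <cstdio>
#include <string>

// Same shape as the example above, but bounded: snprintf writes at most
// sizeof(buffer) bytes, so a later, longer format string truncates
// instead of overrunning the stack buffer.
std::string myfunc(int i) {
    char buffer[80];
    std::snprintf(buffer, sizeof(buffer), "Your number=%d", i);
    return buffer;  // copied into the returned std::string
}
```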

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      sprintf is a minefield of bad. You *have* to know how to use it correctly.

      For example

      char xyz1 = 1;
      unsigned int xyz2 = 2;
      long long xyz3 = 3;
      short xyz4 = 4;
      char buffer[50];
      sprintf(buffer, "%d %d %d %d", xyz1, xyz2, xyz3, xyz4);

      That is a bad statement (especially if you are porting between platforms), with at least six different places for overruns and underruns, and one place for an incorrect signed type. Your code, btw, returns a pointer from the stack, which means it will just 'go away' and is subject to change.
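For reference, a hedged sketch of a portable version of that statement: char and short promote to int in varargs, so %d matches them, while unsigned int needs %u and long long needs %lld; snprintf bounds the write. The wrapping function is mine, added so the snippet is self-contained.

```cpp
#include <cstdio>
#include <string>

// Each argument now has a matching conversion specifier, and the
// write is bounded by sizeof(buffer).
std::string formatted() {
    char xyz1 = 1;
    unsigned int xyz2 = 2;
    long long xyz3 = 3;
    short xyz4 = 4;
    char buffer[50];
    std::snprintf(buffer, sizeof(buffer), "%d %u %lld %d",
                  xyz1, xyz2, xyz3, xyz4);
    return buffer;
}
```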

    • But sprintf is perfectly safe in cases like this:

      If we make some (currently sensible) assumptions about sizeof(int), yes, that's safe. The problem is that somewhere down the road, the newbie who maintains your code is going to change it to

      sprintf( buffer, "After long and careful consideration, our system has calculated your number=%d", i );

      Secure code is not just "code containing no flaws", it's "code structured in such a way as to guard against the introduction of flaws in further development and maintenance".
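That maintenance hazard is exactly what snprintf's return value catches: it reports the length the full output would have had, so the newbie's longer string is detected instead of silently overflowing. A sketch under the same buffer[80] assumption from the earlier example (the handling policy is mine):

```cpp
#include <cstdio>
#include <string>

std::string myfunc(int i) {
    char buffer[80];
    // snprintf returns the length the untruncated output would have had.
    int needed = std::snprintf(buffer, sizeof(buffer),
        "After long and careful consideration, our system has "
        "calculated your number=%d", i);
    if (needed < 0 || needed >= static_cast<int>(sizeof(buffer))) {
        return "<message too long>";  // handle truncation explicitly
    }
    return buffer;
}
```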

  • is that many 'best practices' are so undocumented and secretive that once you do find out about them as a security lead, it just becomes a ball of Christmas lights to figure out the other moronic and insecure crap programmers are wholly convinced is pure and righteous. For example: string sanitization is a pretty wild goose chase.
  • by SirGarlon (845873) on Friday May 03, 2013 @09:53AM (#43619503)
    I started reading the report and I quit halfway through the executive summary. This is one of those reports that says, "We documented a bunch of stuff happening. No idea why it happened, but let's speculate." I generally respect the folks at White Hat (have met several at conferences etc.) but I simply don't see the value in this report. I think they've lost track of why it's worthwhile to conduct a study in the first place. Perhaps Richard Feynman [lhup.edu] can help.
  • Probably because everyone I've ever heard use the term "best practices" has been a useless corporate suit with no clue how to craft software. The people who know their shit simply say that this part is good or this part is bloody fucking stupid.

    "Best Practices" is one of those corporate buzzwords that offers the veneer of competence that suits use as a tool when talking to their customers or their own bosses. Do your best to stay away. Those guys are poison and will assuredly steal any thunder you try and make.

    • by BVis (267028)

      Then there are the guys that equate "best practices" with "the lazy programmers trying to get more time to do something that we've already decided should take X." I consider testing/QA an example of "best practices". At a previous job, we were in a staff meeting talking about the progress of a piece of software we were writing. The bonehead VP of Marketing asked our QA guy how long he would need to finish a testing project. The QA guy answered "6 weeks". The VP then said "You have 3. QA always takes

  • You could follow password best practices (long, upper/lowercase + symbols + digits) and still have your name and address as your password, but that is not safe, no matter what the password strength checking program tells you.
  • Coding practices help you write nice, readable and maintainable code; with clean code you can work on securing it. Security in code is entirely a career in its own right: it depends on the platform, the language, the protocols and more. So the first step is to make the code clean and maintainable; then you secure it. But don't group both together; you need the abstraction layer.
  • "best" practices. Stop reading Gartner.

  • It could also just be argued that the next generation of studios for languages has more robust debugging and error checking, and even documentation on such issues. You might get a next version of Visual Studio that runs an integrity test or acid test to see what an app or website is lacking for security. The built-in unit testing modules can be used this way too... it all depends on how much time can be given to securing the apps, or to just writing the code and meeting the deadline. The most secure ISO certified code (s
