Following Best Coding Practices Doesn't Always Mean Better Security 61
wiredmikey writes "While some best practices such as software security training are effective in getting developers to write secure code, following best practices does not necessarily lead to better security, WhiteHat Security has found. Software security controls and best practices had some impact on the actual security of organizations, but not as much as one would expect, WhiteHat Security said in its Website Security Statistics Report. The report correlated vulnerability data from tens of thousands of Websites with software development lifecycle (SDLC) activity data obtained via a survey. But there is good news — as organizations introduced best practices in secure software development, the average number of serious vulnerabilities found per Website declined dramatically over the past two years. 'Organizations need to understand how different parts of the SDLC affect how vulnerabilities are introduced during software development,' said Jeremiah Grossman, co-founder and CTO of WhiteHat. Interestingly, of all the Websites tested in the study, 86 percent had at least one serious vulnerability exposed to attack every single day in 2012, and on average, resolving vulnerabilities took 193 days from the time an organization was first notified of the issue."
Re: (Score:2, Troll)
Following best architectural design practices (Score:3)
Re: (Score:2)
Re: (Score:2)
Comment removed (Score:5, Informative)
Re:Following best architectural design practices (Score:5, Insightful)
Following best practices in code will most definitely make your code more secure. But "more secure" is very relative: you can still have plenty of vulnerabilities, just fewer than before.
Best practices or no, we all make mistakes, and plenty of them. It only takes one mistake, whether a simple bug or a design flaw, to leave a hole, and the bad guys only need to find one. So yes, best practices help, but they don't mean your code is going to be Fort Knox.
Problem is, TFA doesn't support the title (Score:2)
Now, Slashdot just took its title from the submitted article, so I'm not going to pin this one on the "editors". But TFA does not appear to provide any evidence to actually support the title. It implies that security doesn't improve as much as you'd expect, which in itself contradicts the claim; but what few examples are given are really of companies IGNORING best practices.
Admittedly the submitted article is just a summary of a larger report, so it's possible the report itself provides some sort of support.
Re: (Score:1)
I agree, and the reasons for making mistakes have a lot to do with deadlines.
Best practice or not, more time spent designing, implementing and testing can avoid a lot of mistakes. More often, a lack of communication and of understanding of the code is what undermines testing (due to tight deadlines). Both the technical testers (running static analysis reports) and the QA testers (testing business cases) lacked knowledge of the system, so they had no choice but to assume some things were working as expected
"best practice" and surveys (Score:5, Insightful)
Asking software companies whether they require their developers to adhere to "best practice" won't lead to any useful numbers at all.
Or does anyone think anyone would admit to using only second-best programming standards?
Let alone the question of which programming techniques count as "best practice".
Re: (Score:2)
Or, if you are this guy [google.com] and claim "bugs are not a big deal," then you are not following best practices.
It's not clear what the author of the present study believes are best practices, but it's clear there are a lot of people who are doing things horribly wrong.
Re: (Score:3)
You completely misread his essay. Yegge did not try to claim that "bugs are not a big deal" to him. He only pointed that out as a possible viewpoint of a "liberal" programmer.
However, his thesis that programmers are either "conservatives" or "liberals" is completely wrong-headed. People are either engineers (tests, proof, knowledge) or not. Static typing doesn't enter into it. The crap he was yammering about seemed written to justify his choices and his existence, not to provide useful knowledge.
Re: (Score:3)
"Bugs are not a big deal.......Bugs are not a big deal!...... (This belief really may be the key dividing philosophy between Conservative and Liberal philosophies.).....I am a hardcore software liberal, bordering on (but not quite) being a liberal extremist."
He definitely puts himself on one side of the spectrum. As you said though, the whole essay could be renamed "Stuff I like and stuff I don't like." Because that's essentially what it is.
Rules should be treated like guidelines (Score:5, Insightful)
If secure coding could be described by a few simple rules, coders would have already been replaced by programs.
Actually, that was the result ... programs suck. (Score:2)
My take-away from the article was that training people about security issues worked, relying on application firewalls & automatic code review made things worse.
People: 1; Programs: -2 (or people 3, programs 0, depending on how you want to count).
Best practices (Score:5, Insightful)
"Best practices" is such a wonderful term; its sheer flexibility permits the person invoking it to assure his audience that he meant exactly what he said, even when he didn't say much of substance.
Now if you'll excuse me, I'm late for a game of bullshit bingo [bullshitbingo.net].
Re: (Score:2)
Re: (Score:1)
Are you sure you didn't mean Agile bingo [bullshitbingo.net]?
Gotta keep it on topic!
Re: (Score:1)
The best practice is to understand the options available and to choose the one that best fits your particular circumstances.
But when developers ask "what is best practice" they want somebody else to make a choice that they can blindly apply without thinking. This is not the "best practice".
No surprise (Score:4, Insightful)
Secure code needs a holistic view. The usual component architecture, which works pretty well for reliability and correctness but not so well for performance, fails utterly for security. What is needed is a really competent secure-software expert who can do everything from architecture down to coding and who is part of the team right from the beginning. This person must also have the means to enforce security. Unfortunately, people who can do this job are very rare, and most projects do not have that role anyway.
Re: No surprise (Score:5, Insightful)
Been there, done that, made redundant.
This was at a software house selling payment processing middleware that had to be PA-DSS compliant. Achieved compliance, role made redundant.
They clearly made a risk reward calculation and decided the benefit of securing the product was outweighed by the cost of slowing development. Particularly as everyone else's security also sucked and there was no particular liability for them if a breach occurred. It's a classic externality.
I'm also on the steering committee for an initiative trying to improve software security and resilience. They also figured out that the market was failing here, and only legislation for software liability or some other mechanism to correct the externality had any chance of improving the situation. But the cure might be worse than the disease...
Re: (Score:1)
Regulatory compliance will only lead to a clusterfuck.
You're doing it wrong. (Err... well, you're probably right that Congress would do it wrong, too.) A good regulation for handling the GP's problem wouldn't be "use exactly these security processes", it would be "you are liable for security problems in software you sell (and maybe even are required to have insurance for it)". The problem is externalities; good regulations should internalize those externalities and do nothing else.
Re: (Score:2)
The problem is that governments have become so incompetent these days that nobody knows how to create good regulations anymore.
Re: (Score:2)
Been there, done that, made redundant.
This was at a software house selling payment processing middleware that had to be PA-DSS compliant. Achieved compliance, role made redundant.
They clearly made a risk reward calculation and decided the benefit of securing the product was outweighed by the cost of slowing development. Particularly as everyone else's security also sucked and there was no particular liability for them if a breach occurred. It's a classic externality.
Indeed. No liability always means no due diligence or care.
I'm also on the steering committee for an initiative trying to improve software security and resilience. They also figured out that the market was failing here, and only legislation for software liability or some other mechanism to correct the externality had any chance of improving the situation. But the cure might be worse than the disease...
I agree there as well. They will likely regulate how it has to be done (making it useless), instead of establishing liability. In other engineering fields it is quite simple: if your people were adequately qualified and careful, and there was a competent independent review (if things are experimental) or best practices were followed (if a standard problem), then it was an accident. Otherwise, negligence is found and the company pays. The negligence may
People are the primary security hotspots (Score:3)
Since that's unlikely to change, good ol' social engineering is still going to be the primary tool in any would-be assailant's toolbox for breaking security.
Re: (Score:3)
Reminds me of a question I was asked in an interview: "What do you consider the most serious threat to the integrity of data?"
I looked at the person who asked the question and answered, "People on the inside." I then elaborated that the majority of data breaches and leaks come from inside an organization, and that once you allow someone access to that data, you have a potential security issue.
That wasn't the answer they wanted.
Just a thought (Score:2, Interesting)
Isn't that why they're called Best Practices and not Perfect Practices?
In the eye of the beholder (Score:2)
Re: (Score:3, Interesting)
Well, 'best coding practices' are all in the eye of the beholder; what one calls best practice might look awful to another. There really are no universal 'best coding practices'.
For overall coding, you're right - it's all in the eye of the beholder. For secure coding, one simple rule (which is unfortunately much harder to follow than it should be) will avoid 99% of the problems:
DON'T EXECUTE CODE WRITTEN BY YOUR USERS!
What makes it so damn hard is the temptation (if not active encouragement by your platform) to "stringly type" all your data, combined with the temptation (if not active encouragement by your platform) to build up executable code by pasting strings together, al
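Since the quoted rule is about never letting user-supplied strings become code, a hedged illustration may help: the sketch below uses SQLite's C API purely as an example (the users table and column names are made up) and contrasts pasting a user string into SQL with binding it as a parameter, so the input is only ever treated as data.

#include <sqlite3.h>
#include <string>

// BAD: the user's string becomes part of the SQL text itself.
// Input such as  x' OR '1'='1  changes the meaning of the query.
void find_user_unsafe(sqlite3* db, const std::string& name) {
    std::string sql = "SELECT id FROM users WHERE name = '" + name + "';";
    sqlite3_exec(db, sql.c_str(), nullptr, nullptr, nullptr);
}

// BETTER: the query text is fixed; the user's string is bound as a value.
void find_user_safe(sqlite3* db, const std::string& name) {
    sqlite3_stmt* stmt = nullptr;
    if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?;",
                           -1, &stmt, nullptr) == SQLITE_OK) {
        sqlite3_bind_text(stmt, 1, name.c_str(), -1, SQLITE_TRANSIENT);
        while (sqlite3_step(stmt) == SQLITE_ROW) { /* read columns here */ }
    }
    sqlite3_finalize(stmt);
}

The same principle covers shell commands, eval() in scripting languages, and HTML built by string concatenation: keep the executable template fixed and pass user input through an interface that can only interpret it as data.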
Re: (Score:2)
Re: (Score:2)
Having good engineers (Score:2, Insightful)
is more important than having managers that insist that engineers follow every guideline from the MS Press book, or whatever.
For example, one of the guidelines is always "do not use sprintf". But sprintf is perfectly safe in cases like this:
#include <cstdio>
#include <string>

std::string myfunc( int i ) {
    char buffer[80]; // ample room for "Your number=" plus the digits of any int
    sprintf( buffer, "Your number=%d", i );
    return buffer;   // copied into the returned std::string
}
So what we sometimes see is a lot of mindless replacements of perfectly good function calls with slower, more difficult-to-read counterparts, where the process of substitu
Re: (Score:3, Interesting)
sprintf is a minefield of bad. You *have* to know how to use it correctly.
For example
char xyz1 = 1;
unsigned int xyz2 = 2;
long long xyz3 = 3;
short xyz4 = 4;
char buffer[50];
sprintf(buffer, "%d %d %d %d", xyz1, xyz2, xyz3, xyz4);
That is a bad statement (especially if you are porting between platforms), with at least 6 different places for overruns and underruns and 1 place for an incorrectly signed type. Your code, btw, returns a pointer from the stack, which means it will just 'go away' and is subject to change.
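If you really must stay with the printf family, a minimal sketch of the same call with length modifiers that match the argument types, and with snprintf so the write is bounded by the buffer size (one reasonable correction among several):

#include <cstdio>

void demo() {
    char xyz1 = 1;
    unsigned int xyz2 = 2;
    long long xyz3 = 3;
    short xyz4 = 4;
    char buffer[50];
    // %hhd / %u / %lld / %hd match char / unsigned int / long long / short,
    // and snprintf can never write past the end of buffer.
    std::snprintf(buffer, sizeof buffer, "%hhd %u %lld %hd",
                  xyz1, xyz2, xyz3, xyz4);
}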
Re: (Score:2)
If we make some (currently sensible) assumptions about sizeof(int), yes, that's safe. The problem is that somewhere down the road, the newbie who maintains your code is going to change it to
sprintf( buffer, "After long and careful consideration, our system has calculated your number=%d", i );
Secure code is not just "code containing no flaws", it's "code structured in such a way as to guard against the introduction of flaws in further development and maintenance."
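Along those lines, a minimal sketch of the same function restructured so there is no fixed-size buffer for a later edit to overflow, however long the message grows (std::to_string requires C++11; this is just one way to do it):

#include <string>

std::string myfunc(int i) {
    // No char buffer to size correctly, so a longer message added
    // during maintenance cannot overrun anything.
    return "Your number=" + std::to_string(i);
}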
Re: (Score:2)
Good studies and bad studies (Score:4, Interesting)
Probably because (Score:2)
Probably because everyone I've ever heard use the term "best practices" has been a useless corporate suit with no clue how to craft software. The people who know their shit simply say that this part is good or this part is bloody fucking stupid.
"Best Practices" is one of those corporate buzzwords that offers the veneer of competence that suits use as a tool when talking to their customers or their own bosses. Do your best to stay away. Those guys are poison and will assuredly steal any thunder you try and m
Re: (Score:2)
Then there are the guys that equate "best practices" with "the lazy programmers trying to get more time to do something that we've already decided should take X." I consider testing/QA an example of "best practices". At a previous job, we were in a staff meeting talking about the progress of a piece of software we were writing. The bonehead VP of Marketing asked our QA guy how long he would need to finish a testing project. The QA guy answered "6 weeks". The VP then said "You have 3. QA always takes
Passwords (Score:2)
Practice != Security (Score:1)
Then they obviously aren't... (Score:2)
"best" practices. Stop reading Gartner.
It could also be arugued... (Score:2)
It could also just be argued that the next generation of development studios (IDEs) for these languages has more robust debugging and error checking, and even documentation on such issues. You might get the next version of Visual Studio that runs an integrity test or acid test to see what an app or website is lacking in terms of security. The built-in unit testing modules can be used this way too... it all depends on how much time can be given to securing the apps, or to just writing the code and meeting the deadline. The most secure ISO certified code (s