Defending Open Source Security 260
dpilgrim writes "DevX's A. Russell Jones has thrown down the gauntlet, questioning the security of Open Source software. I've picked up the gauntlet and
posted a response over on the O'Reilly Network. As previously
discussed on /. Jones' comments are too controversial to ignore."
Too controversial to ignore? (Score:5, Interesting)
Opinions are valued in inverse relation to the amount of money paid to produce them.
In this case, the opinion that transparency is bad for security is of so little value that it's difficult to answer it with a serious tone.
After all, Windows is remarkable for its security compared to something like OpenBSD, which is known for its secretive and opaque practices.
lol.
Obvious chance to find out... (Score:5, Interesting)
Anyone want to bet that the number of exploited Windows security holes is NOT gonna soar?
Having the source may help bad guys ... (Score:5, Interesting)
The real problem would be if only bad guys had your source code
Yeah, that would suck. That would really suck.
--
Go Debian!!!
Re:Huh? (Score:3, Interesting)
He might be right... (Score:2, Interesting)
On the other hand, if he means code that's been built openly... damn, what's better than having the software AND the source code for inspection? how do you beat that?
Proprietary code does not prevent hacked binaries. (Score:5, Interesting)
At any one of those stages, a hacked binary could've been introduced into the operating system. To modify a binary, even without access to the source code for said binary, is a trivial task for anyone with a rudimentary knowledge of assembler.
Proprietary code does not, in any way, prevent malicious code from entering the system. One of the points in the original article was that a malicious distribution could be specifically tailored for and marketed to, for instance, a government. My example above shows how a proprietary code operating system can be used in a similar way, and this time without any source code to check against.
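The point above about tampered binaries is exactly why you check a download against a digest the vendor publishes over a separate, trusted channel. A minimal sketch (function names are mine, not from any comment here):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks so
    large binaries don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_binary(path: str, published_digest: str) -> bool:
    """Return True only if the local file matches the digest the
    vendor published. A mismatch means the binary was altered
    somewhere between the vendor and you."""
    return sha256_of(path) == published_digest.lower().strip()
```

Note this only proves the binary you got is the binary the vendor shipped; with closed source there is still no way to check what the vendor shipped.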
Re:Best point is the last (Score:5, Interesting)
Feeding trolls... (Score:4, Interesting)
Bottom line for me is that FUD is FUD is FUD is FUD. There are several ways to combat it and one of them is to just let those that want to FUD away while we continue to build, create, use, and accept that OSS is a good thing for everyone. Those with small minds are scared, good. I don't want those people involved with me and it makes me actually feel good when I see that they have to resort to such lies and FUD to try to defend what they see as "the only way".
I read a comment here the other day about how someone viewed OSS OSes as the ultimate capitalist leveling field. By making not only the hardware but also the base software, the OS, open, you allow everyone to create things as they wish and without any strings. They can even make them closed source if they so wish, but the hooks, protocols, and standards are open such that you can make the software work correctly, regardless of platform.
As has been cited here many times, MS has not even given that freedom to its programmers, with its lack of API documentation in addition to its lack of standards (unless you think they alone are entitled to set them; go away then, you shill) and numerous changes in even their own file formats. (Why do MS Word docs have to change so often? Hello, forced upgrades.)
I really couldn't care less about such FUD from some lame-ass website that I personally have never visited or even heard of until reading the initial article.
"Many Eyes" never actually proven to work (Score:0, Interesting)
Linux security site abandoned
Is Linux security good enough or does no-one actually care?
http://www.techworld.com/news/index.cfm?fuseact
It seemed like a good idea at the time. Set up a website that allows users and developers alike to check which pieces of Linux code have been checked for security holes. The project, dubbed Sardonix, was a classic open source solution to a clear problem.
The scheme's originator Crispin Cowan, chief research scientist at WireX Communications, said: "Auditing is needed not just because some developers refuse to read, or follow such standards, but also because humans make mistakes and may fail to completely, or correctly, follow all rules perfectly."
Yet few became involved because, according to Cowan, there's no glory in auditing security holes.
Initially funded by the US defence research body, the Defense Advanced Research Projects Agency (DARPA), the grant behind this attempt to centralise what was, and remains, a fairly loosely structured review process dried up nine months ago.
The plan was that volunteer code auditors would be ranked according to the volume of code they examined and the number of security holes discovered. Points would be lost if holes were subsequently discovered in code passed as clean.
But, said Cowan, "I got a great deal of participation from people who had opinions on how the rankings should work, and then squat from anybody actually reviewing code."
Cowan added: "The Bugtraq model is: find a bug, win a prize - a modest amount of fame. Our model is: review a whole body of code, eventually finding no bugs, and receive a deeper level of appreciation from people who use the code. It seems the Sardonix lesson is people don't want to play this game, they want to play the Bugtraq game."
Some have commented that few people can both code and have sufficient expertise to spot buried security bugs for no reward, while others moot a lack of visibility and marketing as the reason for the site's demise.
Only 22 pieces of code are listed on the site as having been audited, 14 as unaudited.
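The ranking scheme the article describes could be sketched as a simple scoring function; the weights below are invented for illustration, since the article doesn't give Sardonix's actual formula:

```python
def auditor_score(lines_audited: int, holes_found: int,
                  holes_missed: int) -> int:
    """Hypothetical Sardonix-style ranking: reward volume of code
    reviewed and holes caught, and apply a heavy penalty for holes
    later found in code the auditor had passed as clean."""
    return lines_audited // 100 + 10 * holes_found - 50 * holes_missed
```

The heavy penalty term is the interesting design choice: it makes sloppy "all clean" audits a losing strategy, which is precisely the incentive the Bugtraq model lacks.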
Re:Laughable assertions (Score:5, Interesting)
The impression I formed from the DevX article was that it was aimed at government (and I suppose you could argue that it might influence large corporations, too).
In my experience government and corporate IT admins are *not* trusting souls. As an example, I once worked as a contractor for an agency that built software for the UK health service: everything I built was then reviewed and recompiled by in-house staff. The manager told me that they preferred open-source precisely because of the ability to review source code. Cost was only a secondary factor.
The same manager also commented that security-through-obscurity - relying on closed-source to deter evil-doers - was not an acceptable option as it placed too much reliance on third parties.
Re:Obvious chance to find out... (Score:5, Interesting)
To tell you the truth, I am not interested. Why should I look at parts of a badly structured, feature-infested, bug-infested monolith of an OS, when I can instead learn how to do it right by looking at the sources of the Linux kernel or one of the open-source BSDs? Why would I actually want to read bad code?
True, some people will actually spend the time to find vulnerabilities. Some of them (especially those in military and commercial espionage) will not publish what they find. But I suspect these people already had this kind of access before. And the usual script kiddies do not have the competence to understand the sources anyway.
One thing could happen though: Too many published and still current vulnerabilities for MS to fix. Or even worse, vulnerabilities they cannot fix because they made bad design decisions. Will be interesting to watch.
Microsoft Isn't Closed Source (as such) any more (Score:5, Interesting)
When was the last time you downloaded binaries... (Score:1, Interesting)
Follow the $ (Score:2, Interesting)
So much for objectivity.
Re:Laughable assertions (Score:4, Interesting)
Sure, you've found a patch of very trusting FreeBSD users. However, I'll bet that this one stupid Windows game is downloaded and run with full privileges, with no safety checks at all, by a hundred times as many people.
as a gov contractor... (Score:1, Interesting)
Matching Source and Binary in Commercial Distros (Score:1, Interesting)
When running commercial distros, I've never been sure that the source I have actually matches the (precompiled) binaries that the distro provides. In more than one case, I've found that patches that have obviously been applied to the kernel I'm running aren't in the source provided with the distro.
This doesn't inspire confidence.
The solution, of course, is to throw out the commercial providers altogether and compile everything from inspected source stored in a secured repository. This isn't something a small company could do, but would be eminently practical for a large enough organization or a government.
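One cheap sanity check along these lines: if the running kernel contains a vendor patch, the symbols or strings that patch adds should appear somewhere in the shipped source tree. A sketch (the marker string and paths are examples, not from any real distro):

```python
import os

def find_in_source(tree_root: str, marker: str) -> list:
    """Search a source tree's C files for a marker string, e.g. a
    symbol name a vendor patch is known to add. If the running
    binary contains the marker but the shipped source never mentions
    it, the source and binary probably don't match."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(tree_root):
        for name in filenames:
            if not name.endswith((".c", ".h")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    if marker in f.read():
                        hits.append(path)
            except OSError:
                pass
    return hits
```

This only catches gross mismatches; a real audit would rebuild from the inspected source and deploy those binaries, as the comment above suggests.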
Ad hominem (Score:3, Interesting)
The accusation of bias at the end does open source no credit; someone writing for O'Reilly could be accused of bias as easily as someone writing for DevX. Stone would have done better to leave that out, and read one of the advocacy FAQs instead. DevX itself hosts a better rebuttal [devx.com] than his.
Preventing copyright violation claims against OSS (Score:5, Interesting)
How can this be ensured, and how can it be easily shown in a court of law that this community takes copyright issues seriously? One way that I see is to set up a server that runs the comparator [catb.org] by ESR, checking any new submission to any open source project against any code released either by mistake or with malice by a closed source vendor.
This will help to identify copyright problems before they arise. Of course to have a proprietary source code base on this server would probably be illegal in itself but it is unnecessary to have the proprietary source code, all that is needed is a set of hash-keys that identify that source code.
How could this work? A copyright protection server (CPS) would hold hash-keys supplied by different vendors of software, grouped into categories, with free software projects divided into the same categories. Let's say there is a free software project that deals with image manipulation. The CPS would run a hash-key generator on the new code submission and then compare the generated keys with the keys supplied by Adobe or other companies specializing in image manipulation. Of course the closed source companies would have to run the hash-key generators on their own code and supply their keys, and someone has to tell them to do that, but if it is done right then the following would happen:
1. The Free Software community would have better protection from inappropriate code submissions.
2. This can be publicised, showing that the Free Software community takes its work seriously and goes to great lengths, more than any corporation, to make sure that its code is Free and free of inappropriate submissions.
3. In a court of law this can be very useful, it shows good faith on the part of the free software community.
4. This would make it easier to also figure out whether the closed source vendors are misusing GPLed software.
5. This makes a nice project that can be commercialized (given all the latest IP propaganda and lawsuits).
6. This hopefully will prevent many possible infringement claims.
Well, this is just a thought, but I think this kind of verification will become part of reality at some point in the future, given more lawsuits.
Any thoughts, comments, suggestions, ideas?
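One way the hash-key scheme above could work without ever storing proprietary source on the CPS: hash overlapping windows of normalized lines ("shingles") and compare digest sets, roughly the approach comparator-style tools take. Everything below is a sketch; the function names and the window size are my own choices:

```python
import hashlib

def shingle_hashes(source: str, window: int = 4) -> set:
    """Hash overlapping windows of stripped, non-blank lines. Only
    these digests would live on the hypothetical copyright-protection
    server, so the vendor's source itself never leaves the vendor."""
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    return {
        hashlib.sha256("\n".join(lines[i:i + window]).encode()).hexdigest()
        for i in range(max(1, len(lines) - window + 1))
    }

def overlap(submission: str, vendor_keys: set, window: int = 4) -> float:
    """Fraction of a submission's shingles found in a vendor's key
    set; a high score would flag the submission for human review."""
    subs = shingle_hashes(submission, window)
    return len(subs & vendor_keys) / len(subs) if subs else 0.0
```

The normalization step matters: stripping whitespace keeps trivial reformatting from defeating the match, though renaming variables still would, so this flags candidates rather than proving infringement.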
OSS development isn't driven by profit. (Score:3, Interesting)
How about a company [thescogroup.com] that's taken a new and possibly bad direction because one of the executives or a newly appointed CEO [thescogroup.com] wants to impress shareholders [sco.com] and make money for themselves?
Point being, OSS projects are typically written on a timeline based on one requirement: is the project ready for release?
It has always been my opinion that publicly traded companies are ruined by their shareholders.
Missing the point again (Score:2, Interesting)
Yes, it's true that closed, proprietary software can have malicious code introduced into it just as well as free software. But part of the original argument is that the barrier to entry to creating your own distribution of project X is extremely low, probably even close to zero (the author never said this explicitly, but I think it was implied). So while, yes, closed systems could get infected too, there is an underlying assumption that proprietary software vendors screen their employees more strictly for just such a reason. There is no screening in free software; it's basically a free-for-all.
Also, I see a lot of responses saying varying degrees of "geez, they can just verify their binaries/source trees!". Well, once again, this is the classic Linux naivete of assuming too much on the part of the user. Sure, if we're talking about highly sensitive software then there will presumably be some auditing mechanism to make sure the software is legit. However, to assume that everyone has ready access to intelligent programmers to verify all their computer purchasing decisions is rather absurd, especially in the lower levels of government.
In short, I didn't think the response was really responding to the argument at all. Of course closed software can have the same backdoors! But did the author even stop to ponder, "Hey, I wonder why he might have singled out free software as being more vulnerable? Hmmm, no reason I can think of!"
Response misses the point (Score:3, Interesting)