Is Open Source Fertile Ground for Foul Play?
jsrjsr writes "In an article on DevX.com entitled Open Source Is Fertile Ground for Foul Play, W. Russell Jones argues that open source software is bad stuff. He argues that open source software, because of its very openness, will inevitably lead to security concerns. He says that this makes adoption of open source software by governments particularly worrisome. In his words: 'An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.'"
Sounds like someone trying to be controversial... (Score:5, Insightful)
"Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public."
And of course there just CAN'T be any guard against the actual program being implemented differing from the publicly available source...
"I'm not naive enough to think that proprietary commercial operating system software doesn't have the same sort of vulnerability, but the barriers to implementing them are much higher, because the source is better protected."
And when those holes are discovered, they aren't published at all. And the proprietary owner has a far more difficult time finding these existing holes themselves. And most of all, there's NOTHING STOPPING THE PROPRIETARY OWNER from implementing this same type of worst-case scenario the author of this piece describes, and an even smaller chance of discovery by outsiders. Sheesh.
Russell seems a bit dated (Score:5, Insightful)
Seems like W. Russell Jones is trying to apply 1900-era economics to a collaborative, abstract, not-truly-market-driven, positive-feedback context.
There might be security concerns with Open Source (he, most interestingly, doesn't go into security concerns with closed source or compare track-records); however, Russell is trying to pull a fast one as this is a different (and, I'd argue, wrongful) criticism of OS.
RD
hrm... (Score:3, Insightful)
What a sellout (Score:5, Insightful)
"Anyone who cares to join" (Score:5, Insightful)
Bosh. Open source project leaders - especially the leaders of popular projects - don't let just anyone have write access. Also, commits almost always go to a mailing list to be reviewed by the other committers and lurkers.
And of course, there's no way a commercial product could be infiltrated by someone who wants to inject harmful code. Impossible!
Microsoft irony is not lost (Score:5, Insightful)
Closed source is fertile ground for foul play (Score:5, Insightful)
Fear Outlook Express for Linux... (Score:5, Insightful)
However, with that, some of the inherent security of Linux fails. Imagine an e-mail client that will execute a binary attachment with no questions asked because the user double-clicked on the pretty icon. That's how MyDoom spread on Windows, and basically, it's the fact that the current Linux setup makes it hard to execute something new that makes people think about what they have before they run it...
As soon as we have pretty looking greeting card executables that run on Linux, the downfall will be what comes next...
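The "hard to execute something new" point above is the Unix execute bit: a freshly saved attachment is just data until someone deliberately marks it runnable. A minimal sketch of that check (hypothetical helper, not from any actual mail client):

```python
import os
import stat

def safe_to_launch(path: str) -> bool:
    """Refuse to run a file unless its owner-execute bit is already set.

    On Linux, a freshly saved attachment typically gets mode 0644, so
    this check fails until the user consciously runs `chmod +x` on it.
    """
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IXUSR)
```

A "pretty greeting card" client that launched attachments on double-click would be skipping exactly this step.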
Um, yeah (Score:5, Insightful)
I mean, there is a whole friggin' lot of open source out there; there's bound to be a few examples of the problem, right? Right???
My God! (Score:5, Insightful)
Open source advocates rightfully maintain that the sheer number of eyes looking at the source tends to rapidly find and repair problems as well as inefficiencies--and that those same eyes would find and repair maliciously inserted code as well. Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public.
I mean, this can't actually be an argument that closed development by a "core group" that "won't make the corrupted version public" is more trustworthy than open development where anyone can see the code. Right? Right?
Who's paying DevX to write this shit? (Score:5, Insightful)
During a week when Microsoft admits it sat on the worst flaw ever for 6 months, and MyDoom and friends are rampaging around, it's shameful to see an article written with so much fear and so little substance. He even manages to say that OSS might be used by terrorists against the US (although he doesn't use the word).
An absolutely disgusting piece of "journalism".
John.
Not as much of a difference. (Score:4, Insightful)
In open source software, the maintainers vet patches by peer review before admitting them into the main product line. Likewise, closed source products are peer reviewed, but by a much smaller team, who probably have much more similar agendas than people flung across the globe. Either could be compromised. This exact same article could have been entitled "Software Is Fertile Ground for Foul Play". The concern that backdoors exist is the reason Asian countries have been suspicious of Microsoft's closed source software. To assuage those fears, Microsoft provided the source code for review. If this review is successful in showing that no backdoors exist (and I have no idea how they can tell that some unobtrusive code isn't deliberately flawed) then surely open source can be equally reviewed, if not suffer a more stringent review by opening the question to the open source community within the country in question.
The security that closed source promises by "protecting the source" is security through a promise by a potentially hostile vendor. The security open source promises is the vigilance of those who review the code. I don't see how one is better than the other, but I surely don't see how closed source is going to make a potential target feel better than if they could review the source.
Beware the Luddites! (Score:5, Insightful)
One, Microsoft software, the most popular "closed source" software in the world, is rife with security holes. Meanwhile, the most popular (arguably) open-source software in the world, Apache, doesn't strike me as being terribly buggy *or* full of security holes. For instance, I don't have to update my Apache software once a week.
Two, often for popular open-source products there is plenty of free and timely support. Advantage is also to the qualified technophile, who can support his or her own software, and not rely on the timetables of vendors.
Three, accountability. What has Microsoft *ever* been accountable for? Viruses? Bugs? Data loss?
It's like Fred Moody all over again (Score:5, Insightful)
Can we please stop letting people use slashdot to increase the hit rate on their articles in order to make themselves seem relevant to their bosses?
Fred Moody, the infamous anti-Linux ABC News columnist, was doing the exact same thing [linuxtoday.com] four years ago. In fact, he was writing on pretty much the same subject: that Open Source is insecure and untrustworthy by its very nature.
Those who do not study history are doomed to repost it.
Re:Sounds like someone trying to be controversial. (Score:5, Insightful)
Re:Sounds like someone trying to be controversial. (Score:5, Insightful)
Who needs proof when you have FUD? See also SCO.
Flight simulator in Excel (Score:2, Insightful)
If closed source is so safe, how could this have happened?
Further, if that happened, how do you know that other, more dangerous items haven't also been included in Windows products?
I can poke some big holes in this argument... (Score:4, Insightful)
*Deletes 40 zillionth MyDoom attachment in his inbox*... and I suppose other operating systems are more secure? What exactly are you suggesting we do about the lack of security in today's OSes? Linux, Windows, even Unix have all had identified security flaws in their time...
What can we trust in code? You mention it right there, Mr. Author: we can trust the latest and greatest stable Linux kernels, but if you install a test kernel, or some hobbyist lil' app from the remote corners of the open source world, on a production server, you get what you deserve. Incidentally, the same goes for Windows; WinXP's latest service pack is definitely more secure than any test versions of their OSes, or even the initial RTM builds of their operating systems. What gets deployed in a production environment... well, duh...
The author says:
[Snip] Worse though, I don't think that security testing can be made robust enough to protect against someone injecting dangerous code into the software from the inside--and inside, for open source, means anyone who cares to join the project or create their own distribution.
I suppose we trust Microsoft, SCO and IBM more? Puh-leez, if you need a totally secure OS, you're best off hiring your own programmers and starting from scratch, and hoping they're as secure as anyone else, oh wait can't trust them either...never mind just build an OS yourself then...
Ok I'm done ranting, everyone else's turn
Re:Sounds like someone trying to be controversial. (Score:5, Insightful)
The author completely ignores the storied history of exactly this kind of thing in closed source software -- only these backdoors are called 'features' or 'easter eggs.'
We need a new term for this kind of journalistic troll.
-- Cheers,
-- RLJ
Re:Russell seems a bit dated (Score:5, Insightful)
Let's also apply his adage to his opinions (Score:1, Insightful)
His points are valid (Score:5, Insightful)
Open source development is not truly open to everybody; it is normally open to everyone whom you allow to contribute code to your project. They've normally proved themselves by offering bug fixes and minor changes directly to you beforehand.
The barriers to inserting malicious code in closed source are lower, not higher. Many an engineer has inserted a backdoor in his code which he surreptitiously used to help customers who lose passwords or setup info. However, a backdoor is just another way for a cracker to break into the system. Also, bored engineers often leave Easter eggs in their closed source, something hard to do when several thousand people may review your code to see what makes it tick. In mainstream projects like the Linux kernel, the bar to being allowed to contribute code is quite high, and your initial attempts are likely to be looked on with scorn by other project members.
As for costing huge amounts of money, one wonders what MyDoom has been costing owners of that wonderful example of closed source software - Windows.
Re:Ahhh.. (Score:1, Insightful)
Re:What a sellout (Score:3, Insightful)
I want whoever controls my nuclear arsenal to have the source code for the software they use, and the expertise to fix it themselves. I'm almost certain that the military and organizations like NASA get the source to the software they use. And then the question becomes: how is that not open source?
Quis custodiet ipsos custodes? (Score:3, Insightful)
Where exactly is the logic in this? In the open source world, at least there are "watchers", and you have the ability to "watch" yourself, or at least pay someone to review the code for you if you don't have the ability. This isn't the case with almost all commercial software. This reeks of FUD and is poorly written.
You get what you pay for (Score:3, Insightful)
Just because Microsoft gouges you $X to do that copying doesn't mean that the bits are of any greater quality; Microsoft has poured loads of cash into developing its products, and the Free Software / Open Source folks have poured loads of volunteer time (and sometimes, cash) into developing their software. You might look at the amount of effort that has gone into creating each, and then try to apply the get-what-you-pay-for adage to that, but applying it to the price of the box on the shelf is ludicrous.
A. Russell Jones' Background (Score:5, Insightful)
Visual Basic book, ASP.NET-in-C# book... looks like Mr. Jones is up to his ears in non-open-source work. I hate having someone who has no background in something condemning it.
It's like an ASP developer condemning Java before even coding a lick of it.
Almost speechless. (Score:3, Insightful)
Having read the full article, I have to say that this is one of the most annoying pieces of writing I've read in quite a while. The author of this paper is assuming some naive elitist position in a fantasy world where corporate interests can never be anti-government and where code produced by the masses is somehow 'dangerous' because it might be exploitable.
As several other comments have pointed out, there is absolutely nothing to the "foul play" argument presented in this article that could not also apply to a closed-source project. In my opinion, the major difference is that the closed-source project's flaws [and note that in this article the author is talking about deliberately introduced flaws - basically the idea that OSS projects might be converted into trojan horses], if they exist, might never be discovered at all. If I buy a copy of Windows, I have absolutely no clue whether or not any such flaws exist, but more importantly, I have no way to check because I cannot examine the source code. At least with open source software, if I suspect misuse or even if I'm only paranoid, I can examine the source code myself or have someone knowledgeable [whom I trust] do it for me.
Overall, this seems to be a pretty blind and poorly thought-out attack. A pity that editors aren't more carefully edited. :-P
Oh really? (Score:5, Insightful)
That said, the best setup for the government is to use 3-4 platforms in each agency: MacOS X on the average desktop, Linux on many of the servers, Windows on some print and file servers, maybe some Sun boxes for intense science work. How many times does it have to be said that a heterogeneous network is harder to take down before people stop writing this shit?
As for the argument that Windows only gets hit more because of popularity... I want to wring the neck of every person I hear saying that. It's a disgusting application of post-modernist logic to computers. It's the IT variation of the post-modern attitude that there are no moral absolutes, only relative standards that vary by cultural and personal views. It's a complete rejection of the concept that two systems can be designed such that one is inherently insecure because of its architecture and one is very secure by its design.
Secrecy != Security (Score:2, Insightful)
Re:Sounds like someone trying to be controversial. (Score:4, Insightful)
Think about that outside the zealotry mode for a minute. I don't recall any follow-up determining, "Hey, this happened X_TIME ago, therefore clean programs should be reinstalled on your machine." Now I support the entire Open Source movement by all means, but think about how many include files, or other files, could have been tweaked. Say low-level include files, or something similar. There is no one, and I say this COMFORTABLY, no one who checks every program, every line of code on their machine. Sure, you could lsof|grep -i listen here and there to see what's what, but a covert channel can hide that. Look, I don't want to get into a sysadmin/secadmin shootout here; it'd be a draw, and I don't care who you are, but... in my eyes, there is still a long way to go.
Take a look at CPAN and some of the modules you have on your machine. How many are updated with any regularity? What about the whole SourceForge/Freshmeat concept of 'sysadmining', where you find a neat program supported for what... a year? Maybe two, if you're lucky... Sometimes it seems the cooler Open Source gets, the more issues come out with it...
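The lsof|grep -i listen spot-check mentioned above amounts to asking whether anything accepts connections on a given port. A rough Python equivalent of that check (and, as noted, a covert channel can still evade it):

```python
import socket

def port_has_listener(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if something accepts TCP connections on host:port.

    This only sees ordinary listening sockets; a backdoor using a
    covert channel (ICMP tunneling, port knocking, etc.) won't show
    up here, which is exactly the parent comment's point.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```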
Every step you take... someone is watching you [politrix.org]
Proprietary vs Open Source (Score:5, Insightful)
What bothers me most about these typical "OS vs Proprietary" flamewars-in-waiting is when writers compare specific applications with some nebulous "Open Source" concept. You've all seen reviews that go something like this:
Open Source programs have serious problems. For example, I downloaded an Open Source command-line HTML-parser written by an undergraduate student. After feeding it random non-HTML files, the program crashed roughly half the time. By contrast, I evaluated the latest copy of Adobe Photoshop for Windows. Photoshop easily helped me modify my vacation photos, without a single glitch. Clearly, Proprietary applications are better suited for the market.
Most of the time, these writers compare all open source programs -- many of which are hobby projects -- to individual, highly-polished applications. Hardly fair and unbiased.
(now goes off to read the article)
No evidence (Score:5, Insightful)
It's interesting how he provides absolutely no evidence to support his claims. Obviously, nobody could take his stance and try to argue evidence, or else they would run into piles of evidence suggesting the exact opposite. This is sheer uninformed speculation. A couple choice quotes:
Same way people would know if someone was running a heroin production lab in the middle of Times Square. Open means open. If people create software designed to subvert security, they make closed software. Exhibit A: Gator/GAIN.
Anyone who wants to. Clearly this person has no idea how Free/Open-Source software works at all.
Finally, I get to be pro-OSS! :) (Score:2, Insightful)
Maybe his article should be re-written to say "prosecuting fraud in the OSS world is likely to be more difficult for Governments than if they have a big fat company to hammer..."
LOL, his arguments are ridiculously easy to deconstruct. Not even worthy of an attempt, especially since his article is entirely based upon opinion (stupidly faulty, at that).
You do get what you pay for... (Score:2, Insightful)
Counterargument Case Study: Diebold (Score:3, Insightful)
"How would anyone know?" (Score:5, Insightful)
Because anyone can create and market--or give away--a Linux distribution, there's also a reasonably high risk that someone will create a distribution specifically intended to subvert security. And how would anyone know?
Oh, I don't know... maybe by looking at the source code?
Turn it around now: Suppose a private company sold software with malicious code included to subvert security. How would anyone outside the company know?
TheFrood
Re:What a sellout (Score:4, Insightful)
Or your elections? [blackboxvoting.com]
Re:Who's paying DevX to write this shit? (Score:3, Insightful)
This article sounds so very 1998ish, when the FUD machines were pumping at full speed.
It seems these days the thing most nearly approaching FUD out of MS is statistics. You know - those banner ads stating that Windows is 11-22% cheaper to operate than Linux.
Re:Sounds like someone trying to be controversial. (Score:2, Insightful)
They would have to release it publicly. Releasing a program's source under the GPL, then not releasing the next version under the same license, cannot be done AFAIK.
A derivative (i.e., Version 2) would automagically fall under the GPL and would have to be released.
This isn't journalism. It's ignorance and/or stupidity.
Email to article author & site editors (Score:4, Insightful)
Dear Mr. Russell Jones,
In your article you make a number of interesting points, which I shall attempt to cover in order:
1. An open source product will eventually contain a maliciously inserted security breach.
On what grounds do you base this statement? How can you be certain that Microsoft haven't been paid by the CIA to place backdoors in Windows? Why, then, should any government which isn't in on such secrets trust Windows? How could a government be certain that it knew all such secrets?
2. The core project code could be compromised.
Quite true. However, there have been instances in the past where Microsoft's code has been compromised even when sitting on Microsoft's servers:
http://www.theregister.co.uk/content/4/14265.html
3. A distribution will be built with security holes for the express purpose of selling to governments.
How do you know this hasn't already happened with Windows? You speculate much, but back up little. What kind of advertising budget would such a hacker require for gaining government mindshare?
4. Insiders could "customise" a well-respected secure distribution.
They already can. It's called "leaving accounts on the system". Or "logic bombs". Or "misconfigured systems". This problem has existed for almost as long as computers have.
5. Finally, you speculate that nobody is "watching the watchers". What, however, you appear to have misunderstood is that the government organisation would have a full copy of the source code and could compile it themselves to confirm the resulting program is identical to the shipped version. They could then audit the source code - either in-house or pay an outside organisation.
It is quite correct to state that "you have to put your trust in someone - who should you trust?". Otherwise the country would have to be run on every level entirely by one person, who would be responsible for writing, implementing and enforcing law. I'm not from the US but I'm sure your President would get tired of writing out all those speeding tickets!
I would argue "you should trust someone who can prove they have nothing to hide".
Open Source has nothing to hide. Come into the light.
Re:Um, yeah (Score:3, Insightful)
Alcatel Omniswitch AOS (prop)
Borland Interbase (open source)
Microsoft RPC Interface (prop)
Microsoft IE exploits (prop)
Sendmail 8.12.6 trojan (open source)
So it looks like there is some truth to the article. I would also count Microsoft Word and Excel macros as a commonly exploited backdoor.
Re:My God! (Score:3, Insightful)
Yah... wouldn't source code that's not public be... closed source?
So he's claiming that open source is dangerous because it could become closed source. And closed source is better, because it's more protected against... uh... wait.
Brilliant! What a moron.
Re:Sounds like someone trying to be controversial. (Score:2, Insightful)
Factoid (looks roughly like a fact might).
It is good advice! Really. (Score:3, Insightful)
What, does this guy think some government is going to trust its infrastructure to some home-grown distro that they downloaded off the 'net for free? Please.
Re:Sounds like someone trying to be controversial. (Score:3, Insightful)
So? If they don't get publicity, they're not worth fixing?
Re:Fear Outlook Express for Linux... (Score:2, Insightful)
Because it's an easier argument to make, sure.
But it's a logical fallacy, arguing from the specific to the general: Linux is more scrutinized and secure than Windows, therefore all Open Source must be.
I see OSS being a better model for large, high profile projects, like Linux or OpenOffice.
But SourceForge is chock full of little do-nothing apps that nobody gives a rat's ass about. Who knows what kind of goofy code has been buried in one of those billions of throwaway weekend projects? No one is auditing that stuff.
Someone could one day stumble across a little app and say "hey cool, an app to rename all my mp3 files!", and find out later that it repartitioned his hard drive, raped his hamster, and left the toilet seat up. Either by accident or by design.
There is, however, a reasonable assumption that if you pay 10 bucks for a box on the shelf at Best Buy, such bad things won't happen, and if they did, you have someone to hold accountable for it.
An argument that didn't make sense... (Score:3, Insightful)
------------
Huh?
Microsoft isn't open, last I checked. Hackers don't seem to have any problem causing havoc with a 'closed source' product.
------------
He says that this makes adoption of open source software by governments particularly worrisome. In his words: 'An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.'
-------------
Ok, I give. You get what you pay for? I've heard this for many years. I don't see my fast food burgers quite as large as the pictures, nor do I see other items I pay for performing as advertised (cite Microsoft again). Not to bash these guys, but think about it. How often have my IE browser links been hijacked to some other site, or a virus/worm trashed my up-to-date and patched system?
Microsoft has done great things for the industry; however, closed source isn't any more secure, apparently.
Re:Interesting article... (Score:3, Insightful)
What a crock of an "article" that is. It's a group of posts on an OpenBSD mailing list. There is no response to the particular posting made (which, btw, is here [monkey.org], two levels down from what the poster linked to) because the mailing list maintainers shut down the thread as off-topic (appropriately). There are some funny, and valid, points raised by the article you linked to, but "GCC is destructive" isn't one of them.
There are still numerous other C/C++ compilers available. Yes, gcc comes with most distros. So? You can install a different one easily enough. And there are several available -- Intel, Watcom, Borland, etc. Some are free, some are not. Most outperform gcc in various areas, sometimes in all areas. And, contrary to the post, there is still choice of compilers on Unix -- generally you can choose either the vendor's own compiler or gcc. Which is a vast improvement over the old situation -- you got to use the vendor's compiler. Which usually sucked (they've improved greatly, but we use g++ here because xlC v5 does an amazingly bad job at handling templates).
Yes, some embedded platforms only have gcc available now. Why? Because it's cheaper than rolling your own... it used to be that you had to purchase a compiler for an embedded platform. While this was an additional revenue stream for the company, the cost of building your own compiler, keeping it bug free, updating it to match emerging standards, and providing support vastly outweighed the revenue coming in. Sure, you still have to submit the platform specific code to the gcc-devel group, but it's a lot less work than writing your own. And, of course, gcc provides far better code (stability, speed, and size) than most of the custom compilers.
Can it be said that Mozilla has in effect done an "Internet explorer" with the open source world?
No. There's still Konqueror and Safari (same codebase), there's Opera (commercial and closed), and several others. Don't like Mozilla? Pick another one.
The reality is, open source only destroys the market for other tools when the other tools are inferior. It may be that, eventually, the open source software is superior in every meaningful way and the other tools slide off into obsolescence. At that point you've reached the commoditization point for that group of software... it's unsurprising that the cheapest solution wins. It happens in every other market, after all.
Re:Sounds like someone trying to be controversial. (Score:4, Insightful)
Although it doesn't quite fit since this is technically a commentary or opinion piece, in which case, "ignorant fool," would suffice.
Re:Sounds like someone trying to be controversial. (Score:3, Insightful)
The only problem would be if they accepted patches, and the patches are GPLed themselves. The "core group" has to follow the license of anyone who has rights on the code they distribute, i.e. they'd have to get rid of the contribution or comply with its license.
Absurd. (Score:1, Insightful)
There have been software packages that have had backdoors in them for a decade and these were not found until someone open sourced the code.
CERT(R) Advisory CA-2001-01 Interbase Server Contains Compiled-in Back Door Account [cert.org]
Even Microsoft code has been found to have back doors in it:
Netscape Engineers are Weenies [winplanet.com]
Yes, there will be mistakes made. Security is a process, not a state. The biggest mistake would be for a company to assume that software is secure just because it is open source. No, just being open source doesn't sprinkle magic pixie dust on your product, but it does let you get the sources from the vendor, have another firm or your own in-house programmers audit the code to ensure that it is backdoor-free and relatively clean, and then build the code yourself.
Before writing open source software I recommend all programmers read the following:
Secure Programming for Linux and Unix HOWTO [dwheeler.com]
This document covers everything the article covered and a lot more.
As a last note: open source software is to computer programming as the scientific method is to science. It is a peer review process that slowly results in better and better software over time. Closed source software is like the alchemy of old. In just 20 years the open source programmers have built entire platforms that can challenge anything that the proprietary programmers can develop. Where will we be in another 20 years? In 100 years? In 1000 years?
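The Interbase hole (CERT CA-2001-01, linked above) was literally a hard-coded username/password pair, the kind of thing that can hide for years in a shipped binary but is impossible to miss once the source is public. A toy illustration (hypothetical code, not the actual Interbase source):

```python
def authenticate(user: str, password: str, accounts: dict) -> bool:
    """Check credentials against an account table.

    The hard-coded pair below mimics the Interbase backdoor
    ("politically"/"correct"): in a public code review it leaps off
    the page, yet in binary-only form it went unnoticed for years.
    """
    if user == "politically" and password == "correct":  # backdoor!
        return True
    return accounts.get(user) == password
```

The point isn't that open source prevents such code from being written, only that it is far harder for it to survive an audit.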
Re:Sounds like someone trying to be controversial. (Score:5, Insightful)
They're called .md5s. Use them. They exist for a reason. You'd have to have some godawful cooperation between some very mean people to successfully pull off a corruption on widely deployed OSS software AND not throw red flags up among people who have clean versions and clean md5 hashes.
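Checking a download against its published .md5 is a few lines' worth of hashing. A minimal sketch (note that MD5 is considered weak against deliberate collisions, so projects now publish SHA-256 sums; the `algo` parameter covers both):

```python
import hashlib

def verify_checksum(path: str, expected_hex: str, algo: str = "md5") -> bool:
    """Hash the file in chunks and compare against the published digest."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex.strip().lower()
```

A mirror that silently swapped in a corrupted tarball would fail this check against the project's own published hash, which is the "red flags" mechanism described above.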
And what's your point on stagnant OSS projects? I don't see Microsoft supporting Win3.1 anymore, but there are a lot of people still using that. The difference is that NOBODY can go through it and fix it up or make anything of it. If someone decides to pick up the pieces of an abandoned piece of OSS that shows promise, they can do that.
I hate when people do this. You didn't raise any issues that aren't a problem with ALL software, yet you are applying them specifically to OSS. If a server gets owned, it gets owned. It doesn't matter if it's commercial/proprietary, commercial/OSS, or whatever. It's owned. Binaries can still be injected with malicious code. They're owned. Give it up. There's no inherent flaw in OSS.
Re:Sounds like someone trying to be controversial. (Score:5, Insightful)
And while I'm not a free or open source fanatic, I have to say that I can't marshal any rational arguments that the commercial program is somehow safer from authorial corruption. It's virtually inconceivable that a large-scale open-source program could have a backdoor or anything like that in it for any significant amount of time, and as for smaller projects, a one-man open source project may be just as likely to be corrupted as the one-man closed source product, but which is more likely to be detected before significant damage is done? The one with the source you can look at, hands down. (And the phrase "just as likely" is for rhetorical purposes; in the real world, the prospect of revealing the source surely impedes anybody who would put something nasty in there! That's way too accountable for the taste of someone like that!)
No system can be made perfectly safe. But to claim that commercial software is safer from deliberate authorial corruption takes willful and deliberate ignorance. I mean, seriously: claiming that the software I can't see, that I'm not allowed to see, is more likely to be pure than the stuff anybody (or anybody I hire) can look at? That flies in the face of both logic and common sense, and is the kind of claim that has to be inflated into a long article to blind the reader with words before it can even come close to being seriously entertained; a paragraph summary doesn't pass the laugh test.
And remember, it's not only "Will it happen?", but "Which will do more damage?" Even when break-ins happen in Open Source, the damage is typically swiftly controlled; people's reputations are on the line! Who even knows how much closed-source damage has been caused from breakins? Again, people's reputations are on the line, and the incentives to cover such things up are high.
I just don't see a way, even in theory, where commercial software is safer against this sort of attack.
Re:Russell seems a bit dated (Score:2, Insightful)
Free or low cost? (Score:2, Insightful)
Re:Russell seems a bit dated (Score:3, Insightful)
Re:I can poke some big holes in this argument... (Score:3, Insightful)
The fact is that all OSes are vulnerable to the same types of attacks. It has nothing to do with open vs. closed source and everything to do with bad programming. Sure, it's easy for a hacker to poke through open source code and look for unchecked buffers to launch attacks at, but then again a white hat could just as easily pick that up and fix it. With closed source software, while it may be trickier to figure out where the unchecked buffers are, there are going to be fewer good guys looking for them.
The real problem is that we test software to make sure it does what it's supposed to do, while hackers look for where the software does what it's not supposed to do. That's why the hackers are always one step ahead: we're looking at the problem from the wrong perspective.
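To make the intended-vs.-unintended-behavior point concrete, here is a toy sketch in Python (a simulated stand-in for a C-style unchecked buffer; the record format, buffer size, and values are all invented for illustration):

```python
def parse_record(memory, data):
    # A record is one length byte followed by that many payload bytes.
    # The naive parser trusts the attacker-supplied length byte and
    # copies into an 8-slot buffer with no bounds check.
    n = data[0]
    for i in range(n):
        memory[i] = data[1 + i]   # writes past slot 7 whenever n > 8

memory = [0] * 8 + [42]   # 8 buffer slots plus one "adjacent" value

# A normal test only exercises valid input, so the bug never shows:
parse_record(memory, bytes([4, 1, 2, 3, 4]))
assert memory[:4] == [1, 2, 3, 4]

# An attacker instead asks what happens with a length no tester tried:
parse_record(memory, bytes([9, 0, 0, 0, 0, 0, 0, 0, 0, 99]))
print(memory[8])   # the adjacent value has been overwritten: prints 99
```

Conventional QA stops after the first call; the second call is exactly the input a hacker tries first, which is the perspective gap described above.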
Re:Everyone run out and by XP Home, like me! (Score:0, Insightful)
Re:Sounds like someone trying to by controversial. (Score:5, Insightful)
Open source - starts off, lots of exploits because the code is readily available. People using the package (assuming it's valuable enough to merit it) fix problems, submit patches. Over time, the software becomes more secure.
Closed source - Exploits harder to find, eventually found due to sheer perseverance of legions of script kiddies and their slightly more talented brethren. Company denies existence of problem, patches discreetly and only occasionally, eventually begins to become marginalized due to shoddy business practices, begins suing everyone in sight in a sad attempt to revive an obviously dying business. Meanwhile, Bill Gates rolls over in his sleep, makes another fifteen million dollars.
(Or maybe I've just had too much coffee today, and am being silly. Time will tell.)
Re:Sounds like someone trying to by controversial. (Score:5, Insightful)
A small and ever-decreasing percentage of users compile their own binaries, let alone check the result. Also, not all of the exploits appear only in the binary; in at least one case the malefactors added a fairly hard-to-notice security hole to the CVS source, so the "official" binaries and checksums matched just fine.
Jeezus, talk about ignorant (Score:2, Insightful)
Does A. Russell Jones know anything about security??? It doesn't appear so from this article. This reads like something written by some uninformed CNN reporter from 1989. Did this guy do any investigation before spewing forth such ignorant drivel???
Governments "get what they pay for"? Are you kidding me? Governments typically pay FAR MORE for FAR LESS than any other organizations on the planet! Mainly due to incompetent employees paid by time of service rather than actual performance.
"sooner or later, governments that rely on free open source software will put their country's and their citizens' data in harm's way." Yeah, so let's stick with the far more secure options of MS-Windows, etc...
"Instead, the security breach will be placed into the open source software from inside, by someone working on the project." Yeah, because there has never been an instance of a paid employee/developer inserting an Easter egg, back door, or other malicious code.
"As anyone can create and market a distribution, it's not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart." I know my government is mostly stupid and ignorant, but I doubt "Joe's garageware jonix distribution" would make it through the laborious bidding process.
"the widespread perception that Linux is more secure than Windows, despite the fact that both products are riddled with software security holes." Agreed. The difference is, we can actually learn about the presence of open-source holes MUCH faster than closed source. (See recent /.ed article [eeye.com]!)
"Can Self-Policing Work?" Of course not! And that's exactly what closed-source is: self-policing! Open-source is open policing and scrutinizing by virtually anyone and everyone. Hmmmm... Should I rely on the QA/security efforts of a 10-20 person team who better play good politics to keep their jobs and/or get raises? OR, Should I consider the QA/security efforts of 100's of thousands of unapologetic experts?
This article is that which promotes growth... (Score:3, Insightful)
Fertilizer. Nothing but fertilizer.
The author's point seems to be that because Open Source software allows anyone to contribute code, the chance for an "agent provocateur" to insert malicious code into a project is large, and that the use of such code by governments could result in significant security risks.
Let's forget for a moment that the author doesn't cite even a single instance of this actually occurring.
The real question is: is this any less likely in systems which are developed in the closed source/commercial world? Does the author believe that potential info-terrorists can't work to place themselves into companies where they might be able to achieve similar ends? It might be more difficult, but once achieved the chance of detection would seem to be significantly lower, since only a very select few get to view the source code in question, and they aren't necessarily motivated by security concerns (they are concerned with pushing their software out the door for sale).
Ask yourself this question: are companies like Microsoft more responsive to security bug alerts, or are open source projects like Linux?
The author also writes:
Again, a similar question should be asked: isn't this a similar problem for closed source/commercial development, where it might be in the best interest of the company to either ignore or cover up significant security breaches, and where the causes of such breaches are hidden from the eyes of those qualified to perform security audits? The author asks the question "Who is watching the watchers?" The answer is simple: everyone is. Or at least everyone can, which is perhaps the best that can be done.
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
As has been shown repeatedly, if you have a few guys writing closed code, they can put in pretty much whatever they want. Malicious intent can only be gleaned through a black-box analysis. The problems become even greater where many people are working on code. Often companies will not pay for full code reviews, only broad regression tests by a third party, generally QA. Few companies will check for features that are not supposed to exist. Even if the company knows exactly what the software is doing, which is in fact never true, the user still has little assurance that the company is disclosing all features.
So, OSS is still no worse off. Even if there is no formal code review of new submissions, interested parties can do informal code reviews. Black-box analysis can still be done, but now offending code can be identified. Best of all, if you so choose, you can remove the troublesome feature and continue to use the rest of the functionality.
The stuff we download off the net, whether closed or open source, is always risky. We are assuming the coders are good guys. OSS is probably a little more trustworthy because there is no hiding behind technicalities. OSS is saying yes to all information requests, not cowardly hiding behind a policy of secrecy.
Closed source can be just as bad. (Score:5, Insightful)
None whatsoever.
Remember those old ATI drivers that ran special "optimizations" when used with the quake3a binary? They were closed source and geared to misrepresent the performance of their card to the community. I suspect that if those drivers were open source that little trick wouldn't have gone unnoticed for long.
I'm not advocating open source as the be-all and end-all of things, because it isn't. However, you're an idiot if you think that paying for something means that it's safe.
For God's sake, look at IE.
Re:Sounds like someone trying to by controversial. (Score:1, Insightful)
And I happen to recall a call going out saying 'We know we were 0wn3d on this date. Who has MD5 sums from before this date?'
Honestly, you haven't added anything to this discussion. The concept of 'tainted source' is not new to the Open Source community. In terms of submission of patches, people actually *read* the code in question before adding it to existing code repositories. In terms of breakins, the code in question is assumed to be corrupted, only being certified 'clean' after it can be MD5'ed against a pre-breakin archive.
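The "MD5 against a pre-break-in archive" procedure described above can be sketched like this (a minimal illustration with invented filenames and contents; a real check would hash files on disk against baseline sums kept somewhere the intruder could not touch):

```python
import hashlib

def md5sum(blob):
    # Hex digest of a file's contents (bytes in, hex string out).
    return hashlib.md5(blob).hexdigest()

# Sums recorded *before* the break-in, from an offline archive:
baseline = {"util.c": md5sum(b"int add(int a, int b){return a+b;}\n")}

# Repository contents found *after* the incident:
current = {"util.c": b"int add(int a, int b){evil();return a+b;}\n"}

# Anything whose current sum differs from the baseline is assumed
# corrupted until proven otherwise:
tainted = [name for name, blob in current.items()
           if md5sum(blob) != baseline[name]]
print(tainted)   # -> ['util.c']
```

The key design point is the one the post makes: the baseline sums are only worth anything if they predate the break-in and were stored out of the intruder's reach.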
Yes, you are right that no one reads every line of every program they compile/install. However, that's not the issue at hand here, because no one reads every line of every program they buy off the shelf. After all, with closed-source systems, you don't even have the option.
Also, abandonware is another moot point. If you don't want your favorite project abandoned, contribute to it. Again, this isn't even something you have a choice about in the closed-source world.
Yeesh.
Old argument. I first heard it in 1995. (Score:1, Insightful)
The funny thing is that I heard this argument back in 1995 from a guy who thought 555.555.555.555 would be a valid IP address. His argument was exactly the same: "you get what you pay for". But it's a self-defeating argument, since it would make you choose AIX, HP/UX, and Oracle over much less expensive Microsoft products.
Irony of ironies, Microsoft products may be the only good example of getting just what you paid for. And sometimes even less.
Besides, Microsoft advocates should not attempt to polarize the argument, since Microsoft is the guy in the middle when it comes to price.
In short - I'm the ignorant executive editor (Score:3, Insightful)
Photoshop, HP, etc., with their hidden currency-counterfeiting code, vs. the GIMP.
Trust that Microsoft won't embed heavily encrypted code that causes problems with Mozilla, etc., as has been documented many times before.
In short, open source free and low-cost software products are likely to be widely adopted in governments, where spending public money for licenses is a difficult justification. Inevitably, that choice will lead to security breaches that will cost those same governments (and ultimately you), huge amounts of money to rectify.
Has he never heard of a virus? EXEs are not that hard to change, and if you take the copy mechanism out, it's very easy to create a trojan from any given binary and even encrypt it. Source code doesn't give you any magic way to corrupt a program, any more than a binary does. You have to trust the source, but in general, 99% of the time there isn't anything to be worried about.
If he is this paranoid, the only solution is for the government to write its own operating system and monitor everyone's computers, library reading habits, television viewing, and email. Only then can we TRUST that we will be safe.
So obvious... Maybe they are just hoping to sell more ads. Too bad for Mozilla and Adblock.
Re:Sounds like someone trying to by controversial. (Score:2, Insightful)
Do you think it is wise to wait and see if something acts strangely before doing something about it? How long do you think it would take you to notice that something was "behaving strangely" after all your files have been removed?
You need to widen your focus; the world is not comprised solely of developers and sysadmins....
Where do you sign up? (Score:3, Insightful)
Government has the ability to review, or hire someone to review, the source code they're going to use for an implementation and there are even gov admins who know how to do source control and compile software (shock, gasp, disbelief). They also know how to monitor their systems for suspicious activity.
Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public
What's he trying to say? They're not going to release the code for a public version of...what? And if they don't make the corrupt version public, what's the problem? Are they going to sneak it in to a government office and while the admin is looking the other way jam a thumb drive on the server? A-ha! Gotcha! What are they going to release if not the source code? And when the checksums and file sizes don't match they'll cover that how? Here's a new version of Mozilla, don't worry about the source code, just install this...whatever...and trust us.
Maybe some of you closer to the daily process can help me think of a scenario where that could happen, because I can't.
If someone is making a living writing crap like that, I'm definitely on the wrong end of the business.
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
Re:Sounds like someone trying to by controversial. (Score:5, Insightful)
Compare:
50% of 10 is 5
I'd much rather have
It takes very few to notice something peculiar and investigate. The malefactors get caught out if anybody notices anything. Since anybody can examine everything of interest, it would be extremely difficult for a malefactor to actually accomplish much of anything against Open Source.
Exactly- MS's Shared Source Init a response to OSS (Score:5, Insightful)
I just sent the author this email (Score:3, Insightful)
With open source software, anyone can in theory contribute code, but in practice there are two strong limits on abuse: open source projects are actually closely controlled by a core set of trusted developers, so outsiders can't submit code directly into the repository, and anyone who is concerned can inspect the code. So, to actually get an intentional flaw into an open source project, one would have to spend time becoming a trusted developer, then construct a flaw subtle enough that it would not be detected by other developers working on the project. And because the process is completely transparent and thoroughly auditable, once any intentional code defects are located the source can be determined and addressed, other code from the same source inspected, and so on. So while in theory there's the risk that you mention, it doesn't seem to actually occur.
With closed source software, in theory access to the source code is limited to trusted employees, but in practice most software companies are fairly easy to penetrate (via new hires, consultants, and outsourcing) so that a malicious engineer could gain access to the source code and submit changes, and for most closed source projects there is far less peer review of the code, so those changes are less likely to be noticed. And since there is no public visibility into the situation, there is less incentive to fix the actual problem, and technical concerns can be overridden by business goals. You can read the widely disseminated Diebold emails for an example of this sort of thinking. So while in theory closed source software might seem better controlled, in practice there are numerous occurrences of engineers injecting code into their projects for personal gain (in Nevada, for example, they regularly catch engineers inserting "cheats" into gambling machines, sometimes after amassing small fortunes).
The end result is that in practice, open source projects have much less trouble with errant code getting into their projects than do closed source projects.
While I believe that "you get what you pay for" is generally good advice, I think that you're missing the ways that companies "pay for" open source software, i.e. by "barter" rather than cash. The many companies using open source software all "pay for" the development of the operating system, but they do so through contributing engineering effort (e.g. IBM, SGI, HP) and by submitting bug reports, rather than by paying a vendor to do the engineering and testing. Of course, many companies purchase support contracts for open source software, in which case they're "getting what they pay for" through the more traditional mechanism of money. So you're not getting something for nothing -- you're just paying by effort, or by purchasing a support contract, instead of for software licensing costs.
When companies that I've been with have used open source software it's rarely for the simplistic reason that there's no purchase price -- it is because the total cost of ownership is lower. I've run extremely large server farms of a wide range of operating systems (NT, BSD, Linux, Solaris, Digital UNIX, HP/UX, etc.) and in every case the purchase price of the software was insignificant compared to the operational costs (hardware, staffing, etc.). Rather pleasantly, open source systems have matured to the point where they're not only easier and less expensive to acquire (no vendor negotiations, etc.) but are often as low or lower in cost to deploy and operate, and as efficient or more efficient. Of course, the specific situations shape the issues -- if you need an enterprise class database, MySQL isn't an option, and if your application only runs on NT, you run NT. But in my experience, when picking between comparable open and closed source solutions, it's better for the customer to pick the open source solution and spend the offset licensing fees on staff or training.
Re:Sounds like someone trying to by controversial. (Score:2, Insightful)
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
True. Obvious.
What's maybe not so obvious is the less you have to trust the vendor, the better.
Contrast:
[ ] Always trust Microsoft
[ ] Always trust RedHat
Why the ^%*^&%&* should I have to trust RedHat?
Methinks that an essential part of any con game is that the victim must trust the con artist.
Re:Sounds like someone trying to by controversial. (Score:5, Insightful)
Some do. I'm proof by existence.
There is a way to get the word out. /., USENET, mailing lists, and distro alerts are just a few ways.
As for the malware in the source, you are of course correct. However, it is exposed, and therefore can be found. In fact, it will be found, eventually.
My letter to these folks (Score:5, Insightful)
The claim that Open Source Projects are especially vulnerable to infiltration by folks with malicious intent strikes me as strange.
We have large companies like Oracle and Microsoft extremely dependent upon technical help from politically volatile parts of the world (e.g. India/Pakistan, where there was a serious threat of nuclear war not long ago). These are places where criminal and terrorist organizations can run operations they couldn't in a developed country. In India, for example, there are tens of thousands of people who have been declared legally dead so someone could seize their property, and the victims can't clear up the issue even years later.
It isn't an issue of intent. Some overseas criminal organizations have a reputation for blackmailing countrymen who don't want to participate in criminal activity, holding relatives hostage.
Can the average US company really do an effective background check in this kind of environment?
With an open source project, at least I have a reasonable chance of understanding who the actual engineers of a project are, and I can judge the security based on the reputations of the people involved. I _can_ get an independent examination of the code involved if I'm willing to pay for the service.
Large "US" companies have this habit of substituting the cheapest possible resources with no consideration of long term consequences. How much is the word of a Larry Ellison or Bill Gates really worth on the subject of security? Would you bet your life on their judgement?
Re:Sounds like someone trying to by controversial. (Score:5, Insightful)
Developers and sysadmins are the only ones who are going to notice anyway...my mom doesn't think about whether or not her new program does just what it says it will, and wouldn't update it, or ever be aware of this type of problem unless somebody told her about it.
Do you think Microsoft finds most of the vulnerabilities in its products, or the legion of geeks out there?
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
If all someone does is check an MD5 on the executable they produce, they wasted their time and might as well have fetched the binary because nothing they build on their own is likely to match the official binary's MD5 anyway. The only real way to guarantee integrity is to require that every checked-in version of every file be signed using a trusted developer's key that is not stored on the public server. Far fewer than 100K people are even capable of doing such a check for any project without resulting in gazillions of false alarms that would only make it harder to spot the one real intrusion; realistically it will only be done by someone on the project's dev team. In other words, about the same number of people are really doing an effective check on an open-source project as would be doing one on a closed-source project. Given that a source-level exploit is more likely to occur in the first place when the source is widely and anonymously available, I'd say this indicates a danger that really is greater for open source. That doesn't mean open source is generally less secure; it just means that this one scenario does not favor them. The sourceforge etc. exploits demonstrate the danger of source exploits, and the open source community would be better off recognizing it than denying it.
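The signed-check-in scheme described above might look roughly like this (a minimal sketch: an HMAC with an offline key stands in for a real detached signature such as GPG, and the key and file contents are invented for illustration):

```python
import hashlib
import hmac

# The developer's signing key lives offline and is never stored on
# the public server an intruder might compromise.
OFFLINE_KEY = b"hypothetical-offline-developer-key"

def sign(blob):
    # Stand-in for a real asymmetric signature made at check-in time.
    return hmac.new(OFFLINE_KEY, blob, hashlib.sha256).hexdigest()

def verify(blob, signature):
    # Constant-time comparison against the recorded signature.
    return hmac.compare_digest(sign(blob), signature)

original = b"static int sanity_check(void);\n"
signature = sign(original)               # recorded with the check-in

# An intruder who owns the public server can rewrite the file, but
# cannot forge a matching signature without the offline key:
tampered = original + b"/* backdoor */\n"
print(verify(original, signature))   # True
print(verify(tampered, signature))   # False
```

This is why the post argues the key must not be stored on the public server: a signature made with a key the intruder can read proves nothing.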
Re:It's a moot point... (Score:2, Insightful)
Whether it's Microsoft, IBM, hell! It could be Apple. I'd just want to get to know them reeal good before doing anything like that with them.
Re:Sounds like someone trying to by controversial. (Score:2, Insightful)
Open source model is hardly perfect (Score:3, Insightful)
I submit another very realistic possibility:
Open source - starts off with lots of exploits, remains with lots of exploits because more 'community' resources are being spent on breaking it than fixing it. Over time, software becomes irrelevant.
Closed source (and all closed source software is developed by Microsoft, ya know) - Exploits are harder to find, but are eventually exploited by people with nothing better to do with their time. Company patches discreetly, and over time, software becomes more secure, and company programming techniques become more refined.
Now I'm not trying to make generalizations as the parent apparently is. I just wanted to point out that both models have their merits and flaws, regardless of the zealots who suggest that one system is perfect.
My very simple reply to that concept... (Score:3, Insightful)
I've had more luck getting and giving support for open source products than I have for ones I actually paid for. I'm not saying that paid software sucks just for that reason or anything; there are a ton of products for which there's no open source alternative even coming close, and probably won't be for an extremely long time. But don't try to sell the argument that poor support in free software makes it bad, when almost all of us know from experience how poor the paid support often is.
Trusted sources (Score:5, Insightful)
Our big problem today is that we are running thin on trusted sources for code. In this regard, the open source model is superior in that it is easier for trusted sources to monitor open software. As to whether or not trustworthy companies will continue to exist... that is a question outside the open vs. closed code question.
One of the really sad developments is that the growing lack of trust in the industry hurts the small companies the hardest. Quite often the small firms are the most trustworthy. Of course, small firms have a high fail rate. People who buy up failed small firms are often the worst wolves in the pack.
Re:Open source model is hardly perfect (Score:4, Insightful)
Re:Open source model is hardly perfect (Score:3, Insightful)
Books by "A. Russell Jones" on Amazon... (Score:5, Insightful)
Mastering ASP.NET with VB.NET
Mastering ASP.Net with Visual C#
Visual Basic Developer's Guide to Asp and IIS
Now, he may be serious with his accusations against open source. His message borders on the evangelical against open source software. A proprietary Microsoft zealot is no better or worse than a rabid Linux zealot.
There's already a rebuttal editorial on DevX.com's main webpage by another engineer there.
http://www.devx.com/opensource/Article/20135
Now, as to whether this was some kind of publicity stunt to garner more traffic to their website (since before today I'd never heard of them), they've been quite successful. They've probably seen more traffic today than in quite a while, but it seems like an infantile cry for attention.
Why not? It's obvious that absurd and completely ridiculous claims can be made for public perusal (aka SCO) and gather quite a bit of the media spotlight. It's a precedent already set in our culture that favors glitz and glamor over substance.
Re:Sounds like someone trying to by controversial. (Score:2, Insightful)
However, the diversity, the forkedness of OS software means there are thousands of variations that would all need auditing.
You're not going to get everybody to audit each version. You're not going to be able to register and secure each place along the chain, from source to your company's thousand desktops, that the software touches.
Without a trusted source and traceability, it's all over. And for the most part, a pressed closed-source CD from a commercial outfit has closed a lot more of the openings for corruption than a source repository on the public internet and/or a binary update website at Red Hat.
In a paranoiac's world, a 'trusted source' is necessary for any software distribution method, open or closed source in origin.
Re:DevX is a division of Jupitermedia Corporation (Score:2, Insightful)
The plot thickens.
Re:No one pays for IE. (Score:3, Insightful)
scripsit BoomerSooner:
A financial transaction may not be required to get the binaries, no. But eventually, they pay.
Why governments use open source (Score:3, Insightful)
Why is it more likely that an open source company installing systems for a large government agency would install malware than an equivalent closed source company? The government agency should be subjecting the computers to some kind of security and quality assurance tests in any case. If they are handling confidential data, the tests become even more rigorous.
Why trust some company from a foreign country over a company from your own country working with source your own people can inspect and compile? The reasons for governments to use open source are: they can build up their own people's technical knowledge doing so, they are then independent from possibly hostile and certainly mercenary foreign corporations, and most importantly, they can check and compile the source for security reasons. Claiming that they wouldn't do such a thing is simply ignoring one of the most important reasons a country would want to use open source in the first place.
Re:Open source model is hardly perfect (Score:4, Insightful)
I like open source too; I sort of don't understand why you thought I didn't (maybe this is a tangent, I don't know). I think Red Hat is mainly guaranteeing that their professional server software is stable enough for production use, which is why it costs more. Plus, having someone on the phone you can call is always a benefit to some companies.
Re:Open source model is hardly perfect (Score:2, Insightful)
When the programmers submit their code, who sits in on the code review? Do the VT (verification test) people work closely with the coders? And where do I download the design review document for each new kernel release?
the pay (Score:5, Insightful)
Flawed assumption: There is a direct relation between quality and price.
Why is it wrong? Because in the real world, where some of us still live, many factors aside from quality influence the price. Here is a short list of some:
* Quantity, lowering per-unit-prices
* Price perceptions, i.e. brand vs. no-brand
* Delivery, packaging and other overhead costs
* Regulations, legal costs and other burned money
* Intentional price modifications, i.e. dumping
And then, of course, the entire logic only applies to things that are actually sold. Any math person knows that comparisons with zero are always dangerous. Quick, what's two times zero? Maybe we should just double the price for Linux, then (in his eyes) it becomes twice as good.
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
"Let's say Vendor X gets a contract to provide a government agency with 800 desktop computers, with Windows, Office, etc. Meeting a bunch of carefully written specs from that agency's IT department. Vendor X takes Windows XP and customizes it, complete with a "Foo Agency" splash screen, encrypted disk partitions, escrowed bypass for crypto, etc.
"How do we know they didn't plant malware in Windows? What geeks will have access to this binary? Geeks won't even know this mini-distro exists. "
The problem with your example, and with the article that preceded this thread, is that it discusses problems that are common to both open and closed source. The real question is "how can we trust contractors to not screw us". Blaming open-source is disingenuous.
Re:Sounds like someone trying to by controversial. (Score:3, Insightful)
Where is the documented review process for closed-source software? Are the reviewers in THAT process qualified? Who decides that they are? How even is the quality in closed-source software, and how would you prove it one way or another?
Who's accountable? Well, ultimately (just as with most closed-source software), the user of the software is solely responsible for whatever the software does. If you're talking about "accountability" in terms of "who do I sue?", then I would assume that you would sue the company that packages your particular piece of software. I'm pretty sure most of those companies that are reputable enough to have lawsuits filed against them in the event of some unspecified situation with code will have phone numbers and addresses. If you're a business using software that's not available through some easily identifiable source, then you're operating in the "stupid zone".
I understand the point that you're trying to make, but the argument just doesn't have any teeth. There are too many differences with the way things are in reality for the theory to make any sense.
Re:Oh really? (Score:2, Insightful)
How many times do people have to spew forth security catchphrases before they think about them?
There is no doubt that heterogeneous networks decrease risk against a class break, although multiple hardware platforms do not guarantee software diversity: an exploit in OpenSSH, for example, would affect both your Linux and OS X machines, and probably your Sun systems as well. However, you've now got four different operating systems and platforms to deal with, so you've multiplied the base security skills required of your IT team by four. That's four different ways of installing software, four different ways of applying patches, four different places to find out about vulnerabilities in the first place.
Creating heterogeneous networks creates risks as well as reducing them. In many situations, the increase in risk and hassle will outweigh any benefit. The statement "a heterogeneous network is harder to take down" is rarely true, and certainly not universal. Personally, I'd rather be running Debian GNU/Linux on all my machines and have a small team of Debian gurus looking after them than have disparate operating systems and need more admins, with more chances for mistakes and miscommunication.
Re:Open source model is hardly perfect (Score:2, Insightful)
>Open source - starts off with lots of exploits, remains with lots of exploits because more 'community' resources are being spent on breaking it than fixing it. Over time, software becomes irrelevant.
That's really not a very realistic possibility. Say some software was released that was full of exploits that could potentially bring down the human race as we know it. If such software was released, and it served a useful purpose for even a relatively small number of people (just to give it a number, say 1,000), and it was being actively maintained, its exploits would be fixed. Why would anybody taking the time to maintain a project do it to add exploits? And if anybody did just add exploits, why would anybody use the software? Also, if people were in desperate need of the software, but it was being corrupted, someone would fork the project and create a trustworthy distribution.
Open Source works because it does what people want it to do. If a project doesn't do what people want it to do, it is either abandoned (and good riddance, anyway), or someone will pick up the ball and make it do what they want it to do.
But you knew that... You couldn't honestly believe the community would spend more time destroying the value of its software than adding to it, could you?
Hax.