Security

Is Open Source Fertile Ground for Foul Play? 723

jsrjsr writes "In a DevX.com article entitled Open Source Is Fertile Ground for Foul Play, A. Russell Jones argues that open source software is bad stuff. He argues that open source software, because of its very openness, will inevitably lead to security concerns. He says that this makes adoption of open source software by governments particularly worrisome. In his words: 'An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.'"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Re:What a sellout (Score:1, Interesting)

    by DR SoB ( 749180 ) on Thursday February 12, 2004 @05:09PM (#8261846) Journal
    True, but with closed source, at least you know exactly who is responsible. I support open source, but come on guys, would you really want Linux supporting your nuclear arsenal? Or anything else to do with bombs? Not _all_ closed source is bad, just because you don't like Microsoft.
    I would feel much better knowing that they were using z/OS or some type of source from IBM. Or, if they are going to use open source, hire the manpower to double-check all the security-related code...
  • by xutopia ( 469129 ) on Thursday February 12, 2004 @05:10PM (#8261852) Homepage
    it currently has a score of 2/5. Once the /. effect is done we should all create an account and rate it as low as it can go.
  • by haystor ( 102186 ) on Thursday February 12, 2004 @05:10PM (#8261871)
    You may pay nothing for Linux (for example).

    But you also pay $0 to Microsoft to insure you against bad things happening to your computer/network.

    The only thing you pay Microsoft for is, basically, that it will install an OS on your system. Read the EULA: they don't guarantee much else, and they certainly take no responsibility for things going wrong.
  • Re:What a sellout (Score:4, Interesting)

    by tomstdenis ( 446163 ) <tomstdenis@gma[ ]com ['il.' in gap]> on Thursday February 12, 2004 @05:12PM (#8261894) Homepage
    Oh yeah, see this [slashdot.org] for a good example of closed source software in action.

    Tom
  • by JaredOfEuropa ( 526365 ) on Thursday February 12, 2004 @05:12PM (#8261895) Journal
    An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.
    So far, I think the track records of currently existing operating systems speak for themselves: one particular popular commercial operating system (yes, that one) makes the news almost weekly with another gaping security hole, exploit, or worm doing the rounds. On the other hand, you don't hear a lot about security issues with (wonderfully-free) Linux systems, despite their widespread use as servers.

    A number of governmental institutions have chosen Linux not because it is free, but because of another distinct advantage: because it is open source, they know what they are paying for.
  • Best Troll Ever. (Score:5, Interesting)

    by DaveJay ( 133437 ) on Thursday February 12, 2004 @05:14PM (#8261919)
    From the article, annotations added by me:

    >Malevolent code can enter open source software at several levels.

    1. >First, and least worrisome, is that the core project code could be compromised by inclusion of source contributed as a fix or extension. As the core Linux code is carefully scrutinized, that's not terribly likely.

    Not likely indeed. Moving on.

    2. >Much more likely is that distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing. As anyone can create and market a distribution, it's not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart.

    Organizations using Open Source Distributions generally purchase a vendor-supplied copy as well as a support contract.

    As an aside, do you suppose non-US countries that use Microsoft products are concerned that Microsoft may not have their country's best interests at heart?

    3. >Third, an individual or group of IT insiders could target a single organization by obtaining a good copy of Linux, and then customizing it for an organization, including malevolent code as they do so. That version would then become the standard version for the organization. Given the prevalence of inter-corporation and inter-governmental spying, and the relatively large numbers of people in a position to accomplish such subterfuge, this last scenario is virtually certain to occur. Worse, these probabilities aren't limited to Linux itself, the same possibilities (and probabilities) exist for every open source software package installed and used on the machines.

    This isn't limited to Open Source itself. The same possibilities (and probabilities) exist for any company that uses customized software AT ALL -- at some point, you have to trust those doing the customizing, or get a third party to audit. I mean, after all, I can wreak havoc throughout an organization just by clever use of login scripts on Windows XP machines, and if everyone in the IT department is in on it, nobody else would be the wiser.

    Now that I think of it, even if you're not customizing the software, you're trusting the people who make it. Does Microsoft have your best interests at heart? Does SCO? Does RedHat? Does anyone? That's why it's nice to be ABLE to scour the code -- the smartest, safest groups will obtain source code from those who write it, and have it audited by another group, and then again perhaps by another. Unless they're all in league with one another. [Insert tinfoil hat here]

    So. Who's paying this guy?

  • by theboy24 ( 687962 ) <theboy24&aol,com> on Thursday February 12, 2004 @05:14PM (#8261922)
    You're absolutely right. People going around trolling about open source without any plausible reason are a major detriment to the cause and the software. Companies/corps are going to pick whatever works best for them and adapt/change it to their needs, and Gov't should do the same. If the security were as bad as the article implies, then why haven't we seen any catastrophic security failures on any of the open source systems currently being used by the Fortune 500 and Gov't? Hell, it couldn't be any worse than the MS systems in use.
  • Sort of (Score:5, Interesting)

    by gerf ( 532474 ) on Thursday February 12, 2004 @05:14PM (#8261938) Journal

    His criticism reminds me of a speaker at a recent IEEE meeting at my school. She talked about the work environment, and some nuances of how to act or not to act.

    One interesting thing about the contracting company she runs is that if you charge more, you get more business. The thinking here is that since a certain company costs more, it must be better. Obviously, though, she did not get smarter by charging more, only richer.

    That is the thinking this fellow is using: charging more must mean it's a better product. Sadly, he is part of a large segment of the population that does not understand the Open Source community or its business models. His view is outdated and, frankly, wrong.

    Besides, what other company besides M$ finds a huge hole in all of its flagship products, but fails to patch it for close to a year?

  • by Angst Badger ( 8636 ) on Thursday February 12, 2004 @05:15PM (#8261962)
    The old saying about getting what you pay for was formulated as a result of experience with commercial enterprises. Of course you "get the shaft" with "free" commercial products -- commercial enterprises don't exist for the purpose of giving things away. Companies only give things away in the hopes that you'll actually buy something.

    Open Source projects, on the other hand, are usually formed with the express goal of giving something away. They have every incentive to make their products valuable and no incentive to produce shoddy loss-leaders.

    "You get what you pay for," even with respect to for-sale products, doesn't mean "you get value commensurate with your expenditure". Commercial enterprises are strongly incentivized to give the least possible value for the highest possible price. Extra quality and value, above and beyond the expectations of the customer, is an unnecessary expense to a business. Competition alleviates this somewhat, but companies are still only playing to the level of the competition. Doing the very best possible will seldom if ever be their goal, in contradistinction to Open Source projects, where it is frequently the main goal.
  • Yes. Mr. Jones needs to read up on why governments actually prefer [com.com] open source.
  • Re:What a sellout (Score:2, Interesting)

    by gujju ( 626201 ) on Thursday February 12, 2004 @05:19PM (#8262015)
    On the other hand, do you really want some closed source software handling your elections?
    Would you rather have every GWB-hating geek scrutinize the voting machine code with his self-assembled electron microscope, or have some "security" company like Diebold do it with closed source software which they CLAIM is "safe"?

    Gujju
  • Review process (Score:2, Interesting)

    by unconfused1 ( 173222 ) on Thursday February 12, 2004 @05:20PM (#8262035) Homepage

    Obviously A. Russell Jones is unfamiliar with the review process that happens in most open-source development. It is ridiculous to believe that malicious code would just make its way into an open-source application.

    Really what it seems like he is trying to do is demonize open-source developers...suggesting that it is likely that the group governing an open-source project would deliberately infect their own apps.

    I can see the Apache Group chuckling at his assertions.

  • by Fluid Truth ( 100316 ) on Thursday February 12, 2004 @05:25PM (#8262086)
    I suspect that was because of the recent patch to windows that came out just a few days ago. Hmmm...when was the last time I needed to update the linux server or apache for security reasons? Hmmm...oh well, my memory's not that good, anymore.
  • by Leomania ( 137289 ) on Thursday February 12, 2004 @05:25PM (#8262089) Homepage
    You get what you pay for.

    This is indeed true, but it depends upon how you define 'pay'.

    In the case of the government using open-source software, 'paying' to me means that the underlying code gets reviewed by government employees or trusted subcontractors prior to being deployed, rather than paying cash for closed-source software. It is inconceivable to me that someone could argue that you have this option with closed-source software, or that you are somehow more protected because people getting a paycheck to write code would never do anything malicious. Even if you get to peek at the underlying closed-source code, how do you know that was the code used to compile the application? With open source you can guarantee it 100% by compiling it yourself. How does it get any better with closed source? (rhetorical question of course...)

    - Leo
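    The "compile it yourself" guarantee above boils down to comparing hashes: build the binary from the code you actually reviewed, then check that the deployed binary matches your rebuild. A minimal sketch in Python (the file name and contents are invented for illustration, and in practice this comparison also requires a reproducible build):

    ```python
    import hashlib

    def file_sha256(path):
        """Hash a file in chunks so large binaries need not fit in memory."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Self-contained demo: write a stand-in "deployed binary", then verify it
    # against the hash of the build you made yourself from audited source.
    with open("deployed.bin", "wb") as f:
        f.write(b"pretend this is a compiled binary")

    rebuilt_hash = hashlib.sha256(b"pretend this is a compiled binary").hexdigest()
    deployed_hash = file_sha256("deployed.bin")
    print(deployed_hash == rebuilt_hash)  # True: deployed binary matches the audited build
    ```

    Of course this only proves the binary matches the source you hashed; it says nothing about the source itself, which is exactly why the review step matters.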

  • by simonharvey ( 605068 ) on Thursday February 12, 2004 @05:26PM (#8262098) Homepage
    I was at my pastor's house last week and the topic of conversation somehow turned to Linux and open source vs. Windows and closed source.

    Basically the argument for closed source was that nobody could read through the code and exploit weaknesses or add trojans without anybody knowing, and that once Linux becomes more mainstream the same virus woes will apply to both platforms.

    I was going to remind him that Linux users are statistically more security-conscious (how many Linux/Unix users spend the bulk of their productivity time running as root?) than Windows users, but I didn't want to bring it up because he was the leader of our church.

    And also that more work is put into the Linux kernels than into the NT 5-5.1 kernels when it comes to the weaknesses that viruses rely on.

    I was then going to remind him of OpenBSD [openbsd.org], an open source OS that has had only 1 hole in the default install in the last seven years.
    Maybe next time, when I get enough courage, I will enlighten him some more.

  • Spyware (Score:2, Interesting)

    by MathFox ( 686808 ) on Thursday February 12, 2004 @05:26PM (#8262104)
    As soon as the Linux kernel starts "phoning home", I can fix it, because I have the sources and the GPL allows me to. Linus Torvalds knows that, so he is very reluctant to add spyware to the kernel.
    When Windows XP starts phoning home, the MS EULA doesn't allow me to do anything about it. Bill Gates knows that, and is looking for ways to get more dollars out of his Windows licenses.
  • by uradu ( 10768 ) on Thursday February 12, 2004 @05:28PM (#8262132)
    > Uhhuh? So? They'll be fixed in the next release?

    At the whip of the vendor. Which, in Microsoft's case can be never, unless the "hole" gets publicity on the evening news. There are serious--and well-documented and submitted--bugs in Word that have been there since the early '90s, with no obvious intention from MS to ever fix them.
  • by robbkidd ( 154298 ) on Thursday February 12, 2004 @05:31PM (#8262157)

    [From FUD-Induced Diatribe of an Article:]
    Malevolent code can enter open source software at several levels.
    [1] First, and least worrisome, is that the core project code could be compromised by inclusion of source contributed as a fix or extension. As the core Linux code is carefully scrutinized, that's not terribly likely.

    Sooo... it's not likely? Why bring it up then?

    [2?] Much more likely is that distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing. As anyone can create and market a distribution, it's not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart.

    Which "the government" probably wouldn't purchase. Jones might not have noticed, but most Linux installations run in government and the private sector are from the Big Name distributors. Why? Support contracts, and the tendency of proprietary applications that run on Linux to require a particular Big Name distribution.

    [3]Third, an individual or group of IT insiders could target a single organization by obtaining a good copy of Linux, and then customizing it for an organization, including malevolent code as they do so. That version would then become the standard version for the organization. [...]

    Sounds like contract programming to me.

    [...] Given the prevalence of inter-corporation and inter-governmental spying, and the relatively large numbers of people in a position to accomplish such subterfuge, this last scenario is virtually certain to occur. Worse, these probabilities aren't limited to Linux itself, the same possibilities (and probabilities) exist for every open source software package installed and used on the machines.

    Right. These probabilities exist for *ANY* software development. Any contract programmer could do the same thing with software written for a closed-source operating system. I recall some statistic (probably made up) that said the vast majority of coding is done for in-house applications: a business' customized product database, a client database, etc. Any "IT insider" could target a government agency, bid on a programming contract and gleefully "accomplish such subterfuge". Until they were caught, charged, imprisoned and became some bad man's girlfriend.
  • by Pragmatix ( 688158 ) on Thursday February 12, 2004 @05:31PM (#8262160)

    I have this argument with my clients all the time. Many of them do not trust open source. They say, 'It is unsupported! We can't run production on unsupported software!'

    My argument is that it is no different from an internally developed application. None of the code I write is 'supported' any more than the open source code out there. If something breaks, they have to pay me to fix it. If something breaks with some open source code, they still have to pay me to fix it.

    Also, the advantage of open source is that even if the authors slipped something 'nefarious' into the code, you have a chance to see it. What do you do when someone slips spyware into a proprietary application you use?

  • Challenge... (Score:2, Interesting)

    by bretth ( 195183 ) on Thursday February 12, 2004 @05:33PM (#8262171) Homepage
    Of course, if he really believes what he says, he should be able to prove it by injecting bad code into (say) the Linux kernel, or apache.
  • by roseanne ( 541833 ) on Thursday February 12, 2004 @05:35PM (#8262204)
    Not that absence of patches == secure, but IIS hasn't had to be patched in quite some time. In fact, over the past few months, I've been patching more Linux and BSD boxes than Windows, thanks to the SSH+sendmail vulns (yes, we still run sendmail on some boxes, though we've moved to a combination of qmail and exim on others).

    MS software IMO has really improved security-wise, down to sensible, secure-by-default installs (look at the default installs for Windows 2003 or Services for Unix 3.5). Today I rate typical MS *users* as more of a security threat (the kind who spread MyDoom) than MS software itself.

  • by Anonymous Coward on Thursday February 12, 2004 @05:42PM (#8262279)
    Original quote:
    "... an individual or group of IT insiders could target a single organization by obtaining a good copy of Linux, and then customizing it for an organization, including malevolent code as they do so."

    Ok, what if we rule out open source as insecure, as the author does, and rewrite the above:

    Modified quote:
    "... an individual or group of IT insiders could target a single organization by licensing a closed-source kernel, and then customizing it for an organization, including malevolent code as they do so."

    Ahh, much better! The author sure is right that closed source gives much better security.
  • by Anonymous Coward on Thursday February 12, 2004 @05:43PM (#8262291)
    http://linuxtoday.com/infrastructure/2003040801626NWBZEM
  • by iSwitched ( 609716 ) on Thursday February 12, 2004 @05:51PM (#8262495)

    Instead of actually discussing the story, any presumed insult of open source is immediately flamed into oblivion. Look - I love open-source as much as the next geek, but how about we talk about this type of article like adults, and provide examples of our own?

    Sure, the guy could've taken a less inflammatory tone, and could've provided a few specific examples, if there are any; but riddle me this, all you smarties: he does have the grain of an issue here.

    Let's assume that open software becomes ever more mainstream, to the point where grandma can't tell, and doesn't care, how her email client was developed. What's protecting her against malicious or incompetent open-source developers? Or are we saying that all programmers are by nature 'good' people and also brilliant at their craft?

    Sure, geeks can compile source, compare binaries, review code line-by-line, but it may shock you to know that normal people don't know or care how to do this.

    Your next argument is that the 'good' geeks will discover and root out the 'bad' geeks. But in a world where OSS is mainstream, this will only happen after thousands, hundreds of thousands, or even millions of mainstream users are already compromised.

    I'm not saying that commercially developed software has proven itself better; in fact it's usually much worse, so far anyway. But OSS does have some of the same problems in a world where not every user is also a programmer.

    OK, discuss...

  • by Fefe ( 6964 ) on Thursday February 12, 2004 @05:58PM (#8262638) Homepage
    I have never understood what those people are thinking when they publish .md5 files. I mean, really! If someone gets far enough to upload a compromised tarball, what stops him from also uploading a matching md5 file?

    Exactly. Nothing.

    That's why people with more than one brain cell upload .sign files. Those are digital signatures made with the GNU Privacy Guard [gnupg.org]. Digital signatures ensure that the guy who owns the secret key (and only he) can create signatures, which everyone can then check.

    Of course there are also caveats (some dark three-letter agency could have cracked the key with their Roswell quantum computers, or someone could have stolen the secret key), but those are far less likely than some asshat uploading an md5 sum. Anyone can create matching md5 files for any content, but only I can create .sign files matching my secret key.

    So please someone hit those GNOME idiots with a clue stick, those md5 files must go. Now.

    Oh, and while you are at it, please also tell the gnome people to use a directory structure where mirror programs (and people!) can see whether there were new uploads without having to recurse through the monstrous moloch directory tree from hell. Thanks.

    Sheesh. Now that wasn't so hard, was it?
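    The md5 objection above is easy to demonstrate: a checksum only authenticates the bytes it was computed from, so whoever can replace the tarball can replace the .md5 file too. A toy sketch in Python (the tarball contents are invented for illustration; the `gpg` command in the final comment is the standard way to check a detached .sign file):

    ```python
    import hashlib

    # An attacker who can swap the tarball on the server can publish a
    # "matching" checksum for the compromised bytes just as easily:
    trojaned_tarball = b"release contents, plus a backdoor"
    forged_md5_file = hashlib.md5(trojaned_tarball).hexdigest()

    # A downloader who checks the .md5 file sees a perfect match:
    download_ok = hashlib.md5(trojaned_tarball).hexdigest() == forged_md5_file
    print(download_ok)  # True -- the md5 "verified", yet the tarball is compromised

    # A detached signature is different: it can only be produced with the
    # maintainer's secret key, so a swapped tarball fails verification:
    #   gpg --verify foo-1.0.tar.gz.sign foo-1.0.tar.gz
    ```

    The checksum still has a use (detecting accidental corruption in transit), but it proves nothing about who produced the file.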
  • Re:Take action (Score:3, Interesting)

    by RainbowSix ( 105550 ) on Thursday February 12, 2004 @06:01PM (#8262701) Homepage
    Furthermore, you can visit their forum [devx.com]. No replies yet as of this posting. Somebody should write a well-thought-out retort.

    "Think Russell is dead wrong? How does the open source community prevent against the issues raised in this opinion? Tell us in the Talk to the Editors discussion forum."
  • Re:What a sellout (Score:3, Interesting)

    by Salamander ( 33735 ) <jeff@ p l . a t y p.us> on Thursday February 12, 2004 @06:02PM (#8262721) Homepage Journal

    Heh. Even as I wrote that, it looked like the closed-source version of this trick became a lot easier with the leak of NT source. What a coincidence.

  • by StenD ( 34260 ) on Thursday February 12, 2004 @06:11PM (#8262889)
    Just because you didn't hear about it, didn't mean that the concerns weren't raised. In fact, the CERT advisory [cert.org] contains the following statement:
    II. Impact

    The potential exists for an intruder to have inserted back doors, Trojan horses, or other malicious code into the source code distributions of software housed on the compromised system.

    III. Solution

    We encourage sites using the GNU software obtained from the compromised system to verify the integrity of their distribution.

    Sites that mirror the source code are encouraged to verify the integrity of their sources. We also encourage users to inspect any and all other software that may have been downloaded from the compromised site. Note that it is not always sufficient to rely on the timestamps or file sizes when trying to determine whether or not a copy of the file has been modified.
    A referenced Cert Incident Note [cert.org] begins with
    Background

    When downloading software from online repositories, it is important to consider the possibility that the site has been compromised. One of the threats that users face is that intruders could include malicious code in the software packages distributed by those sites. This code could take the form of Trojan horse programs or backdoors.
    In regards to your other concerns:
    Take a look at cpan and some of the modules you have on your machine. How many are updated with normalcy? What about the whole sourceforge/freshmeat concept of 'sysadmining', where you find a neat program supported for what... a year? Maybe 2 if you're lucky...
    Frankly, that's not significantly different from closed source software: companies release products and then, for lack of adequate revenue, stop updating them. If you're lucky, the company itself didn't go under, so you might still be able to receive support, perhaps at extortionate pricing. If the company went out of business and you came to rely upon the product, you're SOL. With OSS, however, if the original developer[s] are no longer developing the package and no one else has taken charge, you still have the source. If you have a critical need for a fix or an enhancement, you can always contract with a programmer to perform the work to your specifications, which you would be unable to do with a closed source product.
    Sometimes it seems the cool Open Source gets, the more issues come out with it.
    You've yet to cite one that doesn't exist with closed source software as well. Source code repositories are compromised, backdoors are inserted, development ceases, and support is withdrawn with closed source software as well. The difference is that with OSS, the end user has access to the code to protect themselves from these risks, while they do not with closed source software.
  • by kellman ( 8394 ) on Thursday February 12, 2004 @06:22PM (#8263057) Journal
    Right on.

    Was this guy hired by Micro$oft? Seriously.
    His arguments were so unconvincing, and so universally applicable to both open and closed source software, that the whole article seemed like a joke.

    I have yet to see even a *small* example of what he's talking about, but on the other hand there are numerous examples of proprietary software having back doors, exploits and vulnerabilities that were not fixed for YEARS after the release of a product.

    Examples:
    1. PIX firewalls. These things have had numerous problems from day one, and many were not fixed for many months.

    2. I think it was 3Com that had a default password on their switches/routers that anyone could use to access them. This was put in place by the company to allow technicians to service any unit.

    3. The metadata hidden in M$ Office documents. It has now even been documented by the government (and eventually Micro$oft) how to reduce the amount of metadata in those documents. Hmm, I don't think this would have been an issue with open-source software.

    There are many, many more examples, but these are the only ones I can think of off the top of my head.

    He also said Linux was riddled with about the same number of security problems as Windows. In what world? If you look at sheer numbers of vulnerabilities, yes, a copy of Windows 2000 (56) has fewer than a copy of Red Hat Advanced Server 2.1 (109). But look at the actual exploits: most of the Windows problems allow REMOTE administrative access or a complete DoS. The Red Hat/Linux vulnerabilities are largely local application DoS issues and local privilege escalation in an application that usually isn't even running. Not to mention it may not even be installed (oh no! they've compromised mutt!). Conversely, how many Windows machines have been affected by worms compared to Linux machines?

    Additionally, there are many programs on Linux that have their vulnerabilities found and fixed because the source is freely available. How many holes still exist in Windows and are waiting to be discovered?

    All of the real-world evidence completely refutes his pretenses.

    Bah.
  • Security Audits??? (Score:2, Interesting)

    by Anonymous Coward on Thursday February 12, 2004 @06:22PM (#8263067)
    I work for a major corporation that uses open source, but we don't publish anything into production without doing extensive security testing. This includes third party security audits, and they've ripped apart just about every single vendor's POS (piece of software) that we've installed. At least when they uncover a problem with the open source packages, we can get patches quickly or it's actually a vendor's product that interfaces with Apache, etc. If you're that big an entity with sensitive information and don't follow basic security measures, you're just asking for trouble. I don't think any IT professional in today's world can plead ignorance to security (funding, well, that's a different story) :\

    Just my $0.02
  • You're missing the point. They _know_ when the compromises took place. I had a project on Savannah, and when they discovered the backdoor, they had the CVS repository from backup both from before the incident and from after it. Each project leader was to compare the diffs between the two to make sure that no code had been altered.
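    The backup-comparison audit described here can be sketched with Python's standard difflib; the "before" and "after" snippets below are invented purely to show how an injected backdoor would stand out in the diff:

    ```python
    import difflib

    # Hypothetical snapshots of one source file: from the backup taken before
    # the compromise, and from the repository after it was discovered.
    before = [
        "if user == owner:\n",
        "    grant_access()\n",
    ]
    after = [
        "if user == owner or password == 'sekrit':\n",  # injected backdoor
        "    grant_access()\n",
    ]

    audit_diff = list(difflib.unified_diff(before, after,
                                           fromfile="pre-incident",
                                           tofile="post-incident"))
    print("".join(audit_diff))
    ```

    Any `+` line in the output that no legitimate commit accounts for is exactly what each Savannah project leader was asked to look for.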
  • by BranMan ( 29917 ) on Thursday February 12, 2004 @06:24PM (#8263105)
    Actually, in practice there has seldom been any peer review of code in 'closed source' software companies. Unless a project or program has major funding, clout, and visibility, the coders write some unit test cases and hope any bad bugs are caught in system testing (which gets cut back when the schedule gets tight; in contrast, Open Source software usually has no schedule). Open Source software is therefore far more secure, as more often than not at least two pairs of eyes have seen any particular piece of code.
  • by blorg ( 726186 ) on Thursday February 12, 2004 @06:32PM (#8263223)
    "A small and ever-decreasing percentage of users compile their own binaries, let alone check the result."

    I think the government might just have the time to make this sort of check, and as others have said, it only takes one person to notice. Your second point is valid, as is borne out by the Debian/micq dispute [markpasc.org] (also mentioned previously in these comments), but that ironically isn't a point that Jones attempted to make in the article - he seems to be concerned with unpublished back doors that don't appear in the source.

  • by chadjg ( 615827 ) <chadgessele2000@yahooLION.com minus cat> on Thursday February 12, 2004 @06:35PM (#8263251) Journal
    My boss used to do custom business software and database programming back in the big iron days. He said that in order to do customer support they would often build in a way to shell into the machines remotely to do the diagnostics.

    No problem there. But the kicker was that he would build back doors into the programs that only he knew about, so if they changed the front door passwords or otherwise screwed it up, he could still get in.

    The big problem was that he wouldn't tell his customers about these back doors. This is financial and tax data we're talking about. He saw no ethical problem with this. None at all. Fortunately, he's not a malicious guy.

    This isn't a surprise to anybody, right? I was just shocked at the total and complete lack of guilt over doing this. And he's otherwise a normal guy. That's scary.

  • note sent (Score:2, Interesting)

    by router ( 28432 ) <a...r@@@gmail...com> on Thursday February 12, 2004 @06:47PM (#8263423) Homepage Journal
    Mr Jones,
    So, a major Closed Source OS vendor including specific checks for software that competes with that vendor's other software offerings, so that it refuses to work or crashes when the competing software is launched, is not a possibility? No, it's a fact, and Microsoft did it. Articles like these simply allow Open Source Software users and authors to ignore their writers indefinitely, since it is obvious that authors such as yourself do not understand the core principles of Open Source.

    I have a large number of analogies that might make sense to you, here is one.
    Closed Source:
    I like to work on cars. I have an idea for a car that I would like to build. I build my car. I show it. Painfully over a period of years, from looking at other custom cars, I come up with one that I really like and then maintain it because I enjoy it.

    The Closed Source Analogue:
    I like to code. I have an idea for some code that I would like to write. I write the code and distribute as closed source shareware. Painfully, over a period of years, from user observations and using other code, I come up with something that really serves my needs, that I maintain because I enjoy it.

    Open Source:
    I like to work on cars. I have an idea for a car that I would like to build. I build a prototype of my car. I show it to the world and explain my idea. Other people who like to build cars may or may not help by randomly showing up in my garage and wrenching, bringing cool tools, paint, parts, etc. Other people will suggest improvements or point out flaws. In a matter of months, the initial build is done and I get to use the car I like, and copies of my car are available to anyone who wants to test drive it or use it every day. Further improvements arrive and I oversee their addition to the car. It weighs less, goes faster, is more comfortable, and does things I couldn't have dreamed of because it leverages the skill, talent, and needs of everyone who liked the idea. I maintain it, or allow others to maintain it, because it is a tour de force in the automotive realm and suits my needs better than any other car in existence.

    Open Source Analogue:
    See above, inserting code for car.

    Now, I ask you, would we let anyone run a grinder over my beautiful car? Would we be any less observant of the additions being made than the single shareware author? Would anyone else working on the car allow a malcontent to destroy the engine?

    Once it is out of my hands and in the community, the probability of the changes you describe occurring is lost in the noise compared to the probability that a major vendor will try to handicap its competitors. As has been SEEN in the past and will be SEEN in the future. You really shouldn't comment on things you don't truly understand. To believe that people whose hearts and souls are entwined in something have less motive to maintain the purity of their code than people who are punching a timeclock and subject to the whims of managers, deadlines, competition, and cost containment is a manifest misunderstanding of the nature of man.

    Stop playing chicken little and take off the tinfoil hat.

    andy
  • by Anonymous Coward on Thursday February 12, 2004 @07:26PM (#8264004)
    When the DOD buys computer hardware and software, they use a set of guidelines and rules based on the so called Orange Book. This mandates verifying the production process, and specifically mentions things like the possibility to introduce backdoors into firmware etc.

    The DOD figured out a few decades ago how to deal with that, so don't worry too much about them and computers from China and India... worry about the home machines of their employees, and about yourself, though.

    Btw, even though it is somewhat outdated, the DOD Orange Book on secure systems is a good read, and is required reading for anyone who has to deal with security.
  • by G27 Radio ( 78394 ) on Thursday February 12, 2004 @07:54PM (#8264275)
    The big problem with the closed source model (as we may be about to find out first hand) is that once the source gets leaked, all those holes are out in public. The security through obscurity design model kinda falls apart at that point.

    The guy that wrote the original article is definitely trolling. Unless he really is a fool. I think anyone with even a little insight into how OSS works understands why it's inherently MORE secure than closed source. This "closed source is more secure" meme gets floated and shot down several times a year.
  • by plcurechax ( 247883 ) on Thursday February 12, 2004 @08:20PM (#8264553) Homepage
    He says that this makes adoption of open source software by governments particularly worrisome. In his words: 'An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.'"

    The federal department I work for is rapidly moving towards open source because we cannot afford to be constantly screwed by the traditional commercial vendors. We simply couldn't afford to keep paying for screw ups by HP, Cisco, Unisys, MCI, Teleglobe, and Dell. Nor could we afford the upgrade cycle recommended by commercial software vendors like Microsoft.

    So we are increasing our in-house staff by 3 full-time people (no expensive contractors) and adopting open source to reduce cost and take control over our infrastructure, in the process drastically improving reliability, saving the taxpayers big dollars on reduced operational overtime, drastically reducing software maintenance costs, and making nearly everyone but Microsoft and friends happy.

  • by Anonymous Coward on Thursday February 12, 2004 @08:43PM (#8264755)
    Most of the issues he raised can be resolved through better security and policing of a project's source code. It's just as likely that a disgruntled hacker at Microsoft or someone working on one of a million other "legitimate" projects could insert a backdoor.

    The article would have served a better purpose by discussing the vulnerability of ALL code bases. I don't see how he can justify saying it's a problem specific to open source.
  • by DahGhostfacedFiddlah ( 470393 ) on Thursday February 12, 2004 @08:46PM (#8264783)
    I've seen stuff like this too. I was bugfixing some PHP code a while back and found this gem:

    if ($long_variable_name == "long string") {
        mysql_query("DELETE FROM important_table1");
        mysql_query("DELETE FROM important_table2");
        mysql_query("DELETE FROM important_table3");
    }

    I can only assume it was put there by the original author to use in case he wasn't paid or saw the script copied or something like that. Regardless, I consider it gross negligence to allow anyone with the right magic phrase to delete an entire site (I removed it, of course).
  • by Anonymous Coward on Thursday February 12, 2004 @08:57PM (#8264891)
    i think what he's saying is that:

    say today, i am a rogue developer. i implant some bad code into my part of the tree.

    i leave it dormant...for 3 years. An accomplice then uses it to hack 5 servers (which have the 3 year old exploit compiled in).

    >>They _know_ when the compromises took place

    that's right. they think the compromise happened just recently. they'll never think to check far into the past for WHEN the original bad code was implanted. and no one will go back 3 years to check md5sums. they won't even know to check that time frame.

    they'll just compare the md5s before and after the 5 servers were RECENTLY infiltrated...and they'll match, unless they go back 3 years.

    this of course would include closed source just as well as open source. i see no reason why OSS would be any more susceptible to this kind of thing. closed source would be just as susceptible, imho.
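    Going back to check is mechanical, by the way, once you have the old digests archived somewhere trustworthy. A rough sketch (the release names and contents here are invented; real digests would come from signed release announcements):

    ```python
    import hashlib

    def md5_of(data: bytes) -> str:
        # Hex MD5 digest of a blob (e.g. a release tarball read into memory).
        return hashlib.md5(data).hexdigest()

    # Hypothetical archive of digests recorded when each release shipped;
    # in practice these would come from signed release announcements.
    known_good = {"release-1.0.tar": md5_of(b"original release contents")}

    def verify(name: str, data: bytes) -> bool:
        # Compare what we have now against what shipped back then.
        return known_good.get(name) == md5_of(data)

    print(verify("release-1.0.tar", b"original release contents"))  # True
    print(verify("release-1.0.tar", b"tampered release contents"))  # False
    ```

    The catch is exactly the one above: you have to know which historical release to check, and you have to trust the archive of digests itself.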
  • by crucini ( 98210 ) on Thursday February 12, 2004 @09:10PM (#8265006)
    I don't think you quite understood his scenario. Let's say Vendor X gets a contract to provide a government agency with 800 desktop computers, with Linux, OpenOffice, etc. Meeting a bunch of carefully written specs from that agency's IT department. Vendor X takes Fedora or Gentoo or Debian and customizes it, complete with a "Foo Agency" splash screen, encrypted disk partitions, escrowed bypass for crypto, etc.

    How do we know they didn't plant malware in OpenOffice? What geeks will have access to this binary? Geeks won't even know this mini-distro exists. How much do you know about the Linux being used by Burlington Coat Factory, for example?

    I'm not saying this argument is airtight, just that you didn't really address it.
  • by Negativeions101 ( 706722 ) on Thursday February 12, 2004 @09:12PM (#8265021)
    This is ridiculous. This guy was obviously paid by Microsoft in some way or another. Anyone who knows anything about OSS can tell you that OSS authors (well, those of popular OSS projects) care about functionality and stability. There is no agenda other than to make a good product. What? You think the mozilla team is spying on us right now? Of course not. Anyways, if you trust Microsoft over anything then you might as well kill yourself right now.
  • by adrianbaugh ( 696007 ) on Thursday February 12, 2004 @09:30PM (#8265208) Homepage Journal
    Wouldn't help you against a C compiler hack [acm.org] in the style of Ken Thompson's classic. That's a pretty paranoid example but it does show that to be perfectly secure in your system you do need to know everything about it, from the ground up. Compiling from a known-good source isn't always enough.
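    The gist of Thompson's hack fits in a few lines. This toy model (every name is invented, and the "compiler" is just string rewriting) shows why a clean source tree proves nothing once the compiler binary is dirty:

    ```python
    # Toy model of Ken Thompson's "trusting trust" attack: a compromised
    # compiler plants a back door when it compiles the login program, and
    # re-plants its own trick when it compiles a compiler, so the attack
    # survives even though every source file anyone audits looks clean.

    BACKDOOR = "if password == 'magic': grant_access()"
    SELF_REPLICATOR = "re-insert backdoor logic when compiling a compiler"

    def evil_compile(source: str) -> str:
        compiled = source  # stand-in for real code generation
        if "def login" in source:
            compiled += "\n" + BACKDOOR           # infect the login binary
        if "def compile" in source:
            compiled += "\n# " + SELF_REPLICATOR  # infect future compilers
        return compiled

    clean_login_source = "def login(password): ..."  # no back door in sight
    infected_binary = evil_compile(clean_login_source)
    print(BACKDOOR in clean_login_source)  # False: the source is clean
    print(BACKDOOR in infected_binary)     # True: the binary is not
    ```

    Recompiling from known-good source doesn't save you, because the second branch keeps the trick alive in every new compiler the old one builds.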
  • by borgheron ( 172546 ) on Thursday February 12, 2004 @10:05PM (#8265473) Homepage Journal
    Ron,

    I'm going to discuss some of the more glaring issues with your article below:

    "An old adage that governments would be well-served to heed is: You get what
    you pay for. When you rely on free or low-cost products, you often get the
    shaft, and that, in my opinion, is exactly what governments are on track to
    get."

    Much hullabaloo has been caused by the use of the word Free in Free Software.
    Please remember it's free as in freedom, not cost. Also remember that major
    players such as IBM, HP, and Dell and numerous smaller companies are actively
    involved in the creation and maintenance of Linux. It's not just a hobbyist
    OS anymore.

    "Eventually--and inevitably--an open source product will be found to contain a
    security breach--not one discovered by hackers, security personnel, or a CS
    student or professor. Instead, the security breach will be placed into the open
    source software from inside, by someone working on the project."

    There are known cases where this has happened on closed-source projects.
    Microsoft Windows, in fact, has many "easter eggs" which are basically hidden
    surprises for the user if he/she hits a certain combination of keys. Even
    these relatively minor "jokes in the code" and potential "security problems"
    wouldn't fly in an open source project since, in order to succeed, *all of the
    people involved in the project* would need to be in on the breach.

    Case in point: there was some code which was committed to the Linux kernel a
    while back which would have introduced a security flaw. Within hours of its
    commit to the repository it was caught by the other maintainers, who determined
    it was a mistake, not a deliberate breach.

    "Because anyone can create and market--or give away--a Linux distribution,
    there's also a reasonably high risk that someone will create a distribution
    specifically intended to subvert security. And how would anyone know?"

    Because they can check the source, and most of us who do use Linux would check
    the source. Any "subversive" distribution would quickly be detected by the
    community at large.

    "I'm not naive enough to think that proprietary commercial operating system
    software doesn't have the same sort of vulnerability, but the barriers to
    implementing them are much higher, because the source is better protected. I
    think such a scenario is far less likely than finding a group of people willing
    and able to create and market a malware open source distribution."

    Your assertion here is incorrect. Since there are fewer people in a company
    to actually vet the software out before it gets released, it's much more likely
    that a problem will get out into the wild before anyone catches it.

    Case in point: Microsoft Windows' numerous security bugs. A bug in the IP
    stack of Microsoft Windows is what allowed the CodeRed worm to work its way
    into so many corporate networks all over the world year before last.

    "Who's Watching the Watchers?"

    All of us.

    In summary, I find your article to be another piece of FUD from someone who is
    either unwilling or not capable of fully understanding Free Software or Open
    Source Software. I find it sad that it passes for news on an otherwise
    respectable site.

    Good day,

    GJC

    =====
    Gregory John Casamento -- CEO/President Open Logic Corp.
    -- bheron on #gnustep, #linuxstep, & #gormtalk ----------------
    Please sign the petition against software patents at:
    http://www.petitiononline.com/pasp01/petition.html
    -- Maintainer of Gorm (featured in April Linux Journal) -------
  • by rnturn ( 11092 ) on Thursday February 12, 2004 @10:24PM (#8265641)

    Now where have we heard of them before?

    Oh, yes. They're the ones associated with Darl McBride's infamous code presentation at CDXPO. So I guess if you can't impugn open source development by supporting McBride's inane ramblings, you encourage one of your publications to sling a little mud with old, outdated theories that being able to see source code means that the criminal element will be writing exploits for it or infiltrating the kernel development team and inserting backdoors.

    Yes, sir! At DevX and Jupitermedia, security through obscurity is alive and well.

    I couldn't find a single idea in this ``piece'' (oh, it's a piece alright) that was original or to be taken seriously. I suspect that the author just had a flash (``Ooh! Ooh! `Who will guard the guards?' That's clever! Now I can write an anti-Linux article!'') and saw a chance for his employer to get some web page hits.
