Security

Defending Open Source Security

dpilgrim writes "DevX's A. Russell Jones has thrown down the gauntlet, questioning the security of Open Source software. I've picked up the gauntlet and posted a response over on the O'Reilly Network. As previously discussed on /., Jones' comments are too controversial to ignore."
  • by Denyer ( 717613 ) on Saturday February 14, 2004 @06:06AM (#8278725)
    Inclusion of some other major news sources makes the well-structured argument more credible to outside readers.

    Nice article!

    • by roman_mir ( 125474 ) on Saturday February 14, 2004 @12:06PM (#8280158) Homepage Journal
      (I wrote this yesterday and tried to post it as an article on /., but apparently there are so many more interesting and better written articles posted on the front page here that mine did not meet the qualifications to be posted. Or maybe it is just so off-topic and does not represent any real new ideas or news for nerds, you know, no stuff that matters is expressed in it, so don't read it.)

      I am sure that all of you would agree that the free software community has been facing some bad publicity since the entire SCO incident started about a year ago. I am also sure that when SCO goes away, another publicity stunt will be performed by some other corporation or entity that could potentially cause more trouble. An earlier article [slashdot.org] on /. reminded us that there are other dangers that could stall the development of free software projects: an illegally distributed application source base can become the next battlefield for the free software community. Whether this source code is distributed with an intent to contaminate is not the issue; the issue is that it is important to convey to the public that this community does not want to contaminate its source code with proprietary software. We know that the Linux kernel, for example, is maintained by a group of people who would never want to be faced with the problem of proving in a court of law that their creation is really their own code. What about other projects? How many lawsuits are coming toward this community? I do not know. But I understand that some preventative measures should be taken, measures that will clearly show that this community wants free software, and that free software will not be stolen from other source bases.

      How can this be ensured, and how can it be easily shown in a court of law that this community takes copyright issues seriously? One way that I see is to set up a server that runs ESR's comparator [catb.org], checking any new submission to any open source project against any code released, either by mistake or with malice, by a closed source vendor.

      This will help to identify copyright problems before they arise. Of course, having a proprietary source code base on this server would probably be illegal in itself, but the proprietary source code is unnecessary; all that is needed is a set of hash-keys that identify that source code.

      How could this work? A copyright protection server (CPS) would have hash-keys supplied by different vendors of software that falls into various categories, and the free software projects would also be divided into these categories. Let's say there is a free software project that deals with image manipulation. The CPS would run a hash-key generator on the new code submission and then compare the generated keys with the keys supplied by Adobe or other companies specializing in image manipulation. Of course the closed source companies would have to run the hash-key generators on their code and supply their keys, and someone has to tell them to do that, but if it is done right then the following would happen (a toy sketch of the comparison appears after this list):

      1. The Free Software community would have better protection from inappropriate code submissions.
      2. This can be publicised to show that the Free Software community takes its work seriously and goes to great lengths, much further than any corporation, to make sure that its code is Free and free of inappropriate submissions.
      3. In a court of law this can be very useful, it shows good faith on the part of the free software community.
      4. This would make it easier to also figure out whether the closed source vendors are misusing GPLed software :)
      5. This makes a nice project that can be commercialized (what with all the latest IP propaganda and lawsuits).
      6. This hopefully will prevent many possible infringement claims.
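
      A toy sketch of the kind of comparison such a CPS might run, assuming a naive fingerprinting scheme of hashed overlapping windows of normalized lines (ESR's comparator has its own shred format; all file names below are made up for illustration):

      import hashlib

      def fingerprints(source, window=4):
          """Hash overlapping windows of stripped lines -- a toy 'shred' scheme."""
          lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
          shreds = set()
          for i in range(max(len(lines) - window + 1, 1)):
              chunk = "\n".join(lines[i:i + window])
              shreds.add(hashlib.sha1(chunk.encode()).hexdigest())
          return shreds

      def overlap(submission, vendor_keys):
          """Fraction of a submission's fingerprints matching vendor-supplied keys."""
          subm = fingerprints(submission)
          return len(subm & vendor_keys) / len(subm) if subm else 0.0

      # The vendor supplies only hash-keys, never the code itself:
      # vendor_keys = fingerprints(open("proprietary_module.c").read())
      # print(overlap(open("new_submission.c").read(), vendor_keys))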

      Well, this is just a thought, but I think this kind of verification will become part of reality at some point in the future, given more lawsuits.

      Any thoughts, comments, suggestions, ideas?

      • by Pharmboy ( 216950 ) on Saturday February 14, 2004 @08:24PM (#8283151) Journal
        While I agree with your logic, my only question is: What is in it for the proprietary software companies? Why would they produce hashes that protect open source projects, when open source projects could put them out of business?

        I would not expect them to cooperate with this. This sets up Microsoft to sue, just like SCO, over any kernel after 2.6. The difference is they have the money to sway the opinions of average persons who are not nerds. And they outnumber us 10-1. They may not be looking for a knockout punch; they may be looking for a long, dirty slog.

        What if they were trying to do this:

        Instigate a problem with SCO and Linux; offer a large chunk of change to license some unlimited Unix rights, when all they wanted was the unlimited rights, so they couldn't ever be sued. What if they are working on a BSD project that is closed source at the OS level, but runs all the free stuff they won't have to support? They put an XP-like desktop on it using their own proprietary APIs, and make updates about as easy as their current Windows Update program. And you can update in a console as well. What if.

        Now, I'm not ready for a tinfoil hat, but I can't help but wonder. They have more experience with SCO Unix than anyone other than SCO (Xenix, anyone?). They have used BSD code before, and still do (ftp.exe). They are the largest software company in the world, extremely profitable, with access to resources we can only dream of. And they are still hungry.

        This is why I have my doubts about companies providing hashes to help open source authors.
  • by maharg ( 182366 ) on Saturday February 14, 2004 @06:07AM (#8278729) Homepage Journal
    .. one example of which is: "This will happen because the open source model, which lets anyone modify source code and sell or distribute the results, virtually guarantees that someone, somewhere, will insert malicious code into the source." Yes, as we all know, *anyone* is free to modify the source code and then sell or distribute it, and we're all such trusting souls. Only this morning I chmod +x'ed and executed a binary (as root) which I had earlier accepted from a kindly stranger. More FUD, methinks.
    • by cperciva ( 102828 ) on Saturday February 14, 2004 @06:17AM (#8278753) Homepage
      and we're all such trusting souls

      I'm providing binary security updates for FreeBSD. The Project publishes source code patches (and adds them into the CVS tree); I take those and build binaries, in order to help people who cannot or don't want to build updated binaries themselves.

      Thousands of people have used updates I've built; nobody has ever emailed to ask "who are you, and why should I trust you?"

      We may not be *all* such trusting souls, but there are an awful lot of trusting souls out there.
      • by maharg ( 182366 ) on Saturday February 14, 2004 @06:30AM (#8278789) Homepage Journal
        Yes, there are millions of trusting souls out there who (if they have even considered the issue) perceive themselves to have no *choice* but to trust the Microsoft Corporation. Your site appears to be reputable, and you presumably have nothing to gain by publishing malware. I think you have to some degree missed the point of the article, which talked about high-security applications of computing, such as national security et al. To say that trusting a single corporation which will not let you see the "ingredients" is more secure than having a choice of sources, compilers and so on is naive at best, IMO.
      • by Tony-A ( 29931 ) on Saturday February 14, 2004 @07:02AM (#8278865)
        Thousands of people have used updates I've built; nobody has ever emailed to ask "who are you, and why should I trust you?"

        Sure you could do something nefarious, but why would you? Seems like somehow you'd have a lot more to lose than to gain.

        Since you have no control over, and not much knowledge of, who downloads what when, it seems utterly fantastic that you'd use those binaries to target your enemies.

        Somebody compiles his own binaries. It should be fairly normal for him to download your binaries and see how his stack up against yours. If there's something strange about yours, he's likely to try to find out what and why, and unlikely to keep quiet if he finds any evidence of something wrong.

        It's not that I trust you or don't trust you. I'm sure that I can trust you a lot more than I need to trust you. If I have to ask why I should trust you then I probably should not trust you. Either way, I don't ask. If I did ask, I have no idea of any answer you could give that would cause me to trust you. It's more like I'd trust you because the binaries are there than that I'd trust the binaries because I trust you.

        • It's not that I trust you or don't trust you. I'm sure that I can trust you a lot more than I need to trust you. If I have to ask why I should trust you then I probably should not trust you. Either way, I don't ask. If I did ask, I have no idea of any answer you could give that would cause me to trust you. It's more like I'd trust you because the binaries are there than that I'd trust the binaries because I trust you.

          Geez. I was able to follow what you said until this part. Now I'm feeling dizzy.

        • On the other hand (Score:3, Insightful)

          by Benm78 ( 646948 )
          I think the parent has a point. It would be quite easy to exploit people who trust your binaries. When they download a precompiled binary from your system and install it, they could actually be installing a very big backdoor on their system.

          To make things worse, the one who offers the malicious binaries can easily log the IPs from which they were downloaded. Many people will download directly to their server using wget, and then install the binaries.

          If people then omit to verify the integrity of the binaries

        • by KoolDude ( 614134 ) on Saturday February 14, 2004 @08:24AM (#8279064)

          Thousands of people have used updates I've built; nobody has ever emailed to ask "who are you, and why should I trust you?"

          Sure you could do something nefarious, but why would you?


          Moreover, wouldn't a criminal be more willing to do something nefarious if the source was closed rather than if it was open? I think open source programs are inherently more secure from criminal acts because the risk of getting caught is much higher in open source programs, due to the constant peer-review process.

          • by Anonymous Coward on Saturday February 14, 2004 @09:23AM (#8279260)
            You've apparently never been a virus author or cracker, or dealt with cleaning up a business site after them, have you? They break in because they *can*, partly as proof of their "genius", or because they want to steal resources (such as big bandwidth and FTP space) for their own use.

            These jerks can, and do, break into developers' home and business machines and steal or modify code to plant bugs. The wonderful thing about open source is that the open code review *finds* these damn things, and the huge variety of source repositories and approaches to checking them makes it almost impossible to slip in a back door undetected. And the openness of the user community gets the warning out to the rest of us extremely quickly, rather than the typical corporate software situation where a problem gets described to the vendor and ignored for many months or even years, until it starts being actively used for a wide-scale virus.

            Unfortunately, closed source also frightens people away from applying patches to closed source software, because you can't verify what else was patched, and patches *do* often break core programs. So avoiding patches becomes corporate policy to protect the stability of your servers, as opposed to correcting issues when they are discovered.

            And security issues *will* be discovered. No system as complex as a large-scale web server or mail-server can be created entirely without bugs.
        • by Anonymous Coward
          It's not that I trust you or don't trust you. I'm sure that I can trust you a lot more than I need to trust you. If I have to ask why I should trust you then I probably should not trust you. Either way, I don't ask. If I did ask, I have no idea of any answer you could give that would cause me to trust you. It's more like I'd trust you because the binaries are there than that I'd trust the binaries because I trust you.

          ...so I can clearly not choose the wine in front of you!
      • by I confirm I'm not a ( 720413 ) on Saturday February 14, 2004 @07:14AM (#8278895) Journal

        The impression I formed from the DevX article was that it was aimed at government (and I suppose you could argue that it might influence large corporations, too).

        In my experience government and corporate IT admins are *not* trusting souls. As an example, I once worked as a contractor for an agency that built software for the UK health service: everything I built was then reviewed and recompiled by in-house staff. The manager told me that they preferred open-source precisely because of the ability to review source code. Cost was only a secondary factor.

        The same manager also commented that security-through-obscurity - relying on closed source to deter evil-doers - was not an acceptable option, as it placed too much reliance on third parties.

        • by kfg ( 145172 ) on Saturday February 14, 2004 @09:07AM (#8279191)
          If I may be so bold as to quote from the Armadillo Book as to how to go about using Open Source code with minimal risk:

          Always build the program from source code. Don't even consider running pre-compiled binaries.

          This is just one item on a long list of how to build secure code.

          Other items include:

          Look over the source code to as great an extent as you can. . .

          Examine the archive before unpacking it. . .

          Examine the objects created by the build process with the strings command. . .

          There's no need for grandma to go through all of this, but in any situation where security is an issue you'd have to be pretty daft to simply trust a compiled binary. Especially if you're a government agency handling sensitive data. . .and especially if that compiled binary came from outside your national borders and stores its files in binary form.

          If you're a French diplomat using MS Word to write sensitive missives back home, you're just begging for the CIA to pore over the hidden information in the binary of your document.
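
          A rough sketch of automating two of those checks (examining the archive before unpacking, and running strings over the built objects), assuming the GNU strings tool is on the PATH; the file names and the list of suspicious tokens are hypothetical:

          import subprocess
          import tarfile

          def unsafe_members(archive):
              """List archive members that would escape the extraction directory."""
              with tarfile.open(archive) as tar:
                  return [m.name for m in tar.getmembers()
                          if m.name.startswith("/") or ".." in m.name.split("/")]

          def odd_strings(obj, needles=("http://", "/etc/passwd", "LD_PRELOAD")):
              """Run the strings command over a built object, grep for odd tokens."""
              out = subprocess.run(["strings", obj], capture_output=True,
                                   text=True, check=True)
              return [s for s in out.stdout.splitlines()
                      if any(n in s for n in needles)]

          # Hypothetical usage, before unpacking and after building:
          # print(unsafe_members("httpd-2.0.48.tar.gz"))
          # print(odd_strings("./httpd"))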

          KFG

      • I think they trust each other, not you. They trust that if you slipped a mickey into the code, it would come out. Then you would be branded. They trust the system of communication that is the internet. I've used precompiled binaries on occasion for difficult projects, such as multimedia ones on Linux. I'm suspicious too.

        I'm quite a bit less suspicious when using software from RedHat. Though I emailed them twice about their up2date upgrade downloads' md5sums not matching the binaries, on some of their down
      • Are any of these people working for a high-security governmental organisation? The assertion made in the original derogatory article was that dubious binaries would be run by governments.
      • by kfg ( 145172 ) on Saturday February 14, 2004 @08:36AM (#8279105)
        You mistake the point of the original criticism.

        His suggestion was that someone supplying code to a single entity could corrupt that code, making it differ from that publicly available, thus the "many eyes" wouldn't catch the difference between the code with the back door and the code from a public site. The code was never publicly distributed at all. It was "fake" open source.

        You create publicly available code, so while you may put in a back door there are still many, generalized, eyes that have a chance to see it and raise the alarm.

        The scenario has nothing to do with simply corrupting an ftp site.

        And of course, the solution for a government concerned with issues of national security is to always build from audited source taken from multiple public download sites and checked against each other.

        This doesn't ensure that you won't get nailed by corrupt code, only that everyone in the world gets nailed by the same code, and so the "many eyes" argument of Open Source security applies.

        The backdoor gets found and patched.
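
        Checking multiple public download sites against each other can be as simple as the sketch below: fetch the same release from several independent mirrors and refuse to build unless every copy hashes identically. The mirror URLs here are made up:

        import hashlib
        import urllib.request

        MIRRORS = [  # hypothetical; pick mirrors run by unrelated parties
            "https://mirror-a.example.org/pub/foo-1.0.tar.gz",
            "https://mirror-b.example.net/pub/foo-1.0.tar.gz",
            "https://mirror-c.example.com/pub/foo-1.0.tar.gz",
        ]

        def digest(url):
            """SHA-256 of the tarball as served by one mirror."""
            with urllib.request.urlopen(url) as resp:
                return hashlib.sha256(resp.read()).hexdigest()

        hashes = {url: digest(url) for url in MIRRORS}
        if len(set(hashes.values())) != 1:
            # One mirror is stale or tampered with: audit before building.
            for url, h in hashes.items():
                print(h, url)
            raise SystemExit("mirror mismatch")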

        KFG
      • by fizbin ( 2046 ) <martin@ s n o w p l o w . org> on Saturday February 14, 2004 @09:58AM (#8279416) Homepage
        There's this guy I know from college who's written a free (as in beer) game for Windows. (Maybe you've heard of it [jardinains.com]; he also spends too much time on slashdot [slashdot.org]). Tens of thousands of windows users have downloaded it (according to webserver logs) and (presumably) run it on their machines, almost all of them (presumably) while logged in as administrator or equivalent. (At one point, it got farked [fark.com], and is still getting referer hits from there)

        Sure, you've found a patch of very trusting FreeBSD users. However, I'll bet that this one stupid Windows game is downloaded and run with full privileges, with no safety checks at all, by a hundred times as many people.
      • Well, I think there are other trust models that the Open Source movement can invoke. I trust that the treatment options given to me by my doctors are safe because the studies documenting those treatments are published for peer review. I trust my doctor even more if the treatment has been on the market for a few years.

        It seems interesting that nobody argues that X-ray radiography would be safer if the methods for producing an X-ray radiograph were trade secrets held by individual companies.
      • Signatures (Score:3, Insightful)

        by xant ( 99438 )
        I'm not sure how FreeBSD does it, but I know how Debian does it, and the fact that those people can find out what your email address is implies that the binaries you provide are cryptographically signed. This means that you are responsible for their integrity. You could certainly insert a big backdoor, but once they found out, they'd know who did it! They don't ask who you are because they don't need to know; you're the guy who's gonna get crucified if there's a problem.

        A lot of large closed-source soft
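
        Verifying such a signature comes down to something like this sketch, shelling out to the gpg command line; the file names are hypothetical, and note that a valid signature only ties a file to a key - deciding to trust that key is a separate step:

        import subprocess

        def signature_ok(package, sig):
            """True if the detached GPG signature over the package verifies.

            Assumes the signer's public key is already in the local keyring.
            """
            result = subprocess.run(["gpg", "--verify", sig, package],
                                    capture_output=True, text=True)
            return result.returncode == 0

        # Hypothetical usage:
        # signature_ok("update.tar.gz", "update.tar.gz.asc")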
    • Well, can you trust the contributors? Can you trust the entire core team? What if someone also participates with a fake identity, and uses both the legit and the fake identity to insert exploits, with the legit ID saying "I've checked out his patch, it seems ok", occasionally fixing a trivial error, etc.? Maybe not very likely, but it's definitely possible.

      The whole "Many eyes makes the problem shallow" only works if everyone is equally skilled, and hopefully as skilled as the potential exploit creator. There's a
      • by Tony-A ( 29931 ) on Saturday February 14, 2004 @07:46AM (#8278964)
        The whole "Many eyes makes the problem shallow" only works if everyone is equally skilled

        Totally wrong.

        The advantage of many eyes is that they are different eyes. The problem is only visible if it is viewed from the right angle, in the right lighting, etc. The skill sets required to identify that a bug exists, to identify what the bug is, and to actually fix the bug are all very different.
    • Missing the point (Score:3, Insightful)

      by starshot ( 750940 )
      He totally misses the point of open source. He's focusing on its freeness. The fact that something is free, put together by a team of coders who devote their time to the project solely because of their loyalty and love for it, does not mean that it is lower quality.

      This will happen because the open source model, which lets anyone modify source code and sell or distribute the results, virtually guarantees that someone, somewhere, will insert malicious code into the source.

      The open source model also guar

    • Grapes (Score:3, Insightful)

      by paiute ( 550198 )
      This will happen because the open bag model, which lets anyone look into the bag and sell or distribute the grapes, virtually guarantees that someone, somewhere, will insert spiders into the grapes.

      If you can see the grapes and the bag is transparent, then any spiders can be removed. If the grapes are sold in a can and you have to eat them in the dark, you might swallow a spider.

      I don't want the (original) author to be shopping for my produce.

    • by Hooya ( 518216 )
      i didn't read the article. is the devX author talking about the NSA backdoor in windows [slashdot.org]? he's totally right. we cannot trust a vendor. the vendor could have inserted backdoors, hyper-eastereggs. who would know? the guy is right on. vendors cannot be trusted. i mean, every 'update' and a 'security fix' could in fact be a wider backdoor for the NSA to tap into. how would you test it? i mean, come on. how do you know that your personal information isn't being submitted under the guise of "in order to improve t
  • by Anonymous Coward on Saturday February 14, 2004 @06:08AM (#8278730)
    The responder's best point is the last: if you trust software from some unknown project or company, who knows what you're getting. But trusting a major player such as Apache, you can be at least as sure (if not more so) that you're getting good, stable, secure software as with anything shipped from Redmond.
    • by tigress ( 48157 ) <rot13.fcnzgenc03@8in.net> on Saturday February 14, 2004 @06:35AM (#8278803)
      Playing the devil's advocate here, you can trust source from Apache yes, but can you trust a precompiled Apache HTTPD from ACME GPU/Linxu?
      • by thelen ( 208445 ) on Saturday February 14, 2004 @07:04AM (#8278872) Homepage

        can you trust a precompiled Apache HTTPD from ACME GPU/Linxu

        Nope, but you also cannot trust Thugs R' Us Locksmiths.

        OSS commoditizes software: it devalues code in exchange for freedom of collaboration, the ability to build on others' successes, probably a greater amount of software overall, and I would argue, a faster development cycle. The author of the original article apparently thinks that this is a detriment because it makes it easy to start a malicious company like ACME GPU/Linxu to sell a forked open source product with intentional security holes.

        But we're used to this problem in other industries where products become commonly available and people can form their own businesses utilizing those commodities. And while there *are* scams, most of us accept that we need to exercise judgment in whom we trust. Anyone can go out and buy locksmithing equipment, but if you skip over a known, reputable and trusted vendor in favor of the cheaper 'Thugs' alternative, you get what you deserve: a lock with more keys than you know about.

        • Nope, but you also cannot trust Thugs R' Us Locksmiths.

          Actually, most locksmiths are bonded and advertise their bonded status. This provides stronger incentives for honesty than for breaking into your house.
      • by gweihir ( 88907 ) on Saturday February 14, 2004 @07:06AM (#8278879)
        ...but can you trust a precompiled Apache HTTPD from ACME GNU/Linxu?

        Not strictly. Yes, if ACME has a long enough and known history, you can assume they are honorable. No, there could still be backdoors in there.

        But you know what? You can get part or all of your distro from somebody else! And since it is GNU, if somebody claims ACME has backdoors, you can check for them in the source (if they are there) or compile from their source yourself (if they are not).

        That is actually a major advantage of compilable open source: patches can be source patches, and you can see and verify for yourself what the vulnerability was and how well it was fixed. In addition, you can fix things that don't exactly match the patch. I, for example, run Debian with a self-compiled XFree86 4.3.9x (Radeon 9600XT). The published patch for the recent font-related buffer overflow does not apply cleanly to those sources. But it is very easy to see what the patch does and to change the sources accordingly. Took me about 20 minutes (+recompile) to patch it manually.

        With closed-source patches you never know whether they are actually fixing the problem or whether they also do other stuff. All the fake "MS patches" in email also show that it is a good thing when people can verify what patches do. And it gives strong motivation to come up with a minimal, elegant patch as well, since people can see it!
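
        Seeing what a patch would do before trusting it can be scripted along these lines; a minimal sketch using GNU patch's --dry-run option, with a hypothetical patch file name:

        import subprocess

        def dry_run(patch_file, src_dir):
            """Report which hunks of a source patch apply cleanly, touching nothing."""
            with open(patch_file) as f:
                result = subprocess.run(["patch", "-p1", "--dry-run"],
                                        stdin=f, cwd=src_dir,
                                        capture_output=True, text=True)
            return result.stdout + result.stderr

        # Hypothetical usage; hunks that fail are then carried over by hand:
        # print(dry_run("font-overflow.patch", "xc/"))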
      • by no longer myself ( 741142 ) on Saturday February 14, 2004 @07:11AM (#8278888)
        It's both a valid and interesting point, but how many times do we have to keep second-guessing ourselves over the security of software? In general, it boils down to "who do you trust?"

        In my case, I see it as, "Do I want to trust a company whose only interest is in generating a profit, or do I want to trust the broader base of humanity who wants to create an open and free system?"

        Admittedly I've got a tin-foil hat collection to rival any slashdotter's, so I'll try to advocate for the devil as well with "Do I want to trust some band of amateurish zealots who lack a clear and unified mission statement, or do I want to trust a company that has shown an exceptional degree of responsibility by having a track record of producing enormous profits?"

        Obviously both have appealing merits. So "who do you want to trust today?" (TM)

        We all have our heroes into which we place our faith, and nobody likes to be let down by a hero. For some it's the almighty dollar, for others it's their faith that deep down, humanity tends to be good.

        --
        Yes, I'm biased.

      • This _already happens_ in the world of closed source software. This is basically what mal/spyware is. It's totally true that open source doesn't prevent this, but neither does anything else. It's a straw man argument.

        On a side note, it's happened with OSS, too - some enterprising asshole packaged the open source CDex ripper into an installer loaded with spyware.

      • Since you can *always* download the code from the Apache group's servers, why would you ever need to trust this ACME company?
  • by heironymouscoward ( 683461 ) <heironymouscowardNO@SPAMyahoo.com> on Saturday February 14, 2004 @06:09AM (#8278732) Journal
    Heironymous' Prime Law of Journalism:

    Opinions are valued in inverse relation to the amount of money paid to produce them.

    In this case, the opinion that transparency is bad for security is of so little value that it's difficult to answer it with a serious tone.

    After all, Windows is remarkable for its security compared to something like OpenBSD, known for its secretive and opaque practices.

    lol.
    • Let us also not forget that Windows "leaks" have occurred recently. And remember last year, when there was a question about the code being infiltrated? Leaks can go both ways.

      I like the ability to personally verify any rumors I hear about the code, or pay someone else to. OSS offers this; closed source does not.

      Also, when you have the illusion of security, you tend to be less diligent. I argue OSS has stronger code checks for major projects because of the nature of the code. For instance, the Linux kern
  • by darnok ( 650458 ) on Saturday February 14, 2004 @06:14AM (#8278742)
    Now that the MS source for NT 4 and Win2k is "out there", even if only in part, we'll have a good chance to see exactly how secure it is over the next several months.

    Anyone want to bet that the number of exploited Windows security holes is NOT gonna soar?
    • by Black Parrot ( 19622 ) on Saturday February 14, 2004 @06:54AM (#8278847)


      > Now that the MS source for NT 4 and Win2k is "out there"

      Which suggests the argument that even if your code isn't "Open Source" it may still be "open source", so even if source availability is a security handicap, the field may still be more level than closed source shops would like to think.

    • Windows is already hackable and riddled with security holes. How many barn doors there are isn't going to change the number of chickens that escape.
      The limit of security is not a technical one, it is a human one: how many sociopaths bent on the destruction of innocent bystanders are there? No doubt there are a few, and no doubt the networked nature of the internet gives them leverage disproportionate to their numbers; however, more ways of committing the same heinous hacks isn't going to make much impact on their influ
    • by gweihir ( 88907 ) on Saturday February 14, 2004 @07:17AM (#8278902)
      Now that the MS source for NT 4 and Win2k is "out there", even if only in part, we'll have a good chance to see exactly how secure it is over the next several months.

      To tell you the truth, I am not interested. Why should I look at parts of a badly structured, feature-infested, bug-infested monolith of an OS, when I can at the same time find out how to do it right by looking at the sources of the Linux kernel or one of the open-sourced BSDs? Why would I actually want to read bad code?

      True, some people will actually spend the time to find vulnerabilities. Some of them (especially those in military and commercial espionage) will not publish what they find. But I suspect these people already had this kind of access before. And the usual script-kiddie losers do not have the competence to understand the sources anyway.

      One thing could happen though: Too many published and still current vulnerabilities for MS to fix. Or even worse, vulnerabilities they cannot fix because they made bad design decisions. Will be interesting to watch.
      • Why should I look at parts of a badly structured, feature infested, bug infested monolith of an OS?

        When I ran out of gas over on 520 and found myself walking down 156th Ave NE in Redmond, I asked myself this same question. The answer, right there in the heart of Microsoft, presented itself. Some well-dressed, clean-cut dude came out with a CD and said "Here's the source for Windows XP." I said "What the fuck am I gonna do with that?" You know what he said?

        "You'll learn how not to write code."

        Part of m

    • Anyone want to bet that the number of exploited Windows security holes is NOT gonna soar?

      Yeah, I'd take that bet.
      For a baseline, there is a trend going back to Melissa that indicates an ever-increasing level of malware. "Soar" is above that baseline.

      The bad guys have every reason to use the newly exposed source.
      The good guys have every reason to avoid the newly exposed source.
      Still, there should be a few cheap hacks so that my computer does what I want it to do instead of what Microsoft wants it to do.
      My be
  • Huh? (Score:5, Insightful)

    by Dan Farina ( 711066 ) on Saturday February 14, 2004 @06:15AM (#8278747)
    I fail to see how his logic works.

    Because I can view the source code and change the source code, I can introduce a flaw. Yet it would be far less likely for a for-profit closed source project to be swayed by some sort of ulterior motive to include a flaw, because we have seen exactly how ethical and steadfast corporations are in this modern day and age.

    It seems that he doesn't acknowledge that the aspect that makes open source secure is that it's hard to have a unified, systematic, malevolent agenda, due to the extensive peer review inherent in the system. People who have different agendas or motives than you will be viewing your changes.

    While his hypothesized scenario is certainly possible, I wouldn't go so far as to say it is a bane.
    • Re:Huh? (Score:3, Interesting)

      What about what happened when the source tree was compromised, and someone added a line of code that didn't look all that bad until further investigation showed it gave programs root access? I remember they asked for MD5 sums and were able to track down the root of the problem, but what if someone better were able to modify something on a system such as that without notice?
      • Re:Huh? (Score:5, Insightful)

        by thelen ( 208445 ) on Saturday February 14, 2004 @06:49AM (#8278829) Homepage

        That's a different problem than the one suggested by the original -- and badly misguided -- article. In the case you mention, a security breach allowed unauthorized alterations to the codebase. And of course after any such intrusion a full code review is a necessity regardless of your development model.

        The argument presented though is predicated on the "core developers" of a project intentionally creating a secret fork of the source containing security holes and using that compromised branch to build binaries. Of course this threat is equally if not more likely to occur in closed source products, and so the author presses his case with the scenario of a no-name company being formed to sell compromised open source products. Somehow we're asked to believe that the virtue of OSS -- the ability to build off of others' work -- is actually a security liability because of the ease of creating a malicious startup. Never mind that any IT manager who chooses to use the binaries from an unknown software vendor, especially if verifiably pure source is available, is clearly being negligent.

  • by file-exists-p ( 681756 ) on Saturday February 14, 2004 @06:22AM (#8278763)
    There is no doubt it may help someone to break into your system if he has the source code of your OS and various daemons. Fortunately, when it's open source, we can hope bugs allowing bad guys to break in have been spotted by nice guys before, and patched.

    The real problem would be if only bad guys had your source code.... that would really suck. If, for instance, there was a leak of your source code on the internet, and of course only bad guys would look at it (because others do not give a shit), then you would get only the bad part of the openness...

    Yeah, that would suck. That would really suck.

    --
    Go Debian!!!
  • Looks like... (Score:5, Insightful)

    by deitel99 ( 533532 ) on Saturday February 14, 2004 @06:23AM (#8278766)
    Slashdot is feeding the troll. The original article claims to be a balanced warning about OSS, but a little research shows all his points to be wrong.

    Just another journalist trying to make a story, people - move along.
  • by FauxReal ( 653820 ) on Saturday February 14, 2004 @06:23AM (#8278767)
    "Open Source Is Fertile Ground for Foul Play". Average Rating: 1.2/5

    The rebuttal "Who's Guarding the Guards? We Are" [devx.com] , also hosted at devx. Average Rating: 4.9/5

    • by Anonymous Coward
      and the funny thing is that the first (anti-open source) article was written by the Executive Editor of DevX, and this rebuttal was written by "a Senior Engineer at DevX"
  • this is pathetic (Score:5, Insightful)

    by pytheron ( 443963 ) on Saturday February 14, 2004 @06:23AM (#8278769) Homepage
    There are a handful of ways that malicious code can make its way into open source and avoid detection during security testing


    Let's see.. the most (un)likely way is that someone hacks a host server, mods the code and then updates the MD5 sums. Stupid. All major Open Source projects know how to protect their codebases by holding offline checksums and isolated codebases. This is too unrealistic to happen these days, if you actually care about verifying what you just downloaded and are about to compile.


    Instead, the security breach will be placed into the open source software from inside, by someone working on the project.


    Laughable. Absolutely ridiculous!! Can this not happen in closed source environments? A disgruntled employee, perhaps? I'm sure the article writer would say "but there is quality control, peer review.." I suppose that never happens in Open Source.. I mean, how can we actually review the code when it's publicly available? Oh, that's right.. we can. Open Source peer review is brutal at the best of times!

  • He might be right... (Score:2, Interesting)

    by kyshtock ( 608605 )
    I believe he's right... if he means proprietary source code that finally goes in the wild. The moment code opens, troubles are waiting to happen. If some recent events ring a bell, that's not my fault :)

    On the other hand, if he means code that's been built openly... damn, what's better than having the software AND the source code for inspection? how do you beat that?

  • by pytheron ( 443963 ) on Saturday February 14, 2004 @06:28AM (#8278783) Homepage
    Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public.


    So.... it's not Open Source then. Way to let the hot air out of your puffed-up argument.

  • by SmallFurryCreature ( 593017 ) on Saturday February 14, 2004 @06:29AM (#8278787) Journal
    I saw the post on this idiot's article right below the post on the MS source leak.

    So GNU/Linux source has been out for decades. Windows source has never been out, except recently. Shall we do an exploits-in-the-wild count? Note the "in the wild" part. It is a distinction that anti-virus researchers make, as there are some pretty nasty computer viruses that have only been spotted in their labs, not on people's PCs.

    Every now and then some idiot is going to stand up and proclaim something really stupid. Instead of gently leading that person to proper care and attention, in the form of a straitjacket and a handful of pills, people print their ravings.

    This guy is one of them. Open source vs closed source means very little when it comes to security. Big holes can be and have been found in both. What matters is how you respond to those holes. Open source GNU/Linux is pretty fast. Closed source Microsoft is goddamn slow. So? MS is hardly the only closed source company. If someone ever posts figures on the commercial Unixes, or OSes like Symbian, showing the same terrible performance as MS, then I will be impressed.

    So far all the MS exploits prove is that they have some pretty sloppy working methods in Redmond, not that closed source itself is bad. If all closed source projects had the same track record as MS, that would be news. They don't.

    HOWEVER, open source has proven itself. Countless projects use it: the Linux kernel, the GNU toolset, KDE and GNOME and all the other desktops, TRON (the OS blueprint from Japan), Apache, MySQL and Postgres and the Berkeley databases, BSD (even though it is dying), and countless others.

  • by tigress ( 48157 ) <rot13.fcnzgenc03@8in.net> on Saturday February 14, 2004 @06:31AM (#8278794)
    I was recently involved in a project where a large Swedish car manufacturer migrated to a corporate-wide client platform. The operating system was supplied by a major American software company, packaged by a major American computer manufacturer, reviewed and further packaged by the car manufacturer's parent company, and finally tailored for local requirements by one of our teams.

    At any one of those stages, a hacked binary could've been introduced into the operating system. To modify a binary, even without access to the source code for said binary, is a trivial task for anyone with a rudimentary knowledge of assembler.

    Proprietary code does not, in any way, prevent malicious code from entering the system. One of the points in the original article was that a malicious distribution could be specifically tailored for and marketed to, for instance, a government. My example above shows how a proprietary code operating system can be used in a similar way, and this time without any source code to check against.
    • To modify a binary, even without access to the source code for said binary, is a trivial task for anyone with a rudimentary knowledge of assembler.

      And closed source makes it trivial to keep anyone else from knowing that the binary has been modified. Anyone along the line can inject a backdoor or trojan.

      It will be interesting to see how Microsoft fares with some of their source gone public. There is a trend dating back to Melissa that suggests an ever-increasing level of malware. My own prediction is that
  • by Anonymous Coward on Saturday February 14, 2004 @06:31AM (#8278796)
    First off, malicious hackers have day jobs.

    Lots of times they are professional programmers who like to play "games" on the weekends and in the evenings.

    MS's source code is like a prostitute: it gets around and around, to whoever has the money to afford it. To say that it never fell into the hands of a "bad man", even through legitimate means, is foolish.

    People spend months and months researching and setting up specific attacks. Sometimes the stakes are worth hundreds of thousands of dollars when it comes to corporate espionage and trade secrets.

    Now, most hardcore hackers, even if they do have access to the source code, definitely aren't going to advertise it on warez sites and post their findings on slashdot. Their time is worth money/fame/insane pride to them too.

    This latest release of the Windows source to warez-style groups is definitely NOT the first or the last time the source code to your programs has been available to people you don't trust.

    In Open source:
    The developers have the source. The crackers have the source. YOU have the source.

    In Closed source:

    The developers have the source. The crackers have at least partial access to the source. You're screwed.

    It may be a subtle difference, but also think about this:

    How many disgruntled employees piss in their boss's coffee? Or at least spit? Or use stale water (if they are pussies)?

    Now how many programmers are entirely "there"?

    Do you want your application to be the pissing ground for angry employees? Can you tell?

    No, of course not; there have been plenty of cases of otherwise perfectly good programs having security holes and backdoors planted in them by programmers.

    You think it's going to stop because Bill Gates says it isn't so?
  • by real_smiff ( 611054 ) on Saturday February 14, 2004 @06:33AM (#8278799)
    An old adage that governments would be well-served to heed is: You get what you pay for
    Right, next story. (Anyone who starts with an outdated & meaningless saying is not going to have anything valuable or new to say. We all have better things to do than entertain this rubbish.)

    And /., can you stop reporting this? It's basically one huge troll, and it only encourages people like him.

    BTW, Mr. Jones, the choice isn't open vs. closed, it's open vs. possibly leaked. Yah. Nice. Please go away.

  • by Anonymous Coward on Saturday February 14, 2004 @06:37AM (#8278807)
    The guy has a trimmed beard! A trimmed beard!! No open source has ever touched him, or his facial hair would be reaching for the keyboard!
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Saturday February 14, 2004 @06:47AM (#8278826) Journal
    As previously discussed on /. Jones' comments are too controversial to ignore.

    On the contrary, this type of comment is exactly the kind you have to ignore. It is simply a mindless, fact-defying -1 troll.

    I mean, when you can see after a quick glance that the author obviously did the research and ignored all the facts that didn't support his thesis, there's nothing you can tell him that will make him apologise, admit to a mistake, or anything like that.

    When you see additional rhetorical manipulations (e.g. things that are insinuated but not stated straight, guilt by association, or proof by analogy), you already know that the point of the article was purposeful manipulation.

    For some people operating systems, computer vendors, open vs closed source, GPL vs BSD are religious matters, and you don't want to get into discussing beliefs with a religious fanatic.

    Robert
    • Still worthwhile. (Score:3, Insightful)

      by Denyer ( 717613 )
      The writer of the article may never recant, but he can be highlighted as being an ignorant fool by a calm, intelligent rebuttal.

      It's worth supporting things you believe in when the alternative is to let lies and FUD spread uncontested. It's particularly worthwhile for the benefit of those in the slightly wider audience who aren't generally informed about tech matters, and who might otherwise be swayed by rhetoric.

  • In other words, people will get it on their own. It is easy for a casual observer to train him/herself up on the facts and make their own judgement about whether security efforts have gone into OSS, and whether they will pay off. Somebody just saying "ooh, watch out" might give them pause -- but they can experience it for themselves.

    The facts will (or will not) speak for themselves.
  • My rebuttal :) (Score:5, Insightful)

    by fucksl4shd0t ( 630000 ) on Saturday February 14, 2004 @06:56AM (#8278851) Homepage Journal

    I realize I'm preaching to the choir, but here goes:

    So far, major Linux distributions such as Debian and others have been able to discover and remedy attacks on their core source-code servers. The distributions point to the fact that they discovered and openly discussed these breaches as evidence that their security measures work. Call me paranoid, but such attacks, however well handled, serve to raise the question of whether other such attacks have been more successful (in other words, undiscovered).

    And do closed-source companies that sell server software of any kind advertise when they themselves get breached? He raises the question of other undiscovered attacks, but he forgets to point out that Debian discussed its attack publicly because part of the open source model is "open". This same shit happens to closed source companies, they just don't tell anyone about it. The real question here isn't whether or not Debian was breached in undiscovered fashion. It's whether or not we'd even know if a closed organization was breached, and his question of the purity of the source code is even more pertinent to a closed organization than to an open one. That's what 'open' is all about.

    Therefore, security problems for governments begin with knowing which distributions they can trust.

    Security problems for governments exist because of negligence, for the most part. More below.

    This (hopefully potential) problem isn't limited to open source software, but open source certainly has far fewer inherent barriers than commercial software. The easier it is to access the source code, alter it, and then recompile it for custom uses, the more likely that it will happen--and then you have no security. Any security checks performed on the software before the source is delivered are invalid.

    Ok, he needs a lesson in reading comprehension, or he needs to hire a lawyer to interpret the GPL for him. Because as we all know, and love, the GPL requires that the source used to make the binary you have just distributed be made available to the person you gave it to. So let's say I fork RedHat and patch it with backdoors and crap. Then I sell it to, hmm, let's say the FBI, and they go to implement it. Since the FBI is well-known for security procedures (ha!), they decide they want to check the binary I gave them against the source I gave them. (Of course, I gave them the source without the patches) So they ask me what compiler I used, and what build tools I used, flags and so forth. I tell them. They compile the source I gave them and compare it to the binary, and I'm in trouble. I've committed copyright infringement, and we all know from years of FBI warnings what that means exactly. The simple fact is, he's trying to apply security policies that shouldn't be applied in an environment that requires the level of security he describes. What kind of FBI security policy would approve the use of open source without requiring it to be audited? Furthermore, what kind of government organization would purchase mission-critical software from a no-name company? Especially when there are a few reputable large companies available to give it to them.

    He ignores the GPL quite blatantly here, and that is the government's insurance that the binary they run will be as secure as they can make it.
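
    At its simplest, that compile-and-compare check is just hashing the two artifacts, as in the naive sketch below (paths hypothetical). Keep in mind that a byte-identical rebuild assumes the same compiler, the same flags, and a build that doesn't embed timestamps:

    import hashlib

    def sha256_of(path):
        """SHA-256 digest of a file on disk."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # Compare the shipped binary against one rebuilt from the source the
    # vendor was obliged (by the GPL) to hand over:
    if sha256_of("vendor/httpd") != sha256_of("rebuild/httpd"):
        print("Binary does not match the published source -- start asking questions.")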

    Open source software goes through rigorous security testing, but such testing serves only to test known outside threats. The fact that security holes continue to appear should be enough to deter governments from jumping on this bandwagon, but won't be. Worse though, I don't think that security testing can be made robust enough to protect against someone injecting dangerous code into the software from the inside--and inside, for open source, means anyone who cares to join the project or create their own distribution.

    Most of this paragraph is doubly true of closed source companies, because they are closed. An open company is subject

    • Anyone remember when the Windows Update servers got hit by Code Red? =)
  • by andih8u ( 639841 ) on Saturday February 14, 2004 @07:03AM (#8278866)
    It really doesn't matter if it's open source or closed source. The weakest part of any system will always be the person attached to the keyboard.

    Blaster was a big problem because no one could be bothered to download a patch.
    The MS source code was leaked because no one could be bothered to download a patch.
  • Feeding trolls... (Score:4, Interesting)

    by yoshi_mon ( 172895 ) on Saturday February 14, 2004 @07:03AM (#8278869)
    To be quite honest, I never gave that DevX troll any thought. But apparently /. seems to feel that this very poorly written piece of work deserves not one but two front page stories. So be it. (I sure hope to hell that OSDN is not getting any cash from those losers. It would really ruin my day.)

    Bottom line for me is that FUD is FUD is FUD is FUD. There are several ways to combat it, and one of them is to just let those who want to FUD away at it do so, while we continue to build, create, use, and accept that OSS is a good thing for everyone. Those with small minds are scared; good. I don't want those people involved with me, and it actually makes me feel good when I see that they have to resort to such lies and FUD to defend what they see as "the only way".

    I read a comment here the other day about how someone viewed OSS OSes as the ultimate capitalist leveling field. By making not only the hardware but the base software, the OS, open, you allow everyone to create things as they wish and without any strings. They can even make them closed source if they so wish, but the hooks, protocols, and standards are open, such that you can make the software work correctly, regardless of platform.

    As has been cited here many times, MS has not even given that freedom to its programmers, with its lack of API documentation in addition to its lack of standards (unless you think that they alone should be able to set them - go away then, you shill) and numerous changes in even their own file formats. (Why do MS Word docs have to change so often? Hello, forced upgrades.)

    I really couldn't care less about such FUD from some lame-ass website that I personally have never visited or even heard of until reading the initial /. article. They can go toil in obscurity, imo, and we are ill served by even giving them the time of day.
  • this is tiring (Score:5, Insightful)

    by CAIMLAS ( 41445 ) on Saturday February 14, 2004 @07:06AM (#8278880)
    It's like fighting a war where we simply re-win the same outpost over and over again, and never make progress. Why?

    Because the damned fools think that they're making a valid argument when they're simply spitting out the same FUD over and over. Now, if they were to refute previously made refutations, further argument could be made.

    However, that would require them to be able to find something to refute our arguments with. Essentially, "Your guns are too big, so we'll back down and make this point again later." Urg.
    • the elite agenda? (Score:4, Insightful)

      by Slur ( 61510 ) on Saturday February 14, 2004 @09:00AM (#8279169) Homepage Journal
      I feel your pain. What's worse is that none of these so-called writers ever seems to learn from their mistakes and publish a retraction or a response. It makes you wonder if they really have any interest in journalism at all, or if they're just playing games.

      The thing is, the general public hears all these conflicting messages about open source. It doesn't generally matter what the public thinks, because the government will probably develop its software policies unilaterally, without any public review or input, just as it does with anything that actually matters. The government will of course choose proprietary solutions from Microsoft more often than not, simply because MS is an icon of the capitalist ethos, and people in government generally do not have the political will to do anything that might be construed as "anti-capitalist" (hence, anti-American).

      Public ignorance and confusion is a requisite condition for Government to follow its natural pathological course.

      Is it possible that these foolish, uninformed, and perhaps even deceptive writers are acting in the interest of MS simply out of their love for profit uber alles? Or are they simply mindless MS fanboys? Or is it possible that they really do believe that their assertions are true, that they're being objective and relaying accurate information? This sort of intellectual laziness is really sad.
      • by CAIMLAS ( 41445 )
        I think it's probably some perverse combination. I doubt that most of these writers have the foundation of thought necessary to reach a logical conclusion. Their impressions are more likely than not a cohesive entanglement of social status quo hubbub, pop culture appeal, and despicable moral corruption.

        The only thing they're good at is contortion and deceit.
  • In theory, the "many eyes" that can see open source will detect security problems. In practice, it doesn't happen that way. The reason that open source code is more secure than closed source is that the designers and authors care more about their code, as they KNOW it will be made public and they value their public reputation -- it's the same as John Grisham making sure there are no spelling errors in his books. Additionally, in the Linux world they don't have to make security compromises suggested by some
  • The question (Score:2, Insightful)

    by Anonymous Coward
    The security question should not be:

    Closed or open source?

    It should be:

    Who do I want to trust? Which project has a good reputation (OpenBSD, maybe)?

  • ...too stupid to ignore? Judas Priest on a pony, this is the same stuff that has been refuted time and time again.
  • by mattyrobinson69 ( 751521 ) on Saturday February 14, 2004 @07:34AM (#8278941)
    In light of what happened this week (NT4 & Win2k's source being leaked, and therefore much of XP and Longhorn), Microsoft can't claim that their source isn't available to 'bad people' anymore. My friend downloaded the source himself a couple of days ago; I didn't have a look because, to be honest, I don't care. Microsoft's source being available is far worse for security than Linux/BSD etc. source being available, because Microsoft chose "security through obscurity" - OSS OSes don't. Since NO firewall or virus scanner can protect you from holes in services that are supposed to run (MSN Messenger, for example [was that leaked?]), there's going to be some bad stuff happening this week to companies running Windows. Hopefully, this will give them reason to choose a more secure platform next time they change software, instead of just upgrading to the latest Windows.
  • by mrmdls ( 684047 )
    For those who want a great look at security in both closed source and open source OSes, take a look at the March issue of Linux Magazine -- Steven J. Vaughan-Nichols' article, "Security is a Process, not a Product." Mr. Vaughan-Nichols writes, quite correctly, that security is every user's job, and that as Linux gains in popularity, so do the security threats against it.
  • good response (Score:3, Insightful)

    by tacocat ( 527354 ) <tallison1 AT twmi DOT rr DOT com> on Saturday February 14, 2004 @07:44AM (#8278958)

    I'll skip the comments about how incorrect the original article is and leave it to the response's remarks about fundamental misconceptions of Open Source. But the response is really an excellent read, well thought out and a solid example of classical debate rebuttal.

    Kudos for writing an article that the same audience that would believe DevX can understand as well. Too often the response to such articles is written for an entirely different audience, on such a technical plane that those who read, and believe, the first article are entirely incapable of understanding the second. It's not their fault; they are not CSE types by any stretch.

  • "...Because anyone can create and market-or give away-a Linux distribution, there's also a reasonably high risk that someone will create a distribution specifically intended to subvert security. And how would anyone know?"

    I would know by viewing the source code [fsf.org].

  • by SharpFang ( 651121 ) on Saturday February 14, 2004 @08:10AM (#8279017) Homepage Journal
    "Open source advocates rightfully maintain that the sheer number of eyes looking at the source tends to rapidly find and repair problems as well as inefficiencies--and that those same eyes would find and repair maliciously inserted code as well. Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public. Therefore, security problems for governments begin with knowing which distributions they can trust."

    The GPL forces distributors to provide source code to their customers. The government is then free to (and should) post that source publicly for audit. It can (and should, if only for performance's sake) recompile the binaries from the code provided and compare them against what was shipped (see the sketch below). So...?

    I think this guy didn't read the GPL.
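
    A minimal sketch, in Python, of the recompile-and-compare check described above. The file paths are hypothetical, and an exact match assumes a bit-for-bit reproducible build -- same compiler, flags, and build environment -- which in practice takes deliberate effort.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths: the binary as shipped by the distributor, and the
# same program rebuilt locally from the source the GPL obliges them to provide.
shipped = sha256_of("vendor/httpd")
rebuilt = sha256_of("local-build/httpd")

if shipped == rebuilt:
    print("Shipped binary matches the local rebuild.")
else:
    print("Mismatch -- audit the build before trusting the shipped binary.")
```

    Identical digests are only expected when the build is deterministic; differing timestamps or compiler versions will produce a mismatch without implying foul play.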

  • Although I agree with the majority of the comments on this article, I am glad that Mr. Russell Jones wrote his piece. Why?

    One big problem that the open source community faces is complacency -- i.e., "knowing" that we are better, more secure, ...

    What we know may well be true, but it will not remain true if we relax, content in the warm glow of our superiority. To remain ahead needs continuous awareness of the issues, which, in the case of security means a constant paranoia prompting reassessment of pr
  • by Anonymous Coward on Saturday February 14, 2004 @08:23AM (#8279061)
    and illustrated by one quote from the article:

    To limit their vulnerability, governments can't afford to give everyone a choice, nor can they afford to provide access to the source code for their software.

    This has been the age-old cry of dictators and despots everywhere: "We are restricting the rights and freedoms of the populace for their own good!"

    And it has never turned out to be true.
  • by nuggz ( 69912 ) on Saturday February 14, 2004 @08:31AM (#8279085) Homepage
    Let's see which 'security systems' are open source.
    Locks, keyed and combination: their designs are public knowledge, and they still work well.
    DES, AES, Blowfish: all these algorithms are published, yet their security is no weaker for it.
    Electronic tags that beep at the exit of a store: they still work.

    As long as the algorithm isn't broken and no password is being exposed, openness shouldn't be a problem.
  • distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing.

    So, who's going to compete with Slackware on price? Or Debian? Or Mandrake? Or Fedora? This type of statement is just *weird*.
  • by mu-sly ( 632550 ) on Saturday February 14, 2004 @08:33AM (#8279093) Homepage Journal

    Hey, I just had a great idea!! If I form a company and deliberately write insecure, malicious code with backdoors in it, I could use it to control the governments of the world and become obscenely rich!

    Oh, wait... someone else [microsoft.com] has already done that, and most likely patented the idea. I don't want to get busted for patent infringement, man!

    Damn... back to the drawing board.

  • by Ricin ( 236107 ) on Saturday February 14, 2004 @09:16AM (#8279232)
    Governments are not stupid. They may in fact be a lot more knowledgeable than, say, some fruit who thinks he's a journalist writing populist drivel for an MS shill site.

    I'm sure they have some technically competent advisors. And then they have beancounters who make the final decision, because in the end it's all about the buck, not the bug.

    Both authors are merely preaching to their choirs; neither will have any impact on real government decisions.

  • by Dr. Blue ( 63477 ) on Saturday February 14, 2004 @09:59AM (#8279420)
    The notion that hiding the means of encryption will somehow make the data in question more secure is a notion that has been obsolete since World War II.

    This is too conservative... it was in the 19th century that this became accepted. It's known as "Kerckhoffs' principle." From Wikipedia:


    In security engineering, Kerckhoffs' law (also called Kerckhoffs' assumption or Kerckhoffs' principle) was stated by Auguste Kerckhoffs in the 19th Century: A cryptosystem should be designed to be secure if everything is known about it except the key information. It was reformulated (perhaps independently) by Claude Shannon as "the enemy knows the system". In that form it is called Shannon's Maxim. Since the advent of open source software development, these principles have increasingly been used to ground arguments for it (and against "security through obscurity").
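
    To make Shannon's maxim concrete, here is a minimal sketch in Python using the third-party cryptography package (an assumed tool; the thread names none). The algorithm, nonce, and ciphertext are all "known to the enemy," yet without the key the ciphertext is useless and a wrong key is detected outright.

```python
# pip install cryptography  (third-party package, assumed for this sketch)
import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # the key is the ONLY secret
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"attack at dawn", None)

# The enemy "knows the system": the algorithm (AES-GCM), the nonce, and the
# ciphertext are all public. Decrypting with any other key fails outright.
try:
    AESGCM(AESGCM.generate_key(bit_length=256)).decrypt(nonce, ciphertext, None)
except InvalidTag:
    print("Wrong key: knowing the public algorithm gave the attacker nothing.")
```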

  • by Yuioup ( 452151 ) on Saturday February 14, 2004 @10:21AM (#8279538)
    Okay, here's my take on the situation:

    It's far easier for a hacker to write a worm if he has access to ALL the source code that powers the internet. He can exploit, say, Linux boxes that run Apache to spread a worm because he found a flaw in the source code.

    Sure, the flaw will be patched within days, hours, or even minutes, but by then some damage, albeit limited, will already be done.

    A patch is usually made AFTER the exploit is found, not before. You'd have to have an amazing auditing system in place in order to make 100% secure code. In my opinion, writing 100% secure code is impossible.

    Microsoft tries to hide behind closed source hoping that by keeping the code closed nobody can easily detect a flaw and exploit it. The major problem with that philosophy is that the damage will be devastating were the code to be leaked...

    Open Source = limited damage
    Closed Source = ticking timebomb

    Yuioup
    • I call bullshit... (Score:3, Insightful)

      by Phil John ( 576633 )
      "A patch is usually made AFTER the exploit is found, not before."

      Most of the stuff I've been patching recently was found before an exploit became known: the recent OpenSSH patches were the result of careful code auditing, and the same goes for most of the recent PHP fixes.

      Seems like a lot of worms get their "inspiration" from already posted security vulns and just rely on the fact that not everybody will patch them in time.
  • Uh... HELLO? (Score:4, Insightful)

    by black mariah ( 654971 ) on Saturday February 14, 2004 @11:01AM (#8279761)
    The point of the original article was that due to the open, free, and highly available nature of open source code, ANYONE can get it and fuck with it. Yes, it's just as likely that someone will fuck with closed code, but that IS NOT THE POINT. The availability of open source code IS.

    If someone at Microsoft implants a backdoor into Windows XP and it goes out with the next update, it will be a matter of hours until they find, fire, and more than likely arrest the guy who did it. There are far fewer people working directly with Windows code than there are people working with Linux/open source code. While the possibility of someone installing a backdoor is still there, the risk associated with doing so in a closed environment is much higher because the probability of being caught is much higher.

    It is more likely that someone who wanted a way into your system would just, I don't know, hack a trojan into Gaim or something. Or even better, something with a large codebase: OpenOffice, Mozilla, and so on. All it would take is to package it as an RPM, then tell the core team you're packaging RPMs so they link to your site. Everyone who downloads that version gets a nice gaping Goatse-style hole in their browser. (The standard defense is a signature check; see the sketch after this comment.)

    No, it's not likely, but without a doubt the probability of something like this happening with open source software is much higher than with closed source software. As an aside, I'm sick of seeing rebuttal articles that do nothing besides lick the balls of open source ideological diatribe while simultaneously calling the integrity of the original article's author into question. If you're going to use that absolutely inane logic, then nothing that RMS, ESR, or Linus says has one bit of integrity either. In some way, all of them make money from open source software, so why is their integrity not in doubt when they speak of open vs. closed software? Don't they have any bias? OF COURSE THEY DO! But of course, they're on 'our' side, so it's okay if they're biased. Whatever.
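
    A minimal sketch of the standard defense against that trojaned-package scenario: check the packager's detached GnuPG signature before installing. This assumes gpg is installed and the project's signing key is already imported and trusted in the local keyring; both file names are hypothetical.

```python
# Verify a packager's detached GnuPG signature before installing an RPM.
import subprocess

result = subprocess.run(
    ["gpg", "--verify", "gaim-0.75-1.i386.rpm.asc", "gaim-0.75-1.i386.rpm"],
    capture_output=True,
    text=True,
)

if result.returncode == 0:
    print("Good signature -- the package is the one the project signed.")
else:
    print("BAD or missing signature -- do not install this package.")
    print(result.stderr)
```

    Packaging tools wrap the same idea in their own signature checking, but the principle is identical: authenticity has to be verified against a key you already trust, not against the download itself.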
  • Ad hominem (Score:3, Interesting)

    by Salamander ( 33735 ) <`jeff' `at' `pl.atyp.us'> on Saturday February 14, 2004 @11:37AM (#8279993) Homepage Journal

    The accusation of bias at the end does open source no credit; someone writing for O'Reilly could be accused of bias as easily as someone writing for DevX. Stone would have done better to leave that out, and read one of the advocacy FAQs instead. DevX itself hosts a better rebuttal [devx.com] than his.

  • by Todd Knarr ( 15451 ) on Saturday February 14, 2004 @11:49AM (#8280063) Homepage

    Jones says a malicious entity could ship a version of an open-source project with malevolent code in it. Well, yes, but the same can be said about closed-source software. There have been a few recent well-publicized attempts to insert malicious code into open-source projects, but so far nobody has actually managed to get that code shipped to end users as part of an official release. If Jones were correct, closed source should do at least as well. Yet over the years I can recall several major pieces of software that shipped with backdoors or viruses on the official media. These weren't third parties distributing bad versions; this was malware on official copies bought directly from the software maker, still in shrink-wrap with the seals intact. Microsoft themselves, in the not-too-distant past, shipped a fairly obnoxious trojan to their own developers on their own SDK CDs.

    Jones' assertion may be technically correct, but as with all of his assertions, a simple check of the track record shows that it's closed source, not open source, that has the larger problem by far.

  • by YoJ ( 20860 ) on Saturday February 14, 2004 @12:02PM (#8280133) Journal
    One of the points the article mentions is that there have been numerous server compromises of machines hosting open source code, which is worrying. What if that happened and nobody found out? I believe this is a legitimate worry, and am working on developing a security model for version control tools, Majestic [ucsc.edu].

    However, there is some confusion in the article about what security means. One aspect of security is authenticity and integrity; another is secrecy. When you check the MD5 checksum on a download, you are checking the integrity of the files even though the contents are publicly available. Having the source code freely available can only help the quality of projects, and does not necessitate compromising code integrity.
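
    A minimal sketch of that integrity check in Python, using MD5 to match the thread's era (a stronger digest such as SHA-256 would be preferred today); the file name and published digest below are hypothetical.

```python
import hashlib

def md5_of(path: str) -> str:
    """Return the MD5 hex digest of a file, read in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical values: the digest a project publishes next to its download.
published = "0d8224f5f9b4d669a0ae11ac7b4dfcb1"  # placeholder, not a real digest
if md5_of("apache-1.3.29.tar.gz") == published:
    print("Integrity OK: the download matches what the project published.")
else:
    print("Checksum mismatch: the file may be corrupted or tampered with.")
```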

  • How many people work at software development companies that sacrifice quality to meet a deadline that sales or marketing proposed to the customer?

    How about a company [thescogroup.com] that's taken a new and possibly bad direction because one of the executives or a newly appointed CEO [thescogroup.com] wants to impress shareholders [sco.com] and make money for themselves?

    The point being, OSS projects are typically written on a timeline based on one requirement: is the project ready for release?

    It has always been my opinion that publicly traded companies are ruined by their shareholders.


  • Glass houses.... (Score:4, Insightful)

    by thewiz ( 24994 ) * on Saturday February 14, 2004 @12:32PM (#8280324)
    Think of proprietary software as a normal home with wooden or brick walls, roof, shades on the windows and locked doors.

    Think of Open Source Software as a glass house where everything is transparent and anyone can look inside to see what's going on.

    Wouldn't it be easier to see if something malicious is going on inside a glass house than inside a normal house? Does Jones really think a burglar would try to rob a glass house? I certainly hope not! People with malicious intent prefer to HIDE their actions, whether it's sneaking in a home's back door or distributing an encrypted binary with malicious code, because they don't want to be caught.

    No sane burglar is going to rob a home where everyone can see what he's doing. Anyone who adds malicious code to an OSS project will get caught just as fast.
  • by geekee ( 591277 ) on Saturday February 14, 2004 @05:51PM (#8282255)
    The issue isn't whether or not secrecy provides security. The issue is the motive for writing the code. If a company is writing the code then, unless you're a conspiracy theorist, it is writing the code to sell and make money. Adding security flaws on purpose would harm that primary interest if caught, and would drive the customer to find someone else, if possible. Therefore it is not in a company's interest to introduce security flaws into its code. With open source code, the motive of a particular programmer is less clear. He's not getting paid, so he either wants to write code he can use himself, gain some level of fame, etc. If the motive were to compromise the security of a software product, however, it would be easier to join an open source project and sabotage it than to try to gain employment at a software company and do the same thing.

"A mind is a terrible thing to have leaking out your ears." -- The League of Sadistic Telepaths

Working...