Security

Is Open Source Fertile Ground for Foul Play? 723

jsrjsr writes "In a DevX.com article entitled Open Source Is Fertile Ground for Foul Play, A. Russell Jones argues that open source software is bad stuff. Because of its very openness, he contends, open source software will inevitably lead to security concerns, which makes its adoption by governments particularly worrisome. In his words: 'An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.'"
  • by yar ( 170650 ) * on Thursday February 12, 2004 @04:03PM (#8261746)
    I wish people would use any kind of proof with this type of article... but I suppose they can't.

    "Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public."

    And of course there just CAN'T be any guard against the program as actually implemented differing from the publicly available source... :P

    "I'm not naive enough to think that proprietary commercial operating system software doesn't have the same sort of vulnerability, but the barriers to implementing them are much higher, because the source is better protected."

    And when those holes are discovered, they aren't published at all. And the proprietary owner has a far more difficult time finding these existing holes themselves. And most of all, there's NOTHING STOPPING THE PROPRIETARY OWNER from implementing this same type of worst-case scenario the author of this piece describes, and an even smaller chance of discovery by outsiders. Sheesh.
    • by Anonymous Coward on Thursday February 12, 2004 @04:07PM (#8261802)
      Wow, an insightful first post.
      This day will go down in history.
    • by LostCluster ( 625375 ) * on Thursday February 12, 2004 @04:09PM (#8261837)
      Yeah, OSS is at risk of exploits, but he's neglecting the fact that once geeks realize they can't compile the open source version to the distributed binary, a red flag goes up next to that binary. And if the binary starts doing malware things, then that binary goes down in flames, and the project will immediately fork from the last released source.
      • absolutely right - 90% of all software I install on my box is compiled from source; I hardly ever use the vendor-provided binaries. And I guess that a lot of other people do the same. Of course there are limits to what we can notice at a glance, but if things behave strangely, imho the first thing to do is compare the supplied binaries with binaries compiled from the available source...
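        In practice that comparison is just a couple of commands. A rough sketch (foo-1.0 is a hypothetical package name, and note that embedded build timestamps can make even honest builds differ, so a mismatch is a cue to investigate, not proof of tampering):

            # build from the published source (paths are illustrative)
            tar xzf foo-1.0.tar.gz && cd foo-1.0
            ./configure && make
            # compare our build against the vendor-supplied binary;
            # matching hashes mean the binary matches the source
            md5sum ./foo /usr/bin/foo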
      • My guess is that the curve for open source is a lot different than commercial software.

        Open source - starts off, lots of exploits because the code is readily available. People using the package (assuming it's valuable enough to merit it) fix problems, submit patches. Over time the software becomes more secure.

        Closed source - Exploits harder to find, eventually found due to sheer perseverance of legions of script kiddies and their slightly more talented brethren. Company denies existence of problem, patches discreetly and only occasionally, eventually begins to become marginalized due to shoddy business practices, begins suing everyone in sight in a sad attempt to revive an obviously dying business. Meanwhile, Bill Gates rolls over in his sleep, makes another fifteen million dollars.

        (Or maybe I've just had too much coffee today, and am being silly. Time will tell.)
        • by G27 Radio ( 78394 ) on Thursday February 12, 2004 @06:54PM (#8264275)
          The big problem with the closed source model (as we may be about to find out first hand) is that once the source gets leaked, all those holes are out in public. The security through obscurity design model kinda falls apart at that point.

          The guy that wrote the original article is definitely trolling. Unless he really is a fool. I think anyone with even a little insight into how OSS works understands why it's inherently MORE secure than closed source. This "closed source is more secure" meme gets floated and shot down several times a year.
      • once geeks realize that they can't compile the open source version to the binary

        A small and ever-decreasing percentage of users compile their own binaries, let alone check the result. Also, not all of the exploits appear only in the binary; in at least one case the malefactors added a fairly hard-to-notice security hole to the CVS source, so the "official" binaries and checksums matched just fine.

        • by Tony-A ( 29931 ) on Thursday February 12, 2004 @05:23PM (#8263090)
          A small and ever-decreasing percentage of users compile their own binaries, let alone check the result. [Emphasis added]

          Compare:
          50% of 10 is 5. 0.05% of 100,000 is 50.
          I'd much rather have 0.05% of 100,000 checking than 50% of 10.

          It takes very few to notice something peculiar and investigate. The malefactors get caught out if anybody notices anything. Since anybody can examine everything of interest, it would be extremely difficult for a malefactor to actually accomplish much of anything against Open Source.

        • by blorg ( 726186 ) on Thursday February 12, 2004 @05:32PM (#8263223)
          "A small and ever-decreasing percentage of users compile their own binaries, let alone check the result."

          I think the government might just have the time to make this sort of check, and as others have said, it only takes one person to notice. Your second point is valid, as is borne out by the Debian/micq dispute [markpasc.org] (also mentioned previously in these comments), but that ironically isn't a point that Jones attempted to make in the article - he seems to be concerned with unpublished back-doors that don't appear in the source.

        • by gaj ( 1933 ) on Thursday February 12, 2004 @05:41PM (#8263337) Homepage Journal
          It doesn't matter what percentage of users compile and check. Only that some do and that there is a way to get the word out.

          Some do. I'm proof by existence.

          There is a way to get the word out. /., USENET, mailing lists and distro alerts are just a few ways.

          As for the malware in the source, you are of course correct. However, it is exposed, and therefore can be found. In fact, it will be found, eventually.

    • by thegrommit ( 13025 ) on Thursday February 12, 2004 @04:09PM (#8261838)
      I wish people would use any kind of proof with this type of article... but I suppose they can't.

      Who needs proof when you have FUD? See also SCO.
    • by Rev.LoveJoy ( 136856 ) on Thursday February 12, 2004 @04:13PM (#8261903) Homepage Journal
      Bingo.

      The author completely ignores the storied history of exactly this kind of thing in closed source software -- only these backdoors are called 'features' or 'easter eggs.'

      We need a new term for this kind of journalistic troll.

      -- Cheers,
      -- RLJ

      • by Wyatt Earp ( 1029 ) on Thursday February 12, 2004 @04:26PM (#8262107)
        "We need a new term for this kind of journalistic troll."

        No talent assclown.
      • by stevesliva ( 648202 ) on Thursday February 12, 2004 @04:35PM (#8262205) Journal
        We need a new term for this kind of journalistic troll.
        Yellow journalism [wikipedia.org]

        Although it doesn't quite fit, since this is technically a commentary or opinion piece, in which case "ignorant fool" would suffice.

      • by SvendTofte ( 686053 ) on Thursday February 12, 2004 @04:49PM (#8262452)
        Email the author. I just did, rebutting two of his "points". rjones@devx.com [mailto]

        Hey Russell,

        Just two obvious points of rebuttal.

        1. Your question:

        Who's Watching the Watchers?

        Makes a cold chill run down my spine when I think of closed source
        software. In fact, many of your statements, such as the rogue coder,
        hold just as true for closed source software. The difference? You
        (as a consumer) cannot see the code. An atmosphere which breeds
        closedness and non-disclosure of hacker attacks is far more scary
        than one (such as Debian's) which openly announces that it has been
        hacked. Imagine a hacker gaining access to Microsoft code. Imagine
        MS catching him and removing the malicious code. But ... did they
        get it all? Only the hacker will ever know.

        Your statement that "core" members will corrupt the code just
        doesn't make sense. Assuming we're not into the old chicken-and-egg
        problem with the bootstrapping compiler, an open source project is
        defined by having the source open. If you compile a program and it
        ends up different from the one you downloaded, then something is
        very wrong indeed.

        2. In academia and security circles, full disclosure, so that
        trials can be repeated and weaknesses in software uncovered, is the
        norm. Hiding behind binary code does not a very powerful brick wall
        make. Hiding behind a well-thought-out design which is not open to
        attack (confirmed by peer review), and which relies on algorithmic
        defences, makes a strong brick wall.

        I am sorry, but all in all, a very poor article.

        Regards,
        Svend
      • by pohl ( 872 ) on Thursday February 12, 2004 @05:16PM (#8262973) Homepage
        We need a new term for this kind of journalistic troll.

        Urinalist?

      • by blorg ( 726186 ) on Thursday February 12, 2004 @05:23PM (#8263092)
        This story makes no sense whatsoever. From what I can work out, he's saying that although the source may be auditable, back-doors could be introduced (but not made public) before it is compiled into a distro. Leaving aside the obvious GPL violation :-) he seems to be saying that someone in Red Hat, for example, would be introducing the back-door. But how is this any different from someone in Microsoft doing so with Windows, except that the source was never available in the first place? And why, exactly, would Red Hat be likely to do this while Microsoft does not? It just doesn't make sense. Indeed, Microsoft only launched its Shared Source Initiative [microsoft.com] and Government Security Programme [microsoft.com], allowing restricted access to the Windows source, because it acknowledged source auditability to be an advantage of open source.
      • by chadjg ( 615827 ) <chadgessele2000@@@yahoo...com> on Thursday February 12, 2004 @05:35PM (#8263251) Journal
        My boss used to do custom business software and database programming back in the big iron days. He said that in order to do customer support they would often build in a way to shell into the machines remotely to do the diagnostics.

        No problem there. But the kicker was that he would build back doors into the programs that only he knew about, so if they changed the front door passwords or otherwise screwed it up, he could still get in.

        The big problem was that he wouldn't tell his customers about these back doors. This is financial and tax data we're talking about. He saw no ethical problem with this. None at all. Fortunately he's not a malicious guy.

        This isn't a surprise to anybody, right? I was just shocked at the total and complete lack of guilt over doing this. And he's otherwise a normal guy. That's scary.

    • You're absolutely right. People going around trolling about open source without any plausible reason is a major detriment to the cause and the software. Companies/corps are going to pick whatever works best for them and adapt/change with it to their needs, and Gov't should do the same. If the security was as bad as the article implies it to be, then why haven't we seen any catastrophic security failures on any of the open source systems currently being used by Fortune 500 companies and Gov't. Hell, it couldn't be any
    • You know something, and this will probably get mod'd down quickly because many won't like the content... Oh well... Thinking back to when the FSF servers were 'owned' or however you want to spin it, little mention was made of the repercussions that could have occurred - or could still occur - because of that hack. Instead all we heard was how great the security team was in assessing the incident with such quickness.

      Think about that outside the zealotry mode for a minute. I don't recall any follow up determining, "Hey this happened X_TIME ago, therefore clean programs should be reinstalled on your machine." Now I support the entire Open Source movement by all means, but think about how many include files, or other files, could have been tweaked. Say low level include files, or something similar. There is no one, and I say this COMFORTABLY, no one that checks every program, every line of code on their machine. Sure you could run lsof|grep -i listen here and there to see what's what, but a covert channel can hide that. Look, I don't want to get into a sysadmin/secadmin shootout here - it'd be a draw and I don't care who you are - but... In my eyes, there is still a long way to go.

      Take a look at CPAN and some of the modules you have on your machine. How many are updated with any regularity? What about the whole SourceForge/Freshmeat concept of 'sysadmining', where you find a neat program supported for what... a year? Maybe 2 if you're lucky... Sometimes it seems the cooler Open Source gets, the more issues come out with it...

      Every step you take... someone is watching you [politrix.org]
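      For what it's worth, the usual answer to "the system tools themselves may lie" is to run known-good, statically linked binaries from read-only media instead of trusting the installed copies. A minimal sketch (the mount point is hypothetical):

          # check for listeners with tools off a trusted CD,
          # not the possibly-rootkitted system copies
          /mnt/cdrom/bin/lsof -i | grep -i listen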

      • 'I don't recall any follow up determining, "Hey this happened X_TIME ago, therefore clean programs should be reinstalled on your machine."'

        That's because the relevant teams _checked_ the code against known good code to see if there had been anything planted. If there were problems, you would have heard about them.
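        The check itself is mundane. A rough sketch with hypothetical directory names: unpack a pristine copy of the release from a trusted mirror next to the tree you suspect, and diff them:

            # any output at all is a change that has to be explained
            diff -ru pristine/foo-1.0/ suspect/foo-1.0/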
      • They're called .md5s. Use them. They exist for a reason. You'd have to have some godawful cooperation between some very mean people to successfully pull off a corruption on widely deployed OSS software AND not throw red flags up among people who have clean versions and clean md5 hashes.

        And, what's your point on stagnant OSS projects? I don't see Microsoft supporting Win3.1 anymore, but there's a lot of people still using that. The difference is that NOBODY can go through it and fix it up or make anything of it. If someone decides to pick up the pieces on an abandoned piece of OSS that shows promise, they can do that.

        I hate when people do this. You didn't raise any issues that aren't a problem with ALL software, yet you are applying them specifically to OSS. If a server gets owned, it gets owned. It doesn't matter if it's commercial/proprietary, commercial/OSS, or whatever. It's owned. Binaries can still be injected with malicious code. They're owned. Give it up. There's no inherent flaw in OSS.
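        For reference, the .md5 check is a one-liner (file names are hypothetical; fetch the .md5 from a different, trusted server than the one that served the tarball):

            md5sum -c foo-1.0.tar.gz.md5
            # prints "foo-1.0.tar.gz: OK" when the hash matches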

      • by Jerf ( 17166 ) on Thursday February 12, 2004 @04:41PM (#8262266) Journal
        I think you've kind of missed the point here. The question isn't "Is Open Source invincible?", the question is "Is deliberate program corruption more likely to occur, all else being equal, in an Open Source program or a commercial program?"

        And while I'm not a free or open source fanatic, I have to say that I can't marshal any rational arguments that the commercial program is somehow safer from authorial corruption. It's virtually inconceivable that a large scale open-source program could have a backdoor or anything like that in it for any significant amount of time, and as for smaller projects, a one-man open source project may be just as likely to be corrupted as the one-man closed source product, but which is more likely to be detected before significant damage is done? The one with the source you can look at, hands down. (And the phrase "just as likely" is for rhetorical purposes; in the real world, the prospect of revealing the source surely impedes anybody who would put something nasty in there! That's way too accountable for the taste of someone like that!)

        No system can be made perfectly safe. But to claim that commercial software is safer from deliberate authorial corruption takes willful and deliberate ignorance. I mean, seriously, claiming that the software I can't see, that I'm not allowed to see, is more likely to be pure than the stuff anybody (or anybody I hire) can look at? That flies in the face of both logic and common sense, and is the kind of claim that has to be inflated into a long article to blind the reader with words before it can even come close to being seriously entertained; a paragraph summary doesn't pass the laugh test.

        And remember, it's not only "Will it happen?", but "Which will do more damage?" Even when break-ins happen in Open Source, the damage is typically swiftly controlled; people's reputations are on the line! Who even knows how much closed-source damage has been caused by break-ins? Again, people's reputations are on the line, and the incentives to cover such things up are high.

        I just don't see a way, even in theory, where commercial software is safer against this sort of attack.
  • by Raindance ( 680694 ) * <johnsonmx@gmai[ ]om ['l.c' in gap]> on Thursday February 12, 2004 @04:04PM (#8261764) Homepage Journal
    'You get what you pay for'?

    Seems like W. Russell Jones is trying to apply 1900-era economics to a collaborative, abstract, not-truly-market-driven, positive-feedback context.

    There might be security concerns with Open Source (he, most interestingly, doesn't go into security concerns with closed source or compare track-records); however, Russell is trying to pull a fast one as this is a different (and, I'd argue, wrongful) criticism of OS.

    RD
    • by haystor ( 102186 ) on Thursday February 12, 2004 @04:14PM (#8261934)
      The irony is that his article is freely available.
    • Sort of (Score:5, Interesting)

      by gerf ( 532474 ) on Thursday February 12, 2004 @04:14PM (#8261938) Journal

      His criticism reminds me of a speaker at a recent IEEE meeting at my school. She talked about the work environment, and some nuances of how to act or not to act.

      One interesting thing about the contracting company she runs is that if you charge more, you get more business. The thinking is that since a certain company costs more, it must be better. Obviously, though, she did not get smarter by charging more, only richer.

      That is the thinking this fellow is using: charging more must mean it's a better product. Sadly, he is part of the large portion of the population that does not understand the Open Source community, or business models. His view is outdated, and frankly, wrong.

      Besides, what other company besides M$ finds a huge hole in all of its flagship products, but fails to patch it for close to a year?

    • yeah, it seems he's never paid for a BSOD! Unfortunately, neither has Microsoft. But when I get my hands on them, they'll pay. Oh, how they'll pay!!
  • Wow (Score:5, Funny)

    by daeley ( 126313 ) * on Thursday February 12, 2004 @04:04PM (#8261766) Homepage
    Igniting flame war in 5...4...we have main engine start...3...2...ignition!...1...
  • Ahhh.. (Score:5, Funny)

    by Jeremiah Cornelius ( 137 ) on Thursday February 12, 2004 @04:04PM (#8261768) Homepage Journal
    An article-length Troll.

    The whole thread that will light up in response to this old chestnut!

  • hrm... (Score:3, Insightful)

    by xao gypsie ( 641755 ) on Thursday February 12, 2004 @04:05PM (#8261770)
    I disagree... if there is a security hole, those implementing the software would ideally know enough to pick up on it fairly quickly. I mean, they do have the source, after all...
  • What a sellout (Score:5, Insightful)

    by dtfinch ( 661405 ) * on Thursday February 12, 2004 @04:05PM (#8261771) Journal
    Everything he claims can go wrong with open source can go wrong with closed source, but with closed source you have fewer people watching to catch malicious code additions before stable release.
  • by tcopeland ( 32225 ) * <tom@th[ ]sleecopeland.com ['oma' in gap]> on Thursday February 12, 2004 @04:05PM (#8261772) Homepage

    Worse though, I don't think that security testing can be made robust enough to protect against someone injecting dangerous code into the software from the inside--and inside, for open source, means anyone who cares to join the project or create their own distribution.

    Bosh. Open source project leaders - especially the leaders of popular projects - don't let just anyone have write access. Also, commits almost always go to a mailing list to be reviewed by the other committers and lurkers.

    And of course, there's no way a commercial product could be infiltrated by someone who wants to inject harmful code. Impossible!
  • PLOFIT! (Score:3, Funny)

    by Anonymous Coward on Thursday February 12, 2004 @04:05PM (#8261778)
    1) Write bogus article that will enrage slashdotters. Slashdot, being knee-jerk as it is, posts it to the front page.
    2) Get a bazillion hits.
    3) PLOFIT!
  • by uqbar ( 102695 ) on Thursday February 12, 2004 @04:06PM (#8261784)
    Releasing this kind of rhetoric just days after the latest MS security fiasco would be funny - if the reality wasn't so sad...
  • by Eric Smith ( 4379 ) * on Thursday February 12, 2004 @04:06PM (#8261787) Homepage Journal
    Closed source software, because of its very closedness, will inevitably lead to security concerns. This makes adoption of closed source software by governments particularly worrisome. When you rely on proprietary products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get if they fail to switch to open source software.
  • by LostCluster ( 625375 ) * on Thursday February 12, 2004 @04:06PM (#8261789)
    I doubt Microsoft will ever write software for Linux, but it's inevitable that things like Lindows will forever strive to make Linux as easy as Windows, because that's essential for Linux to take over the desktop market.

    However, with that, some of the inherent security of Linux fails. Imagine an e-mail client that will execute a binary attachment with no questions asked because the user double-clicked on the pretty icon. That's how MyDoom spread on Windows, and basically, it's the fact that the current setup for Linux makes it hard to execute something new that makes people realize what they have before they run it...

    As soon as we have pretty looking greeting card executables that run on Linux, the downfall will be what comes next...
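    To make that execute-bit point concrete (the attachment name is hypothetical): a freshly saved file simply won't run until the user takes a deliberate extra step:

        $ ./greeting-card.bin
        bash: ./greeting-card.bin: Permission denied
        $ chmod +x greeting-card.bin
        $ ./greeting-card.bin    # only now does it execute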
  • Um, yeah (Score:5, Insightful)

    by Cthefuture ( 665326 ) on Thursday February 12, 2004 @04:06PM (#8261790)
    Please cite some specific examples Mr. Jones.

    I mean, there is a whole friggin lot of open-source out there, there's bound to be a few examples of the problem? Right? Right???
  • by AtariAmarok ( 451306 ) on Thursday February 12, 2004 @04:07PM (#8261799)
    He might be right. If governments switch from Windows to open-source OS, they might open their computers to the possibility of being infected by worms, virii, and trojans.
  • Netcraft says that his server (running IIS) has only been up for 2 days.

    I wonder if he's getting what he paid for.
  • Take action (Score:5, Informative)

    by Strudleman ( 147303 ) <strudleman@strud[ ]an.com ['lem' in gap]> on Thursday February 12, 2004 @04:07PM (#8261803) Homepage Journal
    All these great replies, these reasons why Russell is wrong, will never be read by the public because they're stuck in /.

    Take a cue from devX: "Editor's Note: DevX is pleased to consider rebuttals and related commentaries in response to any published opinion. Publication is considered on a case-by-case basis. Please email the editor at lpiquet@devx.com for more information."
  • My God! (Score:5, Insightful)

    by shystershep ( 643874 ) * <bdshepherd.gmail@com> on Thursday February 12, 2004 @04:08PM (#8261818) Homepage Journal
    He's a genius! This is actually a clever critique of the very dangers of closed source software, just disguised as a moronic attack on open source.

    Open source advocates rightfully maintain that the sheer number of eyes looking at the source tends to rapidly find and repair problems as well as inefficiencies--and that those same eyes would find and repair maliciously inserted code as well. Unfortunately, the model breaks down as soon as the core group involved in a project or distribution decides to corrupt the source, because they simply won't make the corrupted version public.

    I mean, this can't actually be an argument that closed development by a "core group" that "won't make the corrupted version public" is more trustworthy than open development where anyone can see the code. Right? Right?
  • by JohnGrahamCumming ( 684871 ) * <slashdotNO@SPAMjgc.org> on Thursday February 12, 2004 @04:08PM (#8261819) Homepage Journal
    This is simply the worst piece of FUD concerning Linux and OSS in general that I've ever read. And it's coming from the "Executive Editor", who should have looked for some actual examples of what he's talking about. The entire article is random speculation that "bad things can happen" with OSS because people can modify the source, and he should be ashamed of having written it - unless of course he's being paid to write propaganda.

    During a week when Microsoft admits it sat on the worst flaw ever for 6 months, and MyDoom and friends are rampaging around, it's shameful to see an article written with so much fear and so little substance. He even manages to say that OSS might be used by terrorists against the US (although he doesn't use the word).

    An absolutely disgusting piece of "journalism".

    John.
  • by Godeke ( 32895 ) * on Thursday February 12, 2004 @04:08PM (#8261821)
    While the article admits that the exact attacks it describes could also happen in closed commercial software, I find the "barriers to implementing them are much higher" concept to be absurd. Just as the article says the core Linux kernel is tightly monitored, so is the software from Microsoft. However, when it comes to smaller products, products that I have worked on, I would have to chuckle at the naive view that somehow closed source is "better protected". Most smaller companies that I have worked with are *far* more interested in getting a product to release than in checking for backdoors. Testing is for failure modes, not for subtle pointer errors that open the code to obscure exploits.

    In open source software, the maintainers vet patches by peer review before admitting them into the main product line. Likewise, closed source products are peer reviewed, but by a much smaller team, who probably have much more similar agendas than people flung across the globe. Either could be compromised. This exact same article could have been entitled "Software Is Fertile Ground for Foul Play". The concern that backdoors exist is the reason Asian countries have been suspicious of Microsoft's closed source software. To assuage those fears, Microsoft provided the source code for review. If this review is successful in showing that no backdoors exist (and I have no idea how they can tell that some unobtrusive code isn't deliberately flawed), then surely open source can be equally reviewed, if not given a more stringent review by opening the question to the open source community within the country in question.

    The security that closed source promises by "protecting the source" is security through a promise by a potentially hostile vendor. The security open source promises is the vigilance of those who review the code. I don't see how one is better than the other, but I surely don't see how closed source is going to make a potential target feel better than if they could review the source.
    • Yes. Mr. Jones needs to read up on why governments actually prefer [com.com] open source.
    • by BranMan ( 29917 ) on Thursday February 12, 2004 @05:24PM (#8263105)
      Actually, in practice there has seldom been any peer review of code in 'closed source' software companies. Unless a project or program has major funding, clout, and visibility, the coders write some unit test cases and hope any bad bugs are caught in system testing (which gets reduced when the schedule gets tight - in contrast, Open Source software usually has no schedule). Open Source software is therefore infinitely more secure, as more often than not at least 2 pairs of eyes have seen any particular piece of code.
  • by joshamania ( 32599 ) <jggramlich@NosPam.yahoo.com> on Thursday February 12, 2004 @04:08PM (#8261822) Homepage
    This is the type of argument you get from a lawyer, a technophobe, or someone with a vested interest in being anti-open source. Arguments generally center around "security", "support", and "accountability".

    One, Microsoft software, the most popular "closed source" software in the world, is rife with security holes. Meanwhile the most popular (arguably) open-source software in the world, Apache, doesn't strike me as being terribly buggy *or* full of security holes. For instance, I don't have to update my Apache software once a week.

    Two, often for popular open-source products there is plenty of free and timely support. Advantage is also to the qualified technophile, who can support his or her own software, and not rely on the timetables of vendors.

    Three, accountability. What has Microsoft *ever* been accountable for? Viruses? Bugs? Data loss?
  • by Phaid ( 938 ) on Thursday February 12, 2004 @04:09PM (#8261828) Homepage
    Mod story down (-1, troll).

    Can we please stop letting people use slashdot to increase the hit rate on their articles in order to make themselves seem relevant to their bosses?

    Fred Moody, the infamous anti-Linux ABC News columnist, was doing the exact same thing [linuxtoday.com] four years ago. In fact, he was writing on pretty much the same subject, that Open Source is insecure and untrustworthy by its very nature.

    Those who do not study history are doomed to repost it.
  • by W2k ( 540424 ) on Thursday February 12, 2004 @04:09PM (#8261833) Journal

    Open Source Is Fertile Ground for Foul Play

    The nature of open source makes security problems an inevitable concern. There are a handful of ways that malicious code can make its way into open source and avoid detection during security testing, making government adoption of open source particularly worrisome.

    by A. Russell Jones February 11, 2004

    An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get. Perhaps not today, nor even tomorrow, and not because open source products are less capable or less efficient than commercial products, but because sooner or later, governments that rely on free open source software will put their country's and their citizens' data in harm's way. Eventually--and inevitably--an open source product will be found to contain a security breach--not one discovered by hackers, security personnel, or a CS student or professor. Instead, the security breach will be placed into the open source software from inside, by someone working on the project.

    This will happen because the open source model, which lets anyone modify source code and sell or distribute the results, virtually guarantees that someone, somewhere, will insert malicious code into the source. Malevolent code can enter open source software at several levels. First, and least worrisome, is that the core project code could be compromised by inclusion of source contributed as a fix or extension. As the core Linux code is carefully scrutinized, that's not terribly likely. Much more likely is that distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing. As anyone can create and market a distribution, it's not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart.

    Third, an individual or group of IT insiders could target a single organization by obtaining a good copy of Linux, and then customizing it for an organization, including malevolent code as they do so. That version would then become the standard version for the organization. Given the prevalence of inter-corporation and inter-governmental spying, and the relatively large numbers of people in a position to accomplish such subterfuge, this last scenario is virtually certain to occur. Worse, these probabilities aren't limited to Linux itself, the same possibilities (and probabilities) exist for every open source software package installed and used on the machines.

    How Can This Happen?
    The products of the open source software development model have become increasingly entrenched in large organizations and governments, primarily in the form of Linux, a free open-source operating system, the free open-source Apache Web server, and open source office suites. There are several reasons that open source software--and Linux in particular--are seeing such a dramatic uptick in use, including IBM's extensive Linux support effort over the past several years, and the widespread perception that Linux is more secure than Windows, despite the fact that both products are riddled with software security holes. (Use this menu to see the number of vulnerabilities reported by security watchdog group Secunia for an OS-by-OS comparison.)

    So far, major Linux distributions such as Debian and others have been able to discover and remedy attacks on their core source-code servers. The distributions point to the fact that they discovered and openly discussed these breaches as evidence that their security measures work. Call me paranoid, but such attacks, however well handled, serve to raise the question of whether other such attacks have been more successful (in other words, undiscovered). Because anyone can create and market--or give away--a Linux distribution, there's also a reasonably hi

  • Vulnerable? (Score:3, Funny)

    by Anonymous Coward on Thursday February 12, 2004 @04:09PM (#8261843)

    He argues that open source software, because of its very openness, will inevitably lead to security concerns.

    Well, thankfully Windows is closed-source, or else there'd be security issues wi-- oh, hang on a sec.

  • devx.com

    HTTP/1.1 200 OK
    Server: Microsoft-IIS/5.0
    Date: Thu, 12 Feb 2004 21:06:06 GMT
    X-Powered-By: ASP.NET

    In other news, the devx.com website was found lying in its own blood and excrement after being linked from Slashdot.ORG today.
  • Impartiality (Score:5, Informative)

    by gowen ( 141411 ) <gwowen@gmail.com> on Thursday February 12, 2004 @04:09PM (#8261848) Homepage Journal
    I believe every word of this article because A Russell Jones [amazon.com] certainly [amazon.com] has no vested interest [amazon.com] in Microsoft based web solutions.
  • by xutopia ( 469129 ) on Thursday February 12, 2004 @04:10PM (#8261852) Homepage
    it currently has a score of 2/5. Once the /. effect is done we should all create an account and rate it as low as it can go.
  • by mccalli ( 323026 ) on Thursday February 12, 2004 @04:10PM (#8261865) Homepage
    " When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get"

    Aah, the sweet sweet tones of language in the hands of a master. What subtlety, what charm, what wit. Prithee kind sir, wherefore is thy prose, thy grasp upon the fundamentals comprising the very art of speech itself?

    English Grade: C-, should learn not to use informal language when making a formal argument.

    Cheers,
    Ian

  • Open source software goes through rigorous security testing, but such testing serves only to test known outside threats. The fact that security holes continue to appear should be enough to deter governments from jumping on this bandwagon, but won't be.

    *Deletes 40 zillionth MyDoom attachment in his inbox* ...and I suppose other operating systems are more secure? What exactly are you suggesting we do about the lack of security in today's OS's? Linux, Windows, even Unix have all had security flaws identified in their time...

    What can we trust in code? You mention it right there, Mr. Author: we can trust the latest and greatest stable Linux kernels, but if you install a test kernel, or some hobbyist lil' app from the remote corners of the open source world, on a production server, you get what you deserve. Incidentally the same goes for Windows: WinXP's latest service pack is definitely more secure than any test versions of their OS's, or even the initial RTM builds of their operating systems. What gets deployed in a production environment... well, duh...

    The author says:

    [Snip] Worse though, I don't think that security testing can be made robust enough to protect against someone injecting dangerous code into the software from the inside--and inside, for open source, means anyone who cares to join the project or create their own distribution.

    I suppose we trust Microsoft, SCO and IBM more? Puh-leez. If you need a totally secure OS, you're best off hiring your own programmers and starting from scratch, and hoping they're as secure as anyone else... oh wait, you can't trust them either... never mind, just build an OS yourself then...

    Ok I'm done ranting, everyone else's turn :-).
  • by JaredOfEuropa ( 526365 ) on Thursday February 12, 2004 @04:12PM (#8261895) Journal
    An old adage that governments would be well-served to heed is: You get what you pay for. When you rely on free or low-cost products, you often get the shaft, and that, in my opinion, is exactly what governments are on track to get.
    So far, I think the track records of currently existing operating systems speak for themselves: one particular popular commercial operating system (yes, that one) makes the news almost weekly with another gaping security hole, exploit, or worm doing the rounds. On the other hand, you don't hear a lot about security issues with (wonderfully-free) Linux systems, despite their widespread use as servers.

    A number of governmental institutions have chosen Linux not because it is free, but because of another distinct advantage: because it is open-source, they know what they pay for.
  • by doomicon ( 5310 ) on Thursday February 12, 2004 @04:13PM (#8261906) Homepage Journal
    Joe Barr already has an article [newsforge.com] responding to this FUD. I personally feel these sorta FUD articles are outdated. With IBM, HP, and others already showing large profits from taking advantage of open source, you would think they would come up with something that isn't dredging up arguments from 1998.
  • Best Troll Ever. (Score:5, Interesting)

    by DaveJay ( 133437 ) on Thursday February 12, 2004 @04:14PM (#8261919)
    From the article, annotations added by me:

    >Malevolent code can enter open source software at several levels.

    1. >First, and least worrisome, is that the core project code could be compromised by inclusion of source contributed as a fix or extension. As the core Linux code is carefully scrutinized, that's not terribly likely.

    Not likely indeed. Moving on.

    2. >Much more likely is that distributions will be created and advertised for free, or created with the express purpose of marketing them to governments at cut-rate pricing. As anyone can create and market a distribution, it's not far-fetched to imagine a version subsidized and supported by organizations that may not have U.S. or other government interests at heart.

    Organizations using Open Source Distributions generally purchase a vendor-supplied copy as well as a support contract.

    As an aside, do you suppose non-US countries that use Microsoft products are concerned that Microsoft may not have their country's best interests at heart?

    3. >Third, an individual or group of IT insiders could target a single organization by obtaining a good copy of Linux, and then customizing it for an organization, including malevolent code as they do so. That version would then become the standard version for the organization. Given the prevalence of inter-corporation and inter-governmental spying, and the relatively large numbers of people in a position to accomplish such subterfuge, this last scenario is virtually certain to occur. Worse, these probabilities aren't limited to Linux itself, the same possibilities (and probabilities) exist for every open source software package installed and used on the machines."

    This isn't limited to Open Source itself. The same possibilities (and probabilities) exist for any company that uses customized software AT ALL -- at some point, you have to trust those doing the customizing, or get a third party to audit. I mean, after all, I can wreak havoc throughout an organization just by clever use of login scripts on Windows XP machines, and if everyone in the IT department is in on it, nobody else would be the wiser.

    Now that I think of it, even if you're not customizing the software, you're trusting the people who make it. Does Microsoft have your best interests at heart? Does SCO? Does RedHat? Does anyone? That's why it's nice to be ABLE to scour the code -- the smartest, safest groups will obtain source code from those who write it, and have it audited by another group, and then again perhaps by another. Unless they're all in league with one another. [Insert tinfoil hat here]

    So. Who's paying this guy?

  • by Bendebecker ( 633126 ) on Thursday February 12, 2004 @04:14PM (#8261920) Journal
    1. Use open source products, which you can modify if need be. For example, you can have your tech support modify it to better fit your business needs (compared to trying to modify your business to fit around a Microsoft software solution), or if a bug is discovered you could either wait for the development team that originally made it to fix it, or you could fix it yourself. Heck, you could even have your tech guys go through the code themselves looking for security holes to fix.

    2. Use closed source. If a bug appears, you're at the mercy of Microsoft to fix it. That may mean months of waiting while your system is vulnerable. No way to find the bugs, no way to fix them yourself. Your business could be relying on a time bomb and not even know it. And of course, with only the MS guys looking for holes, the chance they'll miss them is greater. More eyes scanning code usually means fewer bugs. And at any time Microsoft could decide to drop the product or force you to upgrade or pay overcharged rates for licenses, all at Ballmer's whims. Going with closed source is putting your business at the mercy of Microsoft (yes, I know closed source != just Microsoft, but what is easier: to type closed source or to simply type MS?)
  • WTF? (Score:3, Informative)

    by jjp5421 ( 659783 ) * on Thursday February 12, 2004 @04:14PM (#8261932) Homepage
    You get what you pay for? Examples: SCO UnixWare, Windows, MS-DNS, IIS, BEA WebLogic, etc. Realization: I paid for crap!!! You get MORE THAN what you pay for! Examples: Linux, the *BSDs, BIND, Apache, gcc, etc. Realization: Why did I pay for that crap??? The code from Diebold was closed, and how secure was it? Windows code is closed and I had to install a server just to keep the horde of daily patches up to date. I think that the key to secure code is not a debate of open v. closed; it is about having a programmer/company that cares about security and knows what they are doing. Hell, NetBSD is open and very secure (read: unusable). This guy is a moron.
  • by maroberts ( 15852 ) on Thursday February 12, 2004 @04:14PM (#8261937) Homepage Journal
    ...but governments and organisations should be exercising a modicum of care over who they get their source and binaries from. That's what MD5 checksums and trusted sources are there for.

    Open source development is not truly open to everybody; it is normally open to everyone who you allow to contribute code to your project. They've normally proved themselves by offering bug fixes and minor changes directly to you beforehand.

    The barriers to inserting malicious code in closed source are lower, not higher. Many an engineer has inserted a backdoor in his code which he surreptitiously used to help customers who lose passwords or setup info. However, a backdoor is just another way for a cracker to break into the system. Also, bored engineers often leave Easter eggs in their closed source, something hard to do when several thousand people may review your code to see what makes it tick. In mainstream projects like the Linux kernel, the bar to being allowed to contribute code is quite high, and your initial attempts are likely to be looked on with scorn by other project members.

    As for costing huge amounts of money, one wonders what MyDoom has been costing owners of that wonderful example of closed source software - Windows.
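    And checksums only prove integrity, not origin; for the trusted-sources half, a detached GPG signature from the release manager does the job. A sketch with hypothetical file names (assumes the maintainer's public key is already in your keyring):

        gpg --verify foo-1.0.tar.gz.asc foo-1.0.tar.gz
        # "Good signature from ..." ties the tarball to its source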
  • by rmassa ( 529444 ) on Thursday February 12, 2004 @04:15PM (#8261945)
    Quoth the author:
    • This problem isn't new. In fact, it's far older than any computer technology. The Latin phrase Quis custodiet ipsos custodes, which translates to "Who will guard the guards?" shows that people have been struggling with the same problem for centuries. You can set up as many layers of security as you like, but at some point, you have to trust the layers themselves. In short, open source free and low-cost software products are likely to be widely adopted in governments, where spending public money for licenses is a difficult justification. Inevitably, that choice will lead to security breaches that will cost those same governments (and ultimately you), huge amounts of money to rectify.


    Where exactly is the logic in this? In the open source world, at least there are "watchers", and you have the ability to "watch" yourself, or at least pay someone to review the code for you if you don't have the ability. This isn't the case with almost all commercial software. This reeks of FUD and is poorly written.
  • by Psarchasm ( 6377 ) on Thursday February 12, 2004 @04:15PM (#8261950) Homepage Journal
    you might remember from other high quality works, like...

    Mastering ASP .NET with VB .NET [lowth.com], Visual Basic Developer's Guide to Asp and IIS [lowth.com],
    and...
    How To Kill Penguins With Broken Shards of Windows.

    *YAWN*
  • by Tom7 ( 102298 ) on Thursday February 12, 2004 @04:15PM (#8261957) Homepage Journal
    The marginal cost of all software is almost $0, because it costs almost nothing to copy bits.
    Just because Microsoft gouges you $X to do that copying doesn't mean that the bits are of any greater quality; Microsoft has poured loads of cash into developing its products, and the Free Software / Open Source folks have poured loads of volunteer time (and sometimes, cash) into developing their software. You might look at the amount of effort that has gone into creating each, and then try to apply the get-what-you-pay-for adage to that, but applying it to the price of the box on the shelf is ludicrous.
  • by FortKnox ( 169099 ) on Thursday February 12, 2004 @04:15PM (#8261958) Homepage Journal
    Quick, do an Amazon search for "A. Russell Jones" [amazon.com] (the author of the devx article).

    Visual Basic book, ASP.NET in C# book... looks like Mr. Jones is up to his ears in non-open-source work. I hate having someone with no background in something condemning it.

    It's like an ASP developer condemning Java before even coding a lick of it.
  • Almost speechless. (Score:3, Insightful)

    by nathan s ( 719490 ) on Thursday February 12, 2004 @04:15PM (#8261960) Homepage

    Having read the full article, I have to say that this is one of the most annoying pieces of writing I've read in quite a while. The author of this piece is assuming a naive elitist position in a fantasy world where corporate interests can never be anti-government and where code produced by the masses is somehow 'dangerous' because it might be exploitable.

    As several other comments have pointed out, there is absolutely nothing to the "foul play" argument presented in this article that could not also apply to a closed-source project. In my opinion, the major difference is that the closed-source project's flaws [and note that in this article the author is talking about deliberately introduced flaws - basically the idea that OSS projects might be converted into trojan horses], if they exist, might never be discovered at all. If I buy a copy of Windows, I have absolutely no clue whether or not any such flaws exist, but more importantly, I have no way to check, because I cannot examine the source code. At least with open source software, if I suspect misuse, or even if I'm only paranoid, I can examine the source code myself or have someone knowledgeable [whom I trust] do it for me.

    Overall, this seems to be a pretty blind and poorly thought-out attack. A pity that editors aren't more carefully edited. :-P

  • by Angst Badger ( 8636 ) on Thursday February 12, 2004 @04:15PM (#8261962)
    The old saying about getting what you pay for was formulated as a result of experience with commercial enterprises. Of course you "get the shaft" with "free" commercial products -- commercial enterprises don't exist for the purpose of giving things away. Companies only give things away in the hopes that you'll actually buy something.

    Open Source projects, on the other hand, are usually formed with the express goal of giving something away. They have every incentive to make their products valuable and no incentive to produce shoddy loss-leaders.

    "You get what you pay for," even with respect to for-sale products, doesn't mean "you get value commensurate with your expenditure". Commercial enterprises are strongly incentivized to give the least possible value for the highest possible price. Extra quality and value, above and beyond the expectations of the customer, is an unnecessary expense to a business. Competition alleviates this somewhat, but companies are still only playing to the level of the competition. Doing the very best possible will seldom if ever be their goal, in contradistinction to Open Source projects, where it is frequently the main goal.
  • Oh really? (Score:5, Insightful)

    by ShatteredDream ( 636520 ) on Thursday February 12, 2004 @04:16PM (#8261973) Homepage
    There is nothing preventing the U.S. Government's workers from modifying open source software to make a security-hardened version. The NSA's SELinux didn't have to be released back to the public. The NSA could have forked an entire distribution and gotten it really rock solid on security. The only reason they didn't is the value our country places on the government returning to the public what it creates with our tax dollars.

    That said, the best setup for the government is to use 3-4 platforms in each agency. MacOS X on the average desktop. Linux on many of the servers. Windows on some print and file servers. Maybe some Sun boxes for intense science work. How many times does it have to be said that a heterogeneous network is harder to take down before people stop writing this shit?

    As for the argument that Windows only gets hit more because of popularity... I want to wring the neck of every person I hear saying that. It's a disgusting application of post-modernist logic to computers. It's the IT variation of the post-modern attitude that there are no moral absolutes, only relative standards that vary by cultural and personal views. It's a complete rejection of the concept that two systems can be designed such that one is inherently insecure because of its architecture and one is very secure by its design.
  • by mopslik ( 688435 ) on Thursday February 12, 2004 @04:18PM (#8262005)

    What bothers me most about these typical "OS vs Proprietary" flamewars-in-waiting is when writers compare specific applications with some nebulous "Open Source" concept. You've all seen reviews that go something like this:

    Open Source programs have serious problems. For example, I downloaded an Open Source command-line HTML-parser written by an undergraduate student. After feeding it random non-HTML files, the program crashed roughly half the time. By contrast, I evaluated the latest copy of Adobe Photoshop for Windows. Photoshop easily helped me modify my vacation photos, without a single glitch. Clearly, Proprietary applications are better suited for the market.

    Most of the time, these writers compare all open source programs -- many of which are hobby projects -- to individual, highly-polished applications. Hardly fair and unbiased.

    (now goes off to read the article)

  • No evidence (Score:5, Insightful)

    by 3Suns ( 250606 ) on Thursday February 12, 2004 @04:19PM (#8262008) Homepage

    It's interesting how he provides absolutely no evidence to support his claims. Obviously, nobody could take his stance and try to argue evidence, or else they would run into piles of evidence suggesting the exact opposite. This is sheer uninformed speculation. A couple choice quotes:

    Because anyone can create and market--or give away--a Linux distribution, there's also a reasonably high risk that someone will create a distribution specifically intended to subvert security. And how would anyone know?

    Same way people would know if someone was running a heroin production lab in the middle of Times Square. Open means open. If people create software designed to subvert security, they make closed software. Exhibit A: Gator/GAIN.

    Who's Watching the Watchers?

    Anyone who wants to. Clearly this person has no idea how Free/Open-Source software works at all.

  • by kenjib ( 729640 ) on Thursday February 12, 2004 @04:20PM (#8262019)
    Diebold is a perfect counterargument to this article. Here, proprietary source mixed with a documented conflict of interest has possibly led to intentional security backdoors with the potential of creating massive social upheaval in the most powerful country in the world. Furthermore, while Diebold is getting caught with its hand in the cookie jar because of leaked code and internal memos, we don't even know at all what the other electronic voting software companies are doing with their closed and secret code. Perhaps Mr. Jones could give a current example from the open source community with the same scope and complexity.
  • by RichDice ( 7079 ) on Thursday February 12, 2004 @04:20PM (#8262027)
    Someday he hopes to be The Russell Jones.
  • by TheFrood ( 163934 ) on Thursday February 12, 2004 @04:21PM (#8262043) Homepage Journal
    From the article:

    Because anyone can create and market--or give away--a Linux distribution, there's also a reasonably high risk that someone will create a distribution specifically intended to subvert security. And how would anyone know?

    Oh, I don't know... maybe by looking at the source code?

    Turn it around now: Suppose a private company sold software with malicious code included to subvert security. How would anyone outside the company know?

    TheFrood
  • by rebel ( 27002 ) on Thursday February 12, 2004 @04:23PM (#8262073)
    ...his article is freely available.
  • by jimicus ( 737525 ) on Thursday February 12, 2004 @04:25PM (#8262095)
    Email to author of article & editors of devx

    Dear Mr. Russell Jones,

    In your article you make a number of interesting points, which I shall attempt to cover in order:

    1. An open source product will eventually contain a maliciously inserted security breach.

    On what grounds do you base this statement? How can you be certain that Microsoft haven't been paid by the CIA to place backdoors in Windows? Why, then, should any government which isn't in on such secrets trust Windows? How could a government be certain that it knew all such secrets?

    2. The core project code could be compromised.

    Quite true. However, there have been instances in the past where Microsoft's code has been compromised even when sitting on Microsoft's servers:

    http://www.theregister.co.uk/content/4/14265.html

    3. A distribution will be built with security holes for the express purpose of selling to governments.

    How do you know this hasn't already happened with Windows? You speculate much, but back up little. And what kind of marketing budget would such an attacker need to gain government mindshare in the first place?

    4. Insiders could "customise" a well-respected secure distribution.

    They already can. It's called "leaving accounts on the system". Or "logic bombs". Or "misconfigured systems". This problem has existed for almost as long as computers have.

    5. Finally, you speculate that nobody is "watching the watchers". What you appear to have misunderstood, however, is that a government organisation would have a full copy of the source code and could compile it themselves to confirm that the resulting program is identical to the shipped version. They could then audit the source code, either in-house or by paying an outside organisation.
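
    A minimal sketch of that verification step, in C. The file names are hypothetical, and since compilers often embed timestamps, byte-identical output assumes a controlled, reproducible build environment:

    /* Byte-compare a locally compiled binary against the shipped one. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Returns -1 if the files are identical, otherwise the offset of the
       first differing byte. */
    static long first_difference(const char *path_a, const char *path_b)
    {
        FILE *fa = fopen(path_a, "rb");
        FILE *fb = fopen(path_b, "rb");
        long pos = 0;
        int ca, cb;

        if (fa == NULL || fb == NULL) {
            perror("fopen");
            exit(EXIT_FAILURE);
        }
        do {
            ca = fgetc(fa);
            cb = fgetc(fb);
            if (ca != cb) {
                fclose(fa);
                fclose(fb);
                return pos;
            }
            pos++;
        } while (ca != EOF);

        fclose(fa);
        fclose(fb);
        return -1;
    }

    int main(void)
    {
        long d = first_difference("shipped/server", "local-build/server");

        if (d < 0)
            puts("Binaries match: the shipped program came from the audited source.");
        else
            printf("Binaries differ at byte %ld: audit before trusting.\n", d);
        return 0;
    }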

    It is quite correct to state that "you have to put your trust in someone - who should you trust?". Otherwise the country would have to be run on every level entirely by one person, who would be responsible for writing, implementing and enforcing law. I'm not from the US but I'm sure your President would get tired of writing out all those speeding tickets!

    I would argue "you should trust someone who can prove they have nothing to hide".

    Open Source has nothing to hide. Come into the light.

  • by Mirkon ( 618432 ) <`mirkon' `at' `gmail.com'> on Thursday February 12, 2004 @04:27PM (#8262118) Homepage
    So, I guess I shouldn't take any of it seriously.
  • by BaronAaron ( 658646 ) on Thursday February 12, 2004 @04:33PM (#8262179)
    DevX.com has reported a recent drop off in website hits and has implemented a campaign to "leverage" the Slashdot masses.

    The new project entitled "Flaming Troll" was kicked off today with an article that would be very interesting and informative for your average Slashdot reader.

    So far the project seems to be a success ...
  • by SysKoll ( 48967 ) on Thursday February 12, 2004 @05:27PM (#8263159)
    As exemplified in this story [slashdot.org], we have already seen attempts at inserting backdoors into the Linux kernel.

    The attempts failed because of the meticulous scrutiny given by the "many eyes" watching each open source release.

    Anyone can write a new kernel patch. But getting a patch accepted is a whole different story.
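
    For the curious, that 2003 attempt hinged on the classic '=' vs. '==' trick. Here is a self-contained reconstruction in C, based on public accounts of the incident -- not a verbatim kernel excerpt:

    /* The single '=' (assignment) where '==' (comparison) belongs silently
       sets the caller's uid to 0 (root) while reading like an error check.
       The condition always evaluates false, so no error is ever returned. */
    #include <stdio.h>

    #define __WCLONE 0x80000000u
    #define __WALL   0x40000000u

    struct task { unsigned int uid; };

    static int fake_wait4(struct task *current, unsigned int options)
    {
        int retval = 0;

        /* Looks like input validation; actually a privilege escalation. */
        if ((options == (__WCLONE | __WALL)) && (current->uid = 0))
            retval = -22; /* -EINVAL */
        return retval;
    }

    int main(void)
    {
        struct task t = { 1000 };          /* an ordinary user */

        fake_wait4(&t, __WCLONE | __WALL); /* the magic flag combination */
        printf("uid after call: %u\n", t.uid); /* prints 0: now "root" */
        return 0;
    }

    Reviewers caught it precisely because the assignment looks wrong to a human reader, even though it compiles cleanly.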

    Conversely, years after the commercial, closed-source program Borland Interbase was released and used worldwide, it was found that it contained a back-door [cert.org].
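
    The Interbase hole was reportedly a hard-coded account (user "politically", password "correct") compiled into the server. A sketch of the general shape of such a back door -- illustrative only, not actual Interbase source:

    /* In a closed-source binary, a check like this is effectively invisible
       to users; in open source it would sit in plain sight for reviewers. */
    #include <stdio.h>
    #include <string.h>

    static int authenticate(const char *user, const char *pass)
    {
        /* Hidden "maintenance" account: always accepted. */
        if (strcmp(user, "politically") == 0 && strcmp(pass, "correct") == 0)
            return 1;

        /* ... the legitimate credential lookup would go here ... */
        return 0;
    }

    int main(void)
    {
        printf("%d\n", authenticate("politically", "correct")); /* prints 1 */
        return 0;
    }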

    So recent history proves the article is wrong. Facts demonstrate exactly the opposite of what the article rants about.

    Conclusion: the article is an unsubstantiated troll written by a Microsoftie eager to fart FUD at the Penguin. Ignore.

  • by Squeamish Ossifrage ( 3451 ) * on Thursday February 12, 2004 @05:38PM (#8263308) Homepage Journal
    I submitted the following response in a letter to the editor:

    Dear Sir or Madam,

    I am concerned that Mr. Jones's column of February 11th, "Open Source Is Fertile Ground for Foul Play," indicates a significant misunderstanding of open-source development processes. The argument presented is that all software development carries the risk that malicious code will be inserted by insiders, and that open source is especially vulnerable because more people are insiders. The first part is absolutely true, and applies to both closed- and open-source development as Mr. Jones acknowledges, but the second part does not stand up to scrutiny.

    Most open-source projects have only a small group of "core developers" who have the ability to modify the official source code, just as is the case with proprietary software development. Any malicious person could insert destructive code into his or her own copy, but not back into the official version. That leaves the possibility of intentional compromise by the core developers, or by subsequent distributors. The first is a risk, but less so than with proprietary software: The number of people in a position to corrupt the source is similar in both models, but the possibility of outside review reduces the danger for open-source software. Mr. Jones posits that core developers could avoid such scrutiny by not making the corrupted version public, but this is nonsensical: The version of the source code available for use is by definition also available for review.

    The other concern raised is that distributors who re-package open-source software could add vulnerabilities. Again, this is possible, but no more so than with proprietary software. It's easy for an attacker to add malicious code to compiled binaries; indeed, much pirated software is reported to contain viruses or Trojan horses. For both open-source and proprietary software, the solution is the same: Be careful who you get your software from. Downloading open-source software directly from the public sources or buying a packaged version from a trustworthy distributor is no riskier than buying, say, Windows directly from Microsoft or a system integrator like IBM. If a consumer buys either open- or closed-source software from Bob's Back-Alley Software and Pawn Shop, well, it's a bad idea either way.

    Open-source is not the security panacea that some advocates make it out to be, but it doesn't incur the added risks which Mr. Jones attributes to it, either. A government or other user which applies common sense to its software acquisition is no more at risk from open-source software than closed-source, and may even be a bit safer.

    Respectfully,
    Eric Anderson

    --
    Eric Anderson - anderson@cs.uoregon.edu
    University of Oregon Network Security Research Lab
    PGP fingerprints:
    D3C5 D6FF EDED 9F1F C36D 53A3 74B7 53A6 3C74 5F12
    9544 C724 CAF3 DC63 8CAB 5F30 68AE 5C63 B282 2D79
  • by randall_burns ( 108052 ) <randall_burnsNO@SPAMhotmail.com> on Thursday February 12, 2004 @05:54PM (#8263556)
    I have worked in environments in which criminal gangs were quite active, specifically banks that process credit cards (see www.outlander.com for my background).

    The claim that open source projects are especially vulnerable to infiltration by folks with malicious intent strikes me as strange.

    We have large companies like Oracle and Microsoft extremely dependent upon technical help from politically volatile parts of the world (e.g., India and Pakistan, where there was a serious threat of nuclear war not long ago), places where criminal and terrorist organizations can run operations they couldn't in a developed country. In India, for example, tens of thousands of people have been declared legally dead so that someone could seize their property, and the victims still can't clear up the issue years later.

    It isn't an issue of intent. Some overseas criminal organizations have a reputation for blackmailing countrymen who don't want to participate in criminal activity by holding their relatives hostage.

    Can the average US company really do an effective background check in this kind of environment?

    With an open source project, at least I have a reasonable chance of learning who the actual engineers of a project are, and I can judge its security by the reputations of the people involved. I _can_ get an independent examination of the code if I'm willing to pay for the service.

    Large "US" companies have this habit of substituting the cheapest possible resources with no consideration of long term consequences. How much is the word of a Larry Ellison or Bill Gates really worth on the subject of security? Would you bet your life on their judgement?

  • by Lysol ( 11150 ) on Thursday February 12, 2004 @06:59PM (#8264329)
    This will happen because the open source model, which lets anyone modify source code and sell or distribute the results, virtually guarantees that someone, somewhere, will insert malicious code into the source.

    Of course you can get the source code and modify it. However, 99.9% of the time you cannot commit it back to the tree without first getting to know the people running the project. What usually comes first is submitting patches to the project via a project member (usually a high-level member, since some level of oversight and accountability is needed).

    Once that 'trial period' has passed, a coder can usually check code into the repository head. However, in that respect I don't see any major difference from someone hired at [insert super software company here] who is a good citizen for a bit and then starts adding back doors to the code.

    The author assumes that as soon as you get a repository login set up on your machine, you're free to start fucking things up. That is highly unlikely, and since this gatekeeping is, in my view, the most fundamental piece of team programming, his argument dies right there.

    As for distributing the results, that argument is also flawed, not by logic but by market forces. Even if someone got hold of the entire Red Hat repository, or Evolution for that matter, I don't think people would use the resulting product, for a few reasons:
    1. It lacks credibility. Forks have a hard enough time gaining interest beyond the project they forked from. So why would someone fork something just to insert back doors and take over the world? It seems like an awful waste of time and effort. And just because you fork it doesn't mean they'll come.
    2. Even if a 'malware' fork happened, it wouldn't stay afloat long. It would probably take less than a day for someone to figure out something was going down and spread the word. Again, the open source community is the key here. You wouldn't see that happen behind closed doors.

    This guy lives in a fairytale land of spooks and secrets and bad guys around every corner. While I'm sure there have been plenty of fallings-out among people in various projects and groups, it's highly unlikely that any of the scenarios the author plays out will ever come true. In any ecosystem, only the strong survive, and I just can't see some 'malware' fork being released and taking over everything. In fact, the worst-case infections and money-losers to date have all happened in the ActiveX/DevX/.NET/M$ proprietary, closed-door, secret world. Of course this guy has this opinion. He exists in a world where everyone is paranoid and everything that isn't yours is evil, doomed to failure, or ripe for punishing.

    Free your mind...
  • by dhall ( 1252 ) on Thursday February 12, 2004 @08:56PM (#8265412)
    http://www.lowth.com/alist/author/-/A%20Russell%20Jones/1

    Mastering ASP.NET with VB.NET
    Mastering ASP.NET with Visual C#
    Visual Basic Developer's Guide to ASP and IIS
    .NET Programming 10-Minute Solutions

    Now, he may be serious in his accusations against open source, but his message borders on the evangelical. He comes across as a proprietary Microsoft zealot, which is no better or worse than a rabid Linux zealot.

    There's already a rebuttal editorial on DevX.com's main webpage by another engineer there.

    http://www.devx.com/opensource/Article/20135

    Now, as to whether this was some kind of publicity stunt to garner more traffic to their website (before today I'd never heard of them): it's been quite successful. They've probably seen more traffic today than in quite a while, but it seems like an infantile cry for attention.

    Why not? It's obvious that absurd and completely ridiculous claims can be made for public perusal (see SCO) and gather quite a bit of the media spotlight. That precedent is already set in a culture that favors glitz and glamor over substance.
  • the pay (Score:5, Insightful)

    by Tom ( 822 ) on Friday February 13, 2004 @03:01AM (#8267535) Homepage Journal
    "You get what you pay for."

    Flawed assumption: There is a direct relation between quality and price.

    Why is it wrong? Because in the real world, where some of us still live, many factors aside from quality influence the price. Here is a short list of some:

    * Quantity, which lowers per-unit prices
    * Price perception, e.g. brand vs. no-brand
    * Delivery, packaging and other overhead costs
    * Regulations, legal costs and other burned money
    * Intentional price modifications, e.g. dumping

    And then, of course, the entire logic only applies to things that are actually sold. Any math person knows that comparisons with zero are always dangerous. Quick, what's two times zero? Maybe we should just double the price for Linux, then (in his eyes) it becomes twice as good. :)
