US DHS Testing FOSS Security
Stony Stevenson alerts us to a US Department of Homeland Security program in which subcontractors have been examining FOSS source code for security vulnerabilities. InformationWeek.com takes a glass-half-empty approach to reporting the story, saying that for FOSS code on average 1 line in 1000 contains a security bug. From the article: 'A total of 7,826 open source project defects have been fixed through the Homeland Security review, or one every two hours since it was launched in 2006 ...' ZDNet Australia prefers to emphasize those FOSS projects that fixed every reported bug, thus achieving a clean bill of health according to DHS. These include PHP, Perl, Python, Postfix, and Samba.
What about MS? (Score:5, Funny)
Re:What about MS? (Score:5, Interesting)
Actually, it would be really nice if it were possible to do this with Microsoft. Microsoft (or most other companies that produce proprietary software) certainly can't do better than the open source projects, and their code certainly contains at least as many issues as the ones found in open source projects.
The ability to do code audits has always been one great advantage of open source software, but until now it was mostly theoretical. Now we are starting to see big code audit projects such as this one, showing that the advantage is real and that the results are good: some of the projects have already patched all of the issues, and most of the others will certainly finish patching them soon. This shows that open source is here to stay, is going mainstream, and will not be stopped by any company's interests.
All issues that currently exist in Microsoft's code, on the other hand, will remain unpatched, unless they hire some consulting company (why not the same one?) to audit their code, certainly under NDA. But you can be sure that if they do, for one, they won't publish how many issues were found. No transparency there. And many issues probably won't be fixed as promptly as they were in the audited open source projects. This is not speculation: if you look at how long it takes them to fix issues for which public security vulnerability reports have been issued, you realise that the ones only they know about will certainly take much longer.
Re: (Score:3, Informative)
Now, I realise it doesn't change your point at all, but it's not like MS is the only entity with access to their own code: they have dedicated programs to share even their most closed pieces of code with their customers (if those customers are important enough).
Re: (Score:2, Insightful)
It also shows that open source has failed to use a common tool to self audit - it took a third party to do so.
Re:What about MS? (Score:5, Insightful)
It also shows that open source has failed to use a common tool to self audit - it took a third party to do so.
Since an audit is usually an independent review, I see it as only logical for it to have been done by a third party.
The point is, it is open. Anyone may perform an audit at any time they wish to do so.
And everyone apart from the developers themselves and the users of the software is a third party, by definition.
Re: (Score:3, Insightful)
Re:What about MS? (Score:4, Insightful)
That is what open source is all about: anybody can contribute their worthwhile efforts to it. Contribution to open source not only includes code; it also includes auditing, actual innovation, and even activities like distribution, documentation, promotion and support.
So your illogical claim of failure is in reality an open source success. I will never understand why closed source proprietary zealots just don't get it; I suppose it just goes to prove that greed and stupidity really do go hand in hand ;).
Re:Wow... FOSS looks pretty pathetic (Score:5, Informative)
Yes, OSS has bugs. Everything from compilers to content management systems, surely. So do proprietary programs.
The more qualified eyes you get on a bug, the better chance you have of finding and fixing it. You can do that by having a big staff that pores over code again and again. You can do it by having lots of outside help, as in the case of popular OSS projects. One thing that helps is having a fresh set of eyes look over something, which is much easier in OSS than in closed-source applications.
BusinessWeek had an article from a guy at Coverity back in 2006 about this. In that article [businessweek.com], Ben Chelf said that 4 of the top 15 programs on the quality scale measured by defects per thousand lines of code were OSS. He said that on average, the major-project OSS software they tested was indeed higher quality software than average. He said, though, that the absolute highest quality code was the cream-of-the-crop proprietary, closed source code from places that make things like fly-by-wire systems. Well, yeah. I'd want my airliner's fly-by-wire system completely bug-free, too.
Commercial software tends to harbor anywhere from 1 to 7 bugs per 1000 lines of code according to the National Cybersecurity Partnership's Working Group on the Software Lifecycle [zdnet.com]. Voluntary testing by Coverity requested (and probably paid for) by MySQL AB revealed that project to have all of 97 flaws, one of which could be a serious security issue. All 97 were to be fixed for the next release.
A similar study (same link) found 985 bugs in over 5,700,000 lines in the Linux kernel, or roughly one bug per 5,800 lines of code. TFA has data on a newer version of the kernel -- 0.127 bugs per TLOC.
In Apache, 22 bugs total, 0.14 per TLOC, and three fixed so far.
PostgreSQL had 0.041 per TLOC, and has so far fixed 53 of the 90 bugs.
The glibc team fixed 83 of 83 bugs found.
OpenVPN had one security-related bug found in over 69,000 lines of code. As of late yesterday, it's officially security-bug free according to the same testing people.
The list of officially security-bug free software [zdnet.com.au] includes Amanda, NTP, OpenPAM, OpenVPN, Overdose, Perl, PHP, Postfix, Python, Samba, and TCL.
So with Linux (0.127), glibc (0.000), Apache (0.140), PostgreSQL (0.041), Perl (0.024), PHP (0.000), and Python (0.000) powering a web server (numbers according to Coverity [coverity.com]), you have 0.0474 defects per thousand lines of code across the server. I'd say that's pretty good.
Re:Wow... FOSS looks pretty pathetic (Score:5, Informative)
I'd say your statistic is wrong. You need to multiply each average by the number of kloc in that project (being careful to use the line counts for the project versions the averages were given for), and then divide by the total kloc across all projects.
Re:Wow... FOSS looks pretty pathetic (Score:4, Interesting)
It's also unlikely that any real installation would have exactly those packages installed, BTW. Almost any installation will have packages from CPAN, PEAR, whatever Python's central repository is called, some extra stuff like syslog, logrotate, bash, and at least one text editor at the very minimum.
Let's be a little more accurate than averaging defects-per-thousand-lines rates, to make up for my previous late-night gaffe. Let's use the actual counts of verified-but-unfixed and unverified defects.
Apache has 19 defects in 135,916 LOC.
glibc has 0 defects in 588,931 LOC.
Linux has 461 defects in 3,639,322 LOC.
Perl has 12 defects in 496,517 LOC.
PHP has 0 defects in 474,988 LOC.
PostgreSQL has 37 defects in 909,148 LOC.
Python has 0 defects in 282,444 LOC.
That's 6,527,266 LOC and 529 defects. That's 6527.266 TLOC. I get 0.081 defects per TLOC. That's still pretty damn good.
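Since we're checking arithmetic, here's a minimal C sketch of that calculation (the figures are just the ones quoted above; nothing else is assumed), showing why you weight by LOC instead of averaging the per-project rates:

    #include <stdio.h>

    /* Aggregate defect density across projects, weighted by LOC.
     * Figures are the per-project numbers quoted above. */
    int main(void) {
        const char *names[] = { "Apache", "glibc", "Linux", "Perl",
                                "PHP", "PostgreSQL", "Python" };
        long defects[] = { 19, 0, 461, 12, 0, 37, 0 };
        long loc[]     = { 135916, 588931, 3639322, 496517,
                           474988, 909148, 282444 };
        long total_defects = 0, total_loc = 0;
        for (int i = 0; i < 7; i++) {
            printf("%-11s %4ld defects / %7ld LOC\n",
                   names[i], defects[i], loc[i]);
            total_defects += defects[i];
            total_loc     += loc[i];
        }
        /* 529 / 6527.266 TLOC = 0.081 defects per TLOC */
        printf("total: %ld defects / %ld LOC = %.3f per TLOC\n",
               total_defects, total_loc,
               total_defects / (total_loc / 1000.0));
        return 0;
    }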
As I said, there's probably some other software on that server, but it starts from a pretty strong base.
PHP - no security bugs! (Score:2, Funny)
This is because the security problems with PHP aren't bugs; they designed it that way.
Re: (Score:3, Interesting)
Every file on a Windows box has execute permission set. This appears to be a designed behaviour of Windows. If you do not perform a chmod on it after upload, it keeps its execute bit. This is entirely to be expected, and any oth
Rabid MS hater? (Score:3, Informative)
Where did you pull the 1% of OSS users being programmers from? Your ass? You didn't even cite your own ass? How rude!
Yeah, there aren't enough world-class programmers to go around the millions of OSS projects out there, or even the most popular hundred thousand of them. Maybe not the ten thousand most popular. Yet over half the patches for the Linux kernel come from people other than the core development
Fixed? (Score:5, Funny)
Do they mean fixed [wikipedia.org] or fixed [wikipedia.org]?
Looking good, too bad the press didn't understand (Score:5, Insightful)
You can't ever say that proprietary software is secure, because there's no way to prove it. With Open Source, you can come a lot closer to proving that it is secure, because you can employ every security test that exists.
The fact that a Coverity scanner bug is reported doesn't mean it's an exploitable security flaw.
Bruce
Re:Looking good, too bad the press didn't understa (Score:5, Insightful)
I'm a software security engineer. I can look at source code and tell you if it has some bugs in it that I would consider relevant to security. If I can't find any, I might tell you that it is more secure than if I could... but that doesn't mean it is secure. I'll never tell you it is secure, because testing simply can't give you that. I can do this on proprietary software or I can do this on Open Source software.. the only difference is that, with Open Source software, I don't need permission from someone to do the testing, and other people don't need permission to check my work.
Does this mean that more people will check the Open Source software for security flaws? Not necessarily. It completely depends on whether or not someone has an interest in the security of that particular bit of software. Even assuming a similar level of interest in the security of comparable proprietary and Open Source software, there's no guarantee that those who have an interest in testing the Open Source software for security flaws will report back the findings. They may simply decide that the Open Source software is too insecure for their use and go with the proprietary solution - assuming they can have it similarly tested by a trusted third party.
All in all, the assumption that Open Source software is more secure than proprietary software is most likely true, but there's no hard data.. because the stats on the insecurity of proprietary software are guarded secrets - and that's probably the best reason to assume that proprietary software is less secure.
Re:Looking good, too bad the press didn't understa (Score:5, Insightful)
I submit that people who are only looking for security flaws don't have a motivation to develop a deep understanding of the software. People who are out to modify the software do. And thus there are not just more eyes, but better eyes with Free Software.
There is a class of mathematically provable software languages, and you might be able to say with surety that programs in them are secure. For the languages we usually use, you can only say that you have tested them in the ways you know of. And only a person with access to the source can say that. If you want an independent assessment, Open Source software won't stop one from happening, and won't hinder what can be said with NDAs. That's why I think it's more secure.
Bruce
Re: (Score:3, Insightful)
I submit that people who are only looking for security flaws don't have a motivation to develop a deep understanding of the software. People who are out to modify the software do. And thus there are not just more eyes, but better eyes with Free Software.
No offense, but that's completely the opposite of the facts. The vast majority of software engineers have no idea what they're doing when it comes to detecting, fixing and avoiding security issues. That's why tools like Coverity exist - and most of the time the programmers can't even use them correctly. There are "security consultants" you can hire who basically just explain the results from Coverity, and they're not short on work.
But hey, don't take my word for it.. go have a chat with your friend Theo de
Re: (Score:2)
Ouch... Talk about throwing to the wolves. However, if you want to have a well-informed (albeit somewhat lacking in social graces) person to comment on the state of security in general, he'd probably be quite a good choice. As to the state of security sense/awareness in programmers, he'd probably be one of the best
Re: (Score:3, Insightful)
It's possible to prove almost anything about programs and operating systems, from type safety and runtime guarantees to any arbitrary set of predicates you want the system to satisfy. That assumes
Re: (Score:2)
I don't really see how it's irrelevant - if a "security defect" exists but cannot be exploited (i.e. if there's a buffer overflow bug but it deals with internal data or data that's already been thoroughly sanitized), it does not present the same risk as a bug that may be easily exploited, for example in the input sanitizing code. It's not really clear how many of these bugs are of each type, and I think it's significant tha
Re: (Score:2)
Re: (Score:3, Interesting)
Willing to stipulate for the purpose of this discussion.
However, there were dozens of Windows viruses and hundreds of thousands of compromised machines, and zero MacOS viruses.
Likewise willing to stipulate.
Thus, while a certain measure of vulnerability is comparable, the likelihood of actually being attacked is infinitely higher with Windows.
I would suggest this doesn't necessarily follow. It could
Re: (Score:2)
As for the use of terminology, excuse me for using an accurate term like "defect" instead of a more popular colloquialism like "hole".
RTFA (Score:5, Informative)
Mod parent up! (Score:2)
Pessimism in article (Score:5, Informative)
The article does say open source code is much like its commercial counterpart, but most of the numbers it shows are actually good for open source software.
For instance, most of the projects discussed had fewer than 1 bug per 1000 lines of code. The Linux kernel, for example, had 0.127 bugs per 1000 lines, and that on over 3 million lines of code.
Also, the article talks about key projects, such as glibc (which is used by basically everything on a Linux system), that have already fixed all the issues.
Even something as huge and complex as Firefox has already fixed half of the issues, and is showing progress on the rest of them (some have already been verified).
Overall, I didn't get the glass-half-empty tone that the summary implies. And what I found strange is that even the comments on the site itself, and many of them here on /., are also taking the pessimistic view.
I think this news is great for open source software. It shows that it has fewer security issues than average, that the issues are fixed quickly, and that some programs are certified by a company for use in security-related departments such as the DHS. What could be better than that?
Re: (Score:2)
Actually, the first line of the article reads "Open source code, much like its commercial counterpart, tends to contain one security exposure for every 1,000 lines of code, according to a program launched by the Department of Homeland Security to review and tighten up open source code's security."
The problem is: how do they know how many lines of code are in closed commercial programs if they can't see the code?
Falcon
Re: (Score:2)
Gee good point! How could they possibly view the code if it's not open to all? I mean, it's not as if there's any possibility they could've gotten a bunch of companies to agree to let them audit their code provided they only released the results in aggregate, without any identifying information.
Just because it's not open source doesn't mean that nobody is ever able to gain access to it.
how many lines of code? (Score:2)
In many cases they can see the code, albeit under pretty restrictive NDAs.
Which is why I consider open source more secure: anyone can find a hole, and anyone, well, programmers at least, can fix it. With closed source code, a review can be restricted from informing the code's users of the problems it has.
Falcon
Re:Looking good, too bad the press didn't understa (Score:3)
Re: (Score:2)
Yes they said that, but you don't really believe it, do you? If so, just look up "security by obscurity" and read about it. To give you a clue, the unavailability of source has not prevented 100,000 Windows viruses.
Re: (Score:2)
Re:Looking good, too bad the press didn't understa (Score:4, Funny)
A more apt analogy would be: There's no point in locking your door using a limp spaghetti noodle because a limp noodle makes a completely ineffective lock.
Re: (Score:2)
Re: (Score:2)
It's barely a speed bump for the evil hackers who feed garbage into programs to crash them, then poke around in a debugger to find where they broke, then write some machine code to take advantage of the bugs. Thinking that lack of access to source code is anything like a "lock" is just self delusion.
Re: (Score:2)
Re: (Score:2)
Something cannot be secure without obscurity.
It's like the physics concept: Observation is Interaction.
Re: (Score:2)
Bruce
Re: (Score:3, Insightful)
Even those who historically have criticized "security through obscurity" never suggested that publishing their design or secrets would lead to better security, but rather that you can't assume that your design can't be cracked.
Of course, the preferred approach is "security through design", which has nothing to do with correcting bugs. The latter could be called "security through maintenance". Thus while we might argue about whether closed
Re: (Score:2)
You're wrong about that. For example, NIST, a US government standards agency, is calling for proposals for a new cryptographic algorithm for government use. Their specification [nist.gov] requires that it be publicly disclosed (and royalty free, too). This is so that they don't pick a weak algorithm. They want any known or theoretical problems to be pointed out to
Re: (Score:2)
In this post-9/11 period I've seen a trend toward more secrecy rather than less. For example, the documents that described the military's UHF DAMA waveforms used to be freely available on the Internet, but they aren't now.
Re: (Score:3, Informative)
An IT Security article on full disclosure [itsecurity.com] states that as early as the middle of the 19th century, locksmith Alfred C. Hobbs thought full disclosure was important to clear up the rash of lock picking people were experiencing. It goes on to discuss exactly why full disclosure works so well.
David Wagner says in an article on security: "Today, many [berkeley.edu]
Re: (Score:2)
Re: (Score:2)
Yeah, lots of software gets written in assembly, C or C++ that probably should be in something else. No, nothing else is able to take their place for everything just yet.
Re: (Score:2)
I think your logic is a bit confused. The fact that viruses can be created without reading the source code does not prove that there's no value in keeping the code secret. It's like arguing that there's no point in locking your door because 100,000 houses with locks were broken into.
Fact is, anybody can disassemble a lock. And of course people can disassemble code.
Not too many people would be interested in picking the lock on a door (smashing a window to get into the house is what non-government intruders generally do).
The greatest value in keeping code secret is making sure it cannot be easily reproduced, thus preventing other individuals or companies from using it without authorization. It's much like music and DRM: in the end it is the licenses which are enforced in
Physical locks and security by obscurity (Score:2)
Anyone can buy a re-key kit for Schlage locks at the Home Depot. Upon opening the cylinder of the lock with that kit, you will discover that (this is approximate, I don't have the lock in front of me) there are 5 pins and 5 possible levels per pin, and that the minimum number of possible key patterns might thus be 5^5, or 3125. Which is enough that nobody's carrying all of the possible keys around and will have
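For what it's worth, the arithmetic there is just levels^pins. A tiny C sketch (assuming independent pins, which master-keying and manufacturing tolerances reduce in practice):

    #include <stdio.h>

    /* Key space of a pin-tumbler lock: levels^pins.
     * 5 pins with 5 depths each gives 5^5 = 3125 keys. */
    int main(void) {
        int pins = 5, levels = 5;
        long combos = 1;
        for (int i = 0; i < pins; i++)
            combos *= levels;
        printf("%d pins x %d depths = %ld possible keys\n",
               pins, levels, combos);
        return 0;
    }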
Re: (Score:2)
There are probably few people who have read every line of the Linux kernel - imagine trying to disassemble Vista looking for vulnerabilities.
"The greatest value in keeping code secret is making sure it cannot be easily re-produced, and thus subverting other individuals or companies from using it without authorization."
Perhaps, but your statement says nothing about security issues.
"Yes there may be value in keeping cod
Re: (Score:2)
Nobody can reasonably argue that not having the source code makes it easier to create exploits.
The point is that if there are exploits, it can be easier to fix them. Of course it doesn't guarantee they will be fixed (as in closed-source software), but the opportunity is there for global assistance and peer review (of the code and the fix) that is not available in closed-source software.
As well, open-source software makes it easier to find built-in vulnerabilities (the JAP proxy software, for example, was found by examination of its source to have a secret back door for the German police).
Re: (Score:2)
Re: (Score:2)
According to McAfee recently (http://yro.slashdot.org/article.pl?sid=08/01/05/0215201) and Microsoft et al, having your code exposed lets the bad guys exploit its vulnerabilities
Yes they said that, but you don't really believe it, do you? If so, just look up "security by obscurity" and read about it. To give you a clue, the unavailability of source has not prevented 100,000 Windows viruses.
No I do not believe it. I was just pointing out some (IMHO) rather lame and biased arguments. Openness and transparency (whether it be in software, business models, or just dealing with one's spouse, for example) is generally better than keeping things hidden.
Make the licenses as restrictive as you please, but at least give people the opportunity to know what they are using. Like listing ingredients on processed food, it's good to know that I'm not consuming something that could possibly do me harm (or be
Re:Looking good, too bad the press didn't understa (Score:5, Informative)
Indeed. FTFA:
One can only speculate about the, er, source of their discomfort.... 8^)
1 per 1000 lines is even more impressive as an average across all 180 FOSS applications tested. Most impressive of all are the highlights:
Even some of those with more bugs have at least responded well:
And my favourite 'backslider' of all, OpenVPN, has yet to fix 100% of the bugs found during this exercise. Of course, that's only 1 bug in over 69,000 lines of code....
These results should be viewed as excellent, by and large. This doesn't mean all this software is bug-free, just that there aren't a lot of easily preventable bugs in the code base. Most encouraging, though, is how fast they got addressed and fixed by the healthier FOSS projects.
Re: (Score:2)
Less encouraging is that they existed in the first place - doubly so since all the software you list is more-or-less 'mature'.
Re: (Score:2)
Re:Looking good, too bad the press didn't understa (Score:5, Informative)
strcpy is NOT insecure. It can be used insecurely.
But congratulations, you've just turned what could have been a borderline ok strcpy(src, dst) (ought to have been criticized at code review as the names of the variables are confusing) bit of code into (probably) a crash and definitely a buffer overrun if sizeof dst is larger than sizeof src.
I have lost count of the number of bugs I've had to fix after someone changed a perfectly good strcpy into strncpy. A common mistake is:
strcpy(dst, src);
becomes
strncpy(dst, src, sizeof dst);
and then you get a bug because only the first four characters of src appear in dst, followed by garbage (dst here being a pointer, sizeof dst is the size of the pointer -- 4 bytes on a 32-bit system -- not the size of the buffer it points to).
Of course, then it gets changed to
strncpy(dst, src, strlen(src));
because the original programmer did know what they were doing and the buffer was big enough.
Eventually we get to the brilliant:
strncpy(dst, src, strlen(src)+1);
Fantastic! What an improvement! And yes, it really does happen in what was once good production code because some idiot has heard that "strcpy is insecure".
Another one I've seen is:
dst = malloc(1000000);
strcpy(dst, "MESSAGE");
gets changed to
dst = malloc(1000000);
strncpy(dst, "MESSAGE", 1000000);
Yup, instead of writing 8 bytes, we'll write one million bytes because strcpy is insecure, but we won't fix the missing check for NULL. (there's a fairly good argument for not checking the return from malloc in much production code - if malloc actually fails then you're already so far up shit creek without a paddle that it's probably impossible to recover gracefully anyway. Obviously different considerations will apply if you're controlling a nuclear power plant than if you're writing a game)
strncpy is NOT a replacement for strcpy with a length parameter. Unfortunately strncpy has a very bad name, it should be called something like meminit_from_str() as strncpy ALWAYS writes n bytes and doesn't always write a null terminator. (I've also had to fix bugs where someone has replaced a correct use of strncpy with a version that guarantees to write the null)
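To make those two behaviours concrete, here's a small self-contained demo (the buffer sizes are invented for illustration; this is standard C library behaviour, not anything project-specific):

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char small[4];
        char big[16];

        /* src longer than n: exactly 4 bytes copied, NO null
         * terminator -- printing small as a string would read
         * past the end of the buffer. */
        strncpy(small, "HELLO", sizeof small);
        printf("small is %sterminated\n",
               memchr(small, '\0', sizeof small) ? "" : "NOT ");

        /* src shorter than n: copies "HI\0" and then pads the
         * remaining bytes with '\0' -- the meminit_from_str()
         * behaviour described above. */
        strncpy(big, "HI", sizeof big);
        printf("big had all %zu bytes written\n", sizeof big);
        return 0;
    }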
strncat is a possibly safer replacement for strcat. However, the length parameter is so tricky to get right that I've seen cases where someone originally wrote strcat safely; that got changed to strncat "because it's safer", and a bit later another change caused a crash because the original change to strncat had got the length parameter wrong.
extern char error_msg[][40];
char error[64];
strcpy(error, "ERROR");
strcat(error, error_msg[e]);
becomes
strncpy(error, "ERROR:", sizeof error);
strncat(error, error_msg[e], sizeof error - 6);
becomes
strncpy(error, get_translation("ERROR:", lang), sizeof error);
strncat(error, translated_error_msg(e, lang), sizeof error - strlen(error));
of course, it's even more common to miss the -6 or the strlen(error) entirely than it is to remember the extra -1 that's actually required on the length parameter (strncat appends up to n characters and then writes a terminating null, so n must leave room for that null).
(The man pages are, IMO, confusing for strncat, as they usually say something along the lines of "appends at most n characters".)
Tim.
strcpy, strncpy, strlcpy, etc (Score:2, Informative)
see <http://www.courtesan.com/todd/papers/strlcpy.html>.
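The interface from that paper is roughly the following (a from-memory sketch under a hypothetical name, not the canonical implementation; see the link for the real thing). The key properties: it always null-terminates when size > 0, and it returns strlen(src), so truncation is detectable as ret >= size:

    #include <stddef.h>
    #include <string.h>

    /* strlcpy-style copy: writes at most size-1 bytes plus a
     * terminating null, and returns strlen(src) so the caller
     * can detect truncation with (ret >= size). */
    size_t my_strlcpy(char *dst, const char *src, size_t size) {
        size_t srclen = strlen(src);
        if (size > 0) {
            size_t n = (srclen < size - 1) ? srclen : size - 1;
            memcpy(dst, src, n);
            dst[n] = '\0';
        }
        return srclen;
    }

With an interface like that, the error-message example above becomes two calls whose length argument is simply sizeof error, with no -6, no strlen(), and no extra -1 to forget.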
Re:Looking good, too bad the press didn't understa (Score:2)
Oh man... Bruce Perens. What a pleasure. I couldn't have said it better myself.
(Actually, I was going to make fun of proprietary software for the general idea of having source unavailable).
More to the point though, I received a lecture on this [typepad.com] in a Software Architecture course a couple years ago and it struck a nerve. Even if you never need to review 99.9% of the code you run, it is nice to be able to look through the 0.1% that might be helpful for you to gain a better understanding of what is going
Re: (Score:2)
http://www.linuxtoday.com/developer/2006031800826OSCYDV [linuxtoday.com]
http://www.freebsd.org/doc/en_US.ISO8859-1/articles/committers-guide/coverity.html [freebsd.org]
Worst of all, these articles haven't disclosed the classes of software issues detected. I'm sure huge classes of deadlocks and other system-wide issues go undetected. Even if the point of Coverity is to conduct system-wide analysis, I'd still say lar
Re: (Score:3, Informative)
Re: (Score:2)
Must be run by Engineers... (Score:5, Funny)
The Actual Scan Site (Score:2, Informative)
Wow important stuff (Score:4, Funny)
Damn we better protect ourselves from Terrists hiding their WMD's in ASCI art
I wouldn't sweat it. (Score:2)
False positives (Score:3, Interesting)
Re: (Score:3, Informative)
Well... (Score:5, Insightful)
Re: (Score:2)
Of course, you realize you are not setting the bar very high...
This seems like a genuinely useful activity for (Score:2)
DHS, certainly more valuable than x-raying my shoes and confiscating my saline solution.
What would be more valuable would be to get rid of DHS.
Falcon
L, A and P, but where's M? (Score:5, Interesting)
The popular MySQL open source database was not included in the scans for reasons that were not immediately evident.
Any suggestions as to why MySQL has no results? I'm stumped and wondering why one whole corner of a LAMP foundation was left unchecked.
Re: (Score:2)
Scanners can have bugs too. Maybe feeding the MySQL source code into it caused it to error or crash for whatever reason.
Or maybe licensing issues? Although I doubt it, IIRC MySQL is GPL or something.
Re: (Score:3, Informative)
MySQL uses Coverity and Klocwork [mysql.com] on their certified versions on several different platforms. The certified versions are based on the major community releases, and are typically just more conservative in that they only make changes for critical and security bugs [livejournal.com].
There's speculation that the community edition tested was actually an old report without a retest even back then, as the certified version based on that community version had zero d
Re: (Score:2)
some notes on the article (Score:5, Interesting)
My experience is with the C/C++ version of the tool. We have also been evaluating the Java version of the tool and it is good, but some of the free alternatives like FindBugs are still better. I would use FindBugs with Prevent for Java if I wanted good coverage.
Well THAT IS TOTALLY COOL! (Score:2, Interesting)
We get all the bug fixes, and it will become that much more robust. Too bad that Windows will never get this kind of review. It probably has a few fewer bugs per line, but not much hope of getting those fixed.
On second thought, Mr Allen, I challenge you to compare! I am willing to bet that FOSS software, just because of its nature of peer review, and from my experience of reading Alan Cox's work on the kernel, has fewer bugs than Windows.
Bias in results (Score:2)
IIRC, Coverity is the commercialization of the "Stanford Checker" static analysis tool. By most accounts, it's a pretty nifty tool. Back when it was still a research project, some of the folks working on it would run it against different parts of the kernel and post the bugs it found to LKML. It was a mutually beneficial relationship--the kernel people fixed a lot of bugs, and the Stanford folks got analysi
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Good point. It's too bad they can't do both.
Homeland Security (Score:2)
I guess this benefits open source software, since any bug fix is a good thing, but why on earth would the Department of Homeland Security be studying software? Shouldn't they be worrying about things like preventing biological attacks or improving how they handle natural disasters?
Such as hurricanes?
Falcon
Re: (Score:2)
Re: (Score:2)
Actually, I seem to recall DHS paid a CIA analyst to do an impact study on global warming and climate change. Instead of looking at where temperatures are changing (which is what scientists are concerned about), the question posed was what these changes will mean. The result was that cities are not that robust. More than hurricanes, but I'm sure they were a part of it...
DHS did another one of those studies? The Department of Defense did one years ago. And it didn't paint a pretty picture.
Falcon
Re: (Score:2)
Re: (Score:2)
I guess this benefits open source software, since any bug fix is a good thing, but why on earth would the Department of Homeland Security be studying software? Shouldn't they be worrying about things like preventing biological attacks or improving how they handle natural disasters?
Software is used in important areas that are vital to a country's basic infrastructure and operation, such as power plants (nuclear or otherwise), radio and television stations, cellular phones, the Internet, banking, etc. I think the Internet would be one of the most important as it is a major source of commerce and communication.
Example:
Matthew Kovar, a senior analyst at the market research firm Yankee Group, generated some publicity when he told reporters the attacks caused USD $1.2 billion in global economic damages.
- http://en.wikipedia.org/wiki/MafiaBoy [wikipedia.org]
The attack was aimed at DNS root servers. Since that time the router's software has been upgraded to prevent such wide-scale damag
Re:"The" PHP? (Score:4, Informative)
Re:"The" PHP? (Score:5, Funny)
How could he possibly know that? He said already that he stopped reading after 'the PHP'.
Indeed (Score:2)
Re: (Score:3, Funny)
So close. Let's turn those into a proper Tcl list, shall we...
set thislist [list Samba {the PHP} Perl {Tcl dynamic languages} Amanda]
No, I think he's deliberately speaking with a LISP.... 8^)
Re: (Score:2)
Re: (Score:3, Funny)
Security and computer science as explained by a valley girl?
Like totally!
Re: (Score:2)
Re: (Score:2, Funny)
Re: (Score:2)
please become an hero.
A slight twist.... (Score:2)
Re: (Score:3, Insightful)
a) it is too restrictive, and would disqualify the GPL as free software. Remember that the GPL is a distribution license, not a list of restrictions. You should be able to talk to other people (even publicly) about software without contacting the maintainer first. The behavior you describe is responsible, and generally recommended, but should not be forced.
b) as you have it worded, if the restrictions were followed, it would enable a maintainer to prevent anyone
publicly speaking about software (Score:2)
c) It is not enforceable in most jurisdictions. In the US, and I assume most of the "free world", you can't prevent someone from talking about your products publicly. You can have them sign an NDA, but that doesn't work for publicly available software. McAfee tried something like this some time ago, stipulating in the EULA that you can't benchmark their software. It got shot down in court.
I haven't seen one in years, but doesn't Microsoft's EULA have a clause saying that you can't publish a review of the software
Re: (Score:3, Insightful)
Not all bugs are easily reproducible - and not all bugs are found by tripping over them. Conside
Re: (Score:2)