Tech Companies Ask U.S. to Regulate Cyber Security 371
qtp writes "Wired reports that a group called the National Cyber Security Partnership, which consists of 'leading software companies' including Microsoft and Computer Associates and industry organisations such as the BSA, has asked the Department of Homeland Security to regulate what they call 'Cyber Security'. Representatives from Microsoft, Computer Associates, and the BSA headed the Security Across the Software Development Cycle Task Force that submitted this report to the Bush administration today. (For all of you who dread reading 123-page reports, there is a three-page summary available as well.) The Washington Post, Forbes, and Other Sources are covering this story as well. I hope this is just another [late] April Fools' Day joke, but I'm afraid it looks too scary to be real."
Smells like a replay of the AT&T monopoly (Score:5, Interesting)
This smells to me of the same process. Being sued for security holes would be much more effective at increasing security than some hare-brained government regulation scheme. After having thought up all those EULAs that disclaim all responsibility, and blustered about there being no one responsible for Linux, this is just another big corporate scheme to maintain their power, squash the small guys, and place the blame elsewhere.
The proper way to improve security is to invalidate all those EULA disclaimers. A few big lawsuits with billions in damage verdicts would do far more to focus Microsoft's attention than any government regulatory body.
Re:Smells like a replay of the AT&T monopoly (Score:2, Interesting)
Not what I said or think (Score:5, Insightful)
Whereas open source fixes the problems without blaming others.
Re:Smells like a replay of the AT&T monopoly (Score:5, Interesting)
You make a good point about affecting large corporations with lawsuits, but who gets sued when my Linux server gets hacked?
In order to claim damages in such a lawsuit you would have to prove that the company in question knew about a vulnerability and didn't fix it, thereby showing negligence on the part of the company.
To apply this to OSS you first need to distinguish between free and Free software. If the Linux distro you were using was Open Source but commercial, meaning you paid money for it, making it Free (as in speech) but not free (as in beer), then the same rules would apply. They would be responsible for damages if they knew about a vulnerability but didn't patch it.
If the software was free (as in beer) then the developers shouldn't be held responsible for any flaws in the software. There is no contract between you and them, they have not promised you anything by allowing you to use their software free of charge.
By making this distinction you make commercial OSS software developers equally liable for negligence without opening up small OSS projects to litigation they have no chance of surviving.
This is of course all hypothetical, as at the moment no software companies accept any responsibility for flaws in their software. And of course IANAL.
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
Who gets sued when my Windows server gets hacked? Microsoft? Its EULA disclaims all responsibility, so you can't sue them either. I find it strange that Microsoft's selling strategy is "you can sue us!" Especially since you usually can't.
Re:Smells like a replay of the AT&T monopoly (Score:3, Interesting)
The proper way to improve security is invalidate all those EULA disclaimers. A few big lawsuits with billions in damage verdicts would do far more to focus Microsoft's attention than any government regulatory body.
Yeah, that will make a lot of companies/independent coders want to release code. Imagine not releasing code until you are positive there are no exploits or holes in it. I don't see too many claims of *cough* unbreakable software going around, save for Oracle.
Re:Smells like a replay of the AT&T monopoly (Score:5, Insightful)
You've noticed the same kinds of disclaimers on the GPL, yes? If the warranty disclaimer on a Microsoft license is invalid, what makes the one on the GPL valid; and if it is not, then how would, say, the contributors to the Linux kernel fare if they were sued for a major security breach?
FLOSS developers don't point fingers (Score:2)
Wrong Comparison (Score:5, Informative)
Have you also noticed how GPL'ed products are free (as in speech, but also, often, as in beer).
Notice how an EULA does NOT usually cover things for which you have access to the source code?
The point is simple - when you BUY software, the software VENDOR should carry responsibility.
GPL'ed software is given away - no money is charged. Thus, the GPL can say "we're just doing this for fun, use at your own risk"
In contrast, paying money and accepting the license as part of the transaction makes it a contract. The contractor should be held responsible for his work.
(I know, IANAL, playing fast/loose with the term ``contract'', etc.. But the chief distinction is MONEY)
Re:Wrong Comparison (Score:3, Insightful)
Two different types of licenses entirely.
This is one reason why EULA validity is greatly contested (i.e., UCITA, etc.), whereas the GPL has gone largely unchallenged in legal authority since it was created (way before UCITA or the DMCA, etc.).
IMHO shrinkwrap/end-
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
It's the sue-based-on-damages mentality that leads to people expecting to get rich by doing something stupid. It's not a car company's $10M responsibility that you waited until the last day to cash in a lottery ticket and then, when the car failed to start, lost the chance. There are services
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
Because people are actively trying to break the software (crackers, hackers...) to either gain access or knowledge. That's why the "hood is welded shut," to use a crappy premade analogy. On a car, however, all getting under the hood requires is a crowbar (or access to the cabin). At that point, you can start ripping out wires & stuff or
Re:Graaah! (Score:3)
Re:Graaah! (Score:3, Insightful)
The GPL, at each link, prevents handing over liability to the next level. So, generally, each company who distributes a GPLed program is liable. This, nicely, also fits well if companies become the main provider of GP
Re:Smells like a replay of the AT&T monopoly (Score:5, Insightful)
From the report, I gather they want to define security and then they can make sure they meet that definition. Make the rules and play by them, at least in legal terms.
The summary talks about a task force to develop "metrics", working with government agencies to get a thumbs-up, develop industry standards, have awards for secure software (can open-source software win?), create a security license accreditation program, and make "the security of one's software a job performance factor."
Excerpts of congressional Hearing (Score:4, Funny)
Congressional Hearing
Bill Frist Testimony...
Now we will elect a new Security Head - a strong Chancellor. One who will not let this tragedy continue.
Bill Gates: Mr. President - Members of Congress, if I am elected, I promise to put an end to this CyberTerrorism..."
Later (to Steve Ballmer): I have the Senate bogged down in procedures. They will have no choice but to accept your control of the system.
Much later, in Seat... (an undisclosed location)
Steve Ballmer: I bring you good news, my Lord. The war has begun.
Darth. . er Bill Gates: Excellent. Everything is going as planned.
Re:Smells like a replay of the AT&T monopoly (Score:2)
I would think the only thing left would be freely-traded software where the original source is very carefully hidden so it is impossible to locate who to sue, and VERY expensive software from companies that buy VERY expensive insurance policies.
The free software would certainly be almost 100% of what is run anywhere and would include source code, but it would be a very str
Re:Smells like a replay of the AT&T monopoly (Score:2, Troll)
Yes, it seems that a great many people are ignorant of the true effects of government regulation. Government regulation usually works to enshrine monopoly power by increasing the barriers to entry for competition. It is often sold politically as us against the big corporations, but fundamentally government regulation is designed to give people less choice. Established and wealthy companies can better handle regulations, therefore
Re:Smells like a replay of the AT&T monopoly (Score:3, Insightful)
Of course, in a complete laissez-faire system, dirty tricks and irrational consumer choic
Re:Smells like a replay of the AT&T monopoly (Score:4, Funny)
Repeat after me
In
yeah, I know -1 troll, -1 offtopic, whatever
interestingly (Score:3, Informative)
Re:interestingly (Score:3, Interesting)
Nothing more fun than having a bank examiner talk to you about network security - when they don't know much about it.
Two scariest lines you'll ever hear. (Score:5, Funny)
Re:Two scariest lines you'll ever hear. (Score:4, Funny)
"Windows has discovered new hardware"
Re:Two scariest lines you'll ever hear. (Score:4, Funny)
You forgot one:
"Windows has discovered new hardware"
No, no, the scariest one is:
Windows has detected: "unknown device", and is installing drivers for it
Re:Two scariest lines you'll ever hear. (Score:5, Funny)
4) "You might feel a little discomfort."
Re:Two scariest lines you'll ever hear. (Score:3, Funny)
Re:Two scariest lines you'll ever hear. (Score:2, Funny)
at a Rolling Stones concert.
Business bastards.. (Score:5, Insightful)
Business gets .gov to regulate security.
Regulation and "Approved By.." nonsense costs money.
MS, et al pay.
Open Source can't pay.
Non-approved things can't be used, ergo closed source wins.
Re:Business bastards.. (Score:3, Insightful)
Re:Business bastards.. (Score:2)
Yup (Score:5, Insightful)
I'm all for the government issuing advisories, but regulation of security is not feasible. I remember reading about older military software -- the government used to try to do much more comprehensive security reviews, with tiger teams, of all kinds of software it used. Unfortunately, it turned out that this kind of review was too expensive to be feasible in the real world, and it still left holes.
If I had to give a government recommendation, it would probably be along the lines of:
* Issue advisories. There are organizations like CERT that do this. Unbiased (not from a vendor), trustworthy information is difficult to come by.
* Issue best-practices papers. These are probably most useful to IT professionals, though it might even be a good idea to produce them for software developers. Microsoft recently collaborated with the Fed to produce a set of best security practices documents for Windows, which just tried to deal with a couple of common misconfigurations. This is an easy thing to add to a company security policy ("[] must comply with USG Document #135F3 Best Practices"). It's *hard* to get this kind of stuff directly from a vendor (which frequently wants to hand out information that will encourage you to buy more, or is more interested in putting a positive spin on its mistakes), from a consultant (who frequently wants you to buy more consulting services), or from a security software (like a firewall) company, which is primarily interested in scaring companies into thinking that they need security software.
* Government certification of software intended for non-government use is a bad idea. It takes a long time, allows cronyism, and can be used to attack some sections of the market (like most Open Source). It's perfectly reasonable for USG-use purchase requirements, but it's not reasonable for broader use.
* Producing a classification system *could* be very useful, where the government writes documents describing particular classes of software, but is not responsible for ensuring that a particular version of a program fits into a class of software. For example, a hypothetical class-local/1 might require that:
a) The software bounds-checks all memory accesses to data at the compiler level (free with some languages like Java, and can be done in C if necessary).
b) The software does not access the network.
c) The software does not write to any data files.
Other useful requirements for various classes of software might be: "The software does not provide privilege escalation within the UNIX operating system's privilege system (as a suid/sgid program or a daemon running as a different user does; there would be an equivalent for the Windows security system)", or "All data that the software uses from the network is either exact-match checked or bounds-checked prior to use of any of that data, and a failure to pass checks results in that data not being used" (might be useful for simple network software, like clients of the daytime protocol).

The government is great at writing requirements and making them publicly available -- let's use that. Then, if a company guarantees in a contract that it is compliant with a particular document, there is a clear point on which it can be called for non-compliance.

Finally, there would be a market for software that can check software for some elements of compliance. Automated security checking is a major issue -- it's neat, and it's more and more feasible (see CMU's Java proof-carrying compiler [cmu.edu] for some neat stuff). The problem is that there are currently no standards written by security folks who know what they're doing, so it's hard for businesses to ask for compliance with a particular level of security, and there are no tools that can certify programs to a particular level.
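The "exact-match checked or bounds-checked prior to use" requirement above can be sketched in a few lines. This is a hypothetical illustration only -- the function name, length limit, and allowed character set are mine, not from any actual standard:

```python
from typing import Optional

def check_daytime_reply(data: bytes, max_len: int = 64) -> Optional[str]:
    # Hypothetical validator for an untrusted network reply (e.g. from a
    # daytime-protocol server): bounds-check the length and check the
    # character set before any of the data is used. On any failure the
    # data is discarded (return None) rather than passed along.
    if not data or len(data) > max_len:       # bounds check on size
        return None
    try:
        text = data.decode("ascii")           # reject non-ASCII bytes
    except UnicodeDecodeError:
        return None
    # Exact-match the allowed set: printable ASCII plus line endings.
    if not all(c.isprintable() or c in "\r\n" for c in text):
        return None
    return text.strip()
```

A class-compliance checker could then verify mechanically that every network read in a program flows through a validator like this before the data is used.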
There are probably a lot more suggestions that the government could use, but this is a start...
This is an ITAA group (Score:5, Informative)
ITAA is the lobbying arm of high tech corporations.
For insight on how ITAA sets up these "blue ribbon panels", read this article [thoughtcrimes.org] about a meeting of electronic voting manufacturers. They brought in Harris Miller, ITAA's president, to see how he could help them.
Highlights from the article:
"Similarly, when we get press calls and the press says 'Joe Academic says your industry's full of crap and doesn't know what it is doing.' What do you say Harris? The reporters always want to know what are the companies saying?.. And there can be two scenarios there: The companies may want to hide behind me, they don't want to say anything... frequently that happens in a trade association, you don't want to talk about the issues as individual companies.
How is any of that related to the topic at hand? These panels we see approaching the government are coalitions formed by a lobbying firm that is paid to protect the interests of its clients. The panels are made to look as if they are composed of unbiased experts who are only looking out for the good of all Americans. The truth is they want to control the conversation, so that it seems as if they are the only ones with relevant information on the subject at hand.
Harris Miller and the ITAA have been doing this for many years, and their MO is always the same. The National Cyber Security Partnership is nothing more than an extension of ITAA's lobbying efforts.
displacedtechies.com [http]
Re:Business bastards.. (Score:3, Insightful)
For example: it costs $90 to register a corporation (in my state) and $15 annually to maintain that registration, no matter whether you have earnings over $1B or just over $100. There is no favoritism, and concern is shown for the smaller, low-income company.
Even in the article the author of the report cites concern for open source: "We need to better unde
is it just me.... (Score:3, Interesting)
I can see it now (Score:5, Insightful)
Re:I can see it now (Score:2)
Re:I can see it now (Score:4, Insightful)
Re:I can see it now (Score:2)
I didn't say I was afraid, I just said they will make that point. Besides, FUD generated by SCO did not help the open source movement.
"Security through obscurity may not be reliable but it is at least a useful barrier to sort out opportunists and the unskilled but predatory."
Then why are Outlook based viruses so popular?
Re:I can see it now (Score:2)
Re:I can see it now (Score:5, Insightful)
Are you insane, stupid, or just a troll?
TCP/IP is not itself intrinsically insecure. TCP/IP has proven to be reliable, flexible, and *very* secure, if used appropriately. (That is, if security is an issue and man-in-the-middle attacks are a concern, use appropriate cryptographic techniques to secure and authenticate your communication.)
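As a sketch of what "appropriate cryptographic techniques" can mean in practice: wrapping an ordinary TCP socket in TLS both encrypts the channel and authenticates the peer. This is a hypothetical Python illustration (the helper names are mine) using the standard ssl module:

```python
import socket
import ssl

def secure_client_context() -> ssl.SSLContext:
    # Build a TLS client context that encrypts the channel and verifies
    # the server's certificate -- the verification step is what defeats a
    # man-in-the-middle attack on an otherwise plain TCP connection.
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.check_hostname = True            # certificate must match the host name
    ctx.verify_mode = ssl.CERT_REQUIRED  # refuse unauthenticated peers
    return ctx

def connect_securely(host: str, port: int = 443) -> ssl.SSLSocket:
    # TCP/IP itself is unchanged; the cryptography is layered on top of it.
    raw = socket.create_connection((host, port))
    return secure_client_context().wrap_socket(raw, server_hostname=host)
```

The point is that the security comes from the layer above the transport, not from TCP/IP itself.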
The MS-Outlook exploits are based on stupid decisions in the design process. Until Microsoft built a mail client, it was a truism that email was not a carrier of viruses. The arbitrary execution of untrusted code is the root cause of MS-Outlook exploits, *not* some imaginary issue with TCP/IP. In fact, it doesn't matter whether the email is delivered via IPX, NetBEUI, or TCP/IP. MS-Outlook is insecure.
On the web, IIS has proven to be significantly less secure than Apache; and since Apache accounts for over 65% of all web server installs, and the source code is available, it seems a more likely target for virus writers.
As far as the "print the lock diagram on the door" concept goes: I don't care. The concepts and principles of lock building are available to any thief. If your lock is so poorly designed that a diagram printed on the door will allow compromise, then an able thief will be able to get past it without the diagram. Anyone who doesn't know much about locks won't be able to make use of the information anyway. At most, it will provide a starting point for education.
Yes, you can only build a solid house on a solid foundation; but nothing stops you from building a poor house on a solid foundation, either. In fact, I guarantee that if you are ignorant of construction principles and are unschooled in the use of the appropriate tools, you *will* build a poor house, no matter the quality of the foundation. And if the architect designed an unsafe house, you will build an unsafe house no matter how handy you are with the tools.
Re:I can see it now (Score:3, Insightful)
Bad example for a flawed argument. Knowing the mechanisms in a lock is not what makes it difficult to exploit (most are not), as the designs of the vast majority of locking devices are readily available, or are easily determinable through trial and error (yes, IAALS). Some locks, such as those manufactured by Medeco, are extremely difficult to pick even for seasoned professionals with ext
Maybe... (Score:4, Interesting)
um... its April 2nd guys... (Score:5, Interesting)
Wouldn't it just be easier to pass laws making software vendors responsible for the bugs that they produce, instead of spending our tax money to provide a shelter for insecure code?
I can see the next big M$ lawsuit...
Plaintiff: Their buggy code cost us millions.
M$: But we follow the homeland security software development model.
Judge: So the software must be good. Perhaps the plaintiff was trying to do something illegal?
Plaintiff: Shit... *sigh*
Re:um... its April 2nd guys... (Score:3, Insightful)
If a critical flaw is discovered later in the car's life cycle, the company issues a recall, notifies car owners and fixes the bug at their expense. (I'm curious, does anybody know how old a car has to be before the manufacturer is absolved of h
Re:um... its April 2nd guys... (Score:2)
Don't forget to check your blinker fluid.
Re:um... its April 2nd guys... (Score:2)
I wouldn't say that that's unlike cars at all.
software vendors shouldn't be liable (Score:3, Interesting)
Security is an engineering tradeoff, just like speed and usability. I don't want every software vendor to have to conform to the highest level of security out of fear of getting sued.
The people who should worry about this sort of thing are the buyers of software. If your car mechanic can't fix your car in time because his PC g
Re:um... its April 2nd guys... (Score:3, Interesting)
That's half an acceptable idea, and half a horrible one.
Not spending federal funds to protect insecure code: good.
Spending federal funds to punish insecure code: bad.
(Notice the pattern here? "Spending federal funds" should be considered a bad thing in general, unless specifically shown otherwise. Smaller government should be
From the summary (Score:5, Insightful)
Adopting a "top-ten" list detailing industry best practices. Patches should be well-tested, small, localized, reversible, and easy to install. Patches should also not require reboots, use consistent registration methods, include no new features, provide a consistent user experience, and support diverse deployment methods.
I thought Microsoft was involved in the partnership. How is that going to work??
This is not a troll. MS patches generally violate some or all of the goals stated above.
Re:From the summary (Score:2)
Presumably, they'd weasel out of it by calling their patches "enhancements", or including new features
Re:From the summary (Score:2, Funny)
Maybe Microsoft intends to improve the quality of its patches?
The company is out to make money; if they can't sell software without following these patch guidelines, then they will follow them.
Patches would not require reboots ?? (Score:2, Funny)
Anyone smell pork? (Score:3, Insightful)
Gov't is expected, in turn, to mandate these measures. Mandating them, of course, requires that gov't money be spent 'fixing' the systems that were flawed.
Hmm. I smell pork.
What's the fuss? (Score:5, Interesting)
Sure, Microsoft and the BSA aren't the bosom buddies of most Slashdot readers. And for good reason. However, a quick look through the 3-page summary document [cyberpartnership.org] revealed what seemed to be a reasonable plan of action, rather than a scheme for total world domination.
Of course, if it turns out that the outcome of the regulation process is Microsoft-controlled security protocols and procedures, then there's something to beef about. However, at this early stage I see nothing more than an attempt to codify a national stance on computer security. Accordingly, I'm going to leave my tinfoil hat in its box for the moment.
Re:What's the fuss? (Score:3, Insightful)
1) Have some committee make up some security standards.
2) Award gold stars to groups that take some security classes, or who create a "security culture" in their companies.
In other words, this is completely useless, and gives the impression that progress is being made. An analogy would be the Academy Awards, where the group of insiders gives out awards to other people who are in the group of insiders, yet thousands
Re:What's the fuss? (Score:3, Interesting)
Re:What's the fuss? (Score:2)
You must have missed this line:
Ensure that Software Assurance and other Information Technology Centers of Excellence include an information protection component.
Isn't Microsoft working on information protection components? How coincidental.
====---====
Re:What's the fuss? (Score:2)
Regarding the signature: just so you know, racism against Indians does not consist of pointing out the bad deeds of the Indian Government, those of various Indian companies, or those of the US Government.
barriers to entry, and it won't work (Score:5, Insightful)
These companies are basically trying to erect additional barriers to entry into the software market: costly certification and training requirements, costly documentation requirements, etc. They know that they can satisfy them, but a small software vendor or an OSS project can't.
And they make those recommendations knowing full well that they won't work. If they knew how to make more secure software, they'd already be doing it. A bit of training and certification just is not sufficient for making software more secure.
what seemed to be a reasonable plan of action [...] However, at this early stage I see nothing more than an attempt to codify a national stance on computer security.
What's there to "codify"? What's reasonable about it? There is not a shred of evidence that the "strategy" described in the report will do anything to improve security.
At this point, we have to conclude that people continue to buy insecure software either (1) because they don't have a choice because of Microsoft's monopoly, or (2) because they don't care about security. If (1) applies, then the solution is to break up Microsoft's monopoly and give people a choice in software; then they can pick the level of security they like. If (2) applies, then what business does the government have to force a level of security into products that buyers don't want?
Not a surprise (Score:5, Insightful)
So, how much software do you wanna buy? (Score:5, Insightful)
Help, Help, we might get sued! (Score:5, Interesting)
"[It] is possible that national security or critical infrastructure protection may require a greater level of security than the market will provide," it said. "Any such gap should be filled by appropriate and tailored government action that interferes with market innovation on security as little as possible."
In other words, "The legal climate is such that we are very likely to start getting sued for writing sloppy, insecure code. Rather than properly staffing to test our code, we'd rather have the taxpayers pay for this. This a) saves us money and b) puts the responsibility on someone other than us if there is a security problem."
Business calls for U.S. help in Net security (Score:3, Insightful)
Now we see a shift of responsibility to the programmers. Let's just put as many layers as possible between the corporate entity and responsibility, why don't we?
"The report said industry groups should work with the Homeland Security Department to look at ways to reduce liability, as well as examining whether new rules are needed."
And now we see a way to tie together the mass collection of data that the government is asking for and private industry.
This is one small step further towards the corporate entity as government.
Re:Business calls for U.S. help in Net security (Score:5, Interesting)
Ok, if they want to make me "accountable" for the code I write, then they better transfer ownership, legal rights, and any profits derived from that code back to me. If they say "it's our code" and "you get no extra cash for writing it" then they can damn well take responsibility for what the code does.
The next, larger step (Score:2)
Here's the next step [nytimes.com]. So very cyberpunk, isn't it?
Re:programmers.... (Score:2)
And here's why: Apple will be at OS XI, Linux will be running on orbiting brain lasers, and Microsoft will be pushing the release date for Longhorn back to 2050. All the luddites who have a computer for pr0n and email won't care ("Ma com-pew-tor is werkin' jus fahn, yankee.") and the 1337 H@XX0R$ all use Linux anyway.
what a difference a little punctuation makes... (Score:2)
Easy solution (Score:2)
That's it. Problem solved.
Fascinated by the irony (Score:3, Insightful)
How many OS advocates were there? (Score:5, Insightful)
It would seem that computer security would be important for the whole computing community, not just Microsoft, CA, and HP.
Software liability (Score:2)
Make software vendors liable, for, say, the square of the purchase price.
Headline Correction. (Score:2)
> Representatives from Microsoft, Computer Associates, and the BSA
New Headline: Lobbyists for companies that stand to make a lot of money if Open Source / Free Software is made illegal petition power-hungry politicians to protect their business model with taxpayer dollars.
Great (Score:2)
From the report
"Task force co-chairman Ron Moritz said the report calls for a limited government role, such as helping to develop certification standards for software that runs in sensitive systems. "
WOW MS Supporting this.... (Score:3, Insightful)
Re:WOW MS Supporting this.... (Score:2)
> sucked, remember the Magic 8 Ball said, "Outlook
> not so good"
Actually, the Magic 8 Ball said "Outlook Good." It always was full of crap, though. It told me the hottest girl in third grade had the hots for me, too.
Did they just read my post? (Score:2)
Looks like they'll stress that electric/water networks need *extra* security, and then sneak in computer networks, while everybody agrees on the issue.
Pretty weird if you ask me, but this is a comment [slashdot.org] I posted a few days back:
Overdependence on communications (Score:5, Insightful)
by GillBates0 (66
If you read my comment more carefully (Score:2)
I merely pointed out that we have become overdependent on distant resources due to widespread networking. That's it.
Please RTFC before replying to it. Thank you.
Hidden agenda (Score:2, Insightful)
Who's going to sit on the regulatory board? Why, the industry insiders, of course. And they're going to work in the best interests of the established players, which means keeping out the new guys by establishing, among other things, licensing and certificat
Look what they snuck in.... (Score:5, Informative)
Ensure that Software Assurance and other Information Technology Centers of Excellence include an information protection component (Emphasis mine).
Is it any surprise that Microsoft's security recommendations would include Palladium?
Translation for the gov't-speek impaired (Score:2)
That is all.
EDUCATE IT and CS students on SECURITY!! (Score:2, Insightful)
Too often I hear that schools are not teaching security. Almost no high school teachers who teach programming even consider security (if they even understand the issues). In college, many schools offer an optional security class. What is up with that? At my school, the assembly language course doesn't even deal with security. New initiatives need to be taken to bring security out of the closet.
When will people learn? (Score:2, Offtopic)
The government loves getting more and more power. More laws mean they get to grow bigger and spend more of our tax money.
Once in place, you get a really big, dumb organization that can't fire anyone and will use its power to try to grow even bigger.
The only people whose opinions matter then are lobbyists with lots of cash and the people who make money from things staying the way they are.
If the government starts regulating security, they will be even slower to respon
Huh? (Score:5, Insightful)
Although, we all know from the DeCSS case that code "isn't free speech" when it's convenient. So the end result of this would be that the government can tell you what you can and can't code.
I was fine with everything in the summary until I got to the "certification" part, but who knows, maybe my tinfoil hat is on too tight.
National Strategy to Secure Cyberspace. (Score:2)
Seems pretty simple to me... (Score:5, Interesting)
Let's see if I got this right...
1. Distribute a development platform called .NET that allegedly does away with insecure coding practices.
2. Influence laws and regs such that any software not coded on a "secure platform" such as yours is illegal.
3. Let the feds regulate your competition out of existence.
4. Profit!
If this comes about, the only way F/OSS software will survive in the US is if both a Linux distribution and a Linux development platform can be constructed that will meet the same requirements that the conglomerate is pushing for. Of course, we're screwed with a capital F if the regs call for technology that Microsoft (or one of the other member companies) has patented.
So I guess now it's "If you can't innovate, litigate... unless of course you have political influence, in which case, regulate!"
Somehow I miss the joke? (Score:3, Insightful)
Can someone who actually read at least the summary please tell me what's so scary. And leave the tinfoil hats off - it gets very tiring.
Puff Piece (Score:3, Interesting)
The report that is...
So they propose that:
Sounds like all these software houses -- who have been touting the superiority of the proprietary development model and decrying the open source development methodology for some years now -- cannot seem to figure out how to adapt their "superior" process to produce secure software. Oh, and let's get academia involved to educate future software developers in the proper way to create secure software. Which means, I take it, that the proprietary software houses have been unable to get their current developers to produce secure software. Following this plan will result in the first crop of (supposedly) secure software developers getting their first jobs in, oh, about 2015.
So... I see this report and the suggestions contained in it as an indication that Microsoft (and others, but predominantly MS) has utterly failed in the attempt to introduce security into their product lines. Even after all of Bill Gates's pep talks and internal memos. Now they think that creating a bunch of undergraduate courses in secure programming, certifications, and awards to software companies will somehow result in a new breed of software that won't be susceptible to worms and viruses. To me that says: ``We, the proprietary software industry, have finally come to realize that writing secure software is quite beyond our capabilities, and we make these suggestions so that other people can figure this out for us so that we merely have to hire new people who are already trained to do this. And, of course, these programs should be paid for by the Government.'' No. Strike that. They'd be paid for by you and me. Twice. First in the taxes that would go to create these educational programs and the certification organizations. Then, again, when the price of the software goes up because, well, now it's secure software and that's worth paying extra for, isn't it?
Funny that open source software -- and, to be fair, some proprietary software -- isn't anywhere near as vulnerable to the sorts of attacks that Microsoft's is. Because, it seems, those Neanderthal open source programmers didn't have the insight to include features that automatically run code by clicking on mail attachments, include scripting languages inside applications that have the ability to destroy user data or launch unrelated programs that damage the local and/or remote systems, or, ... (the list goes on).
Wonder where all those open source programmers managed to learn about writing secure software (yes, yes, yes... I am aware even OSS can occasionally have bugs that affect security) without a college program, certifications, and industry awards? And how do they do it without a government subsidy? Oh, yeah. I forgot. They're able to do it because they don't have some pinhead from Marketing ranting and raving that seven new features need to be in the product in time for the next trade show and there is no time to waste with any discussions about how these features destroy the integrity of the software. Companies like Microsoft won't create more secure software once these programs are in place. Even if they are able to grab every straight-A, magna cum laude graduate of these programs in the country. Why? Because these poor folks are still going to have to answer to some pinhead from Marketing ranting and raving that all these new features need to be in the product in time for the next trade show.
I sure as hell hope that some articulate luminaries in the open source development community have the opportunity to submit a report to the folks that are going to be reviewing this piece of tripe. The opposing viewpoint and an alternate plan need to be heard.
(Heh. If reading the summary got me this ticked off, imagine if I'd read the entire report!)
Don't worry! (Score:4, Insightful)
We have a Republican president and they control half of Congress.
Since this proposal would extend the reach and powers of the Gov't, it will never pass. Republicans are for a smaller government, remember?
Wait. Why are you laughing?
DCID 6/3 - Security Standardization (Score:3, Informative)
Look at it as a certification process. Each project tasked with protecting data on a computer (networked or not) has a security posture and a security officer responsible for ensuring that the declared posture is enforced.
This is what a bunch of people at /. fear: they expect the government to try to make it all completely secure and fail. What they fail to see is that the government would only quantify and validate the level at which an information system is protected. It's not a black-and-white world; rather, the level of protection is weighed against the threat of compromise.
A bunch of you also think this only has to do with preventing a network-based attack. And while that is in play, don't forget corporate espionage. That foreign temp worker your boss hired could be walking out with all the spreadsheets the accounting department values. This problem, by the way, is addressed in trusted operating systems such as those discussed in this article [techtarget.com] asking about Trusted Linux vs. Trusted Irix or Trusted Solaris.
DCID 6/3 works both sides of that problem and quantifies for management what kind of protection their dollars have bought them.
Regulation costs money; BSA has money, so it's easy (Score:3)
Does 'Cyberterrorism' even exist? (Score:4, Interesting)
Even the fairly cohesive stuff like the long-running India vs Pakistan web site defacement battle is just a really annoying flame war.
Re:Does 'Cyberterrorism' even exist? (Score:3, Interesting)
Tell you what.... (Score:3, Interesting)
(see you sometime in 2036)
Is there no end to this man's greed? (Score:5, Interesting)
1) Give M$ a shield from responsibility for the massive insecurity of their software by making a 'security organization' the accountable party. "Software companies" (i.e., mainly M$) would fund the organization. The security organization would lay down rules about how bugs and holes are discovered (not a certified programmer? -- then you can't look for/report bugs. See the story of the French scientist who is being sued for pointing out vulnerabilities.), how they are reported (no public reports at all until the patch, if ever, is released, then no announcement as to how long the bug/hole has been open), and how they are released -- through 'special' sites, for a fee, of course, so that the consumer pays even more for M$ bugs.
2) Require programmers to get "security certifications" from "accredited" schools. These are schools which have received funds (guess from whom) to finance/"reward" faculty members who establish such programs. Guess which OS will have certification programs, and which won't be allowed on campus. (Just ask yourself which platforms aren't allowed equal billing with Windows on Dell computers.) Programs written by "uncertified" programmers will not be allowed distribution through 'certified' channels. Uncertified channels will be made illegal.
3) No answers as to which programmers get 'grandfathered' in, but the entire MS programming staff would be a good guess.
4) Independent Software Vendors (ISVs -- i.e., Open Source folks) will have to meet requirements which are, in effect, designed to keep them from developing software drivers for new hardware, effectively locking them out of future markets.
Microsoft, the BSA (enforcement arm of MS licensing), and other companies with less than desirable security records would then use the courts to completely muzzle news of the vulnerabilities in their software. With that accomplished they can essentially shut down their repair operations and move the whole program into the public law enforcement arena, using local and national law enforcement agencies as their "security repair" division. Just remember that French scientist who was sued as a 'terrorist' for revealing security holes in software which the vendor claimed in their ads was "100% secure". This will be in no way different than what coal mine owners did in their efforts to keep slave labor trapped in their mines, but this time it will be consumers trapped into using buggy, insecure software with no alternatives. The end result is that the software will get worse because the incentive to repair is removed and will become more expensive because there will be no Open Source competition.
The current crop of "Security Organizations", most of whom have already knuckled under to Microsoft, will not be needed in the "New Order", but I'll wager most of them haven't figured that out yet and are probably jumping on the bandwagon because they have, like so many companies Microsoft has deflowered and plundered, visions of increased revenues as Microsoft 'partners' in this new scam.
The 'security problem' doesn't need a 123 page report to identify the security problem and create solutions for it. The problem is Windows. The solution is for Bill Gates to spend some of his $50 Billion to fix the code, not buy off congressmen and judges and make their problem a law enforcement issue at the public's expense. Is there no end to this man's greed?
Re:cyber security? we already have that! (Score:3, Interesting)
As a bank, we were well on our way to getting everything ready to go, and then we had our exam and were "asked" to document everything.
Long story short - the regulators tripled the amount of work to do without effectively adding any additional safety to the banking system.
Re:Nothing scary (Score:2)
How about having an uncertified accountant manage your tax returns?
A license (certification) program, if properly implemented, could at the very least remove the egregiously incompetent from the software industry.
Re:Homeland Security??!! (Score:2)
Failure! It never intended to succeed! It is a great success at that.
Homeland Security will give us the most excellent catalogue of the latest Al Qaeda damages and how every step was accomplished. This Fraternal Order of Odd Toe Taggers (FOOTT) called the Homeland Security needs to step somewhere else. I have watched their regional drills. Money wasted by the ton as they figure out how to haul away the dead.
Nobody discussed how, if poison gas was used, to throw baking soda out to catch it, or if it was a