
Tech Companies Ask U.S. to Regulate Cyber Security 371

qtp writes "Wired reports that a group called the National Cyber Security Partnership, which consists of 'leading software companies' including Microsoft and Computer Associates and industry organisations such as the BSA, has asked the Department of Homeland Security to regulate what they call 'Cyber Security'. Representatives from Microsoft, Computer Associates, and the BSA headed the Security Across the Software Development Cycle Task Force that submitted this report to the Bush administration today. (For all of you who dread reading 123-page reports, there is a three-page summary available as well.) The Washington Post, Forbes, and other sources are covering this story as well. I hope this is just another [late] April Fools' Day joke, but I'm afraid that this looks too scary to be real."
This discussion has been archived. No new comments can be posted.


  • by A nonymous Coward ( 7548 ) * on Friday April 02, 2004 @01:33PM (#8747863)
    Back in the early 1900s, there used to be a ton of independent phone companies. In spite of using different voltages, ringing systems, etc., they interoperated pretty darned well. But AT&T wanted to be big and was buying them up, and those who wouldn't sell were effectively isolated, the main excuses being interoperability problems. The stink began getting stronger, and eventually AT&T got the government to regulate it as a utility, so it could remain intact and simply be THE phone company. Only the ignorant think regulation was imposed on AT&T; it was their idea.

    This smells to me of the same process. Being sued for security holes would be much more effective at increasing security than some hare-brained government regulation scheme. After having thought up all those EULAs which disclaim all responsibility, and blustered about Linux having no-one responsible, this is just another big corporate scheme to maintain their power and squash the small guys, and place the blame elsewhere.

    The proper way to improve security is invalidate all those EULA disclaimers. A few big lawsuits with billions in damage verdicts would do far more to focus Microsoft's attention than any government regulatory body.
    • You make a good point about affecting large corporations with lawsuits, but who gets sued when my Linux server gets hacked? I would venture to guess that the average open source contributor can't afford "big lawsuits with billions in damage verdicts". OSS may be (by design) more secure than closed source software, but if you think OSS is perfectly secure, then I suggest you go do your homework some more.
      • by A nonymous Coward ( 7548 ) * on Friday April 02, 2004 @01:51PM (#8748068)
        I said nothing about open source being more secure. I think it is more securable, and I think it is better all around, but what annoys me is Microsoft whining that there is no one to sue with open source, when their EULAs have all manner of disclaimers. Microsoft should be sued for fraud. They claim to be more secure, brag about how they are secure, etc. etc. etc., and yet not only do the security holes continue to roll in, Microsoft blames everybody else for the problems.

        Whereas open source fixes the problems without blaming others.
      • by MrAngryForNoReason ( 711935 ) on Friday April 02, 2004 @02:28PM (#8748522)

        You make a good point about affecting large corporations with lawsuits, but who gets sued when my Linux server gets hacked?

        In order to claim damages in such a lawsuit you would have to prove that the company in question knew about a vulnerability and didn't fix it, thereby showing negligence on the part of the company.

        To apply this to OSS you first need to distinguish between free and Free software. If the Linux distro you were using was open source but commercial, meaning you paid money for it, making it Free (as in speech) but not free (as in beer), then the same rules would apply. They would be responsible for damages if they knew about a vulnerability but didn't patch it.

        If the software was free (as in beer) then the developers shouldn't be held responsible for any flaws in the software. There is no contract between you and them, they have not promised you anything by allowing you to use their software free of charge.

        By making this distinction you make commercial OSS software developers equally liable for negligence without opening up small OSS projects to litigation they have no chance of surviving.

        This is of course all hypothetical, as at the moment no software companies accept any responsibility for flaws in their software. And of course IANAL.

      • > who gets sued when my linux server gets hacked?

        Who gets sued when my Windows server gets hacked? Microsoft? Its EULA disclaims all responsibility, so you can't sue them either. I find it strange that Microsoft's selling strategy is "you can sue us!" Especially since you can't, usually.
    • The proper way to improve security is invalidate all those EULA disclaimers. A few big lawsuits with billions in damage verdicts would do far more to focus Microsoft's attention than any government regulatory body.

      Yeah, that will make a lot of companies/independent coders want to release code. Imagine not releasing code until you are positive there are no exploits or holes in it. I don't see too many claims of *cough* unbreakable software going around, save for Oracle.

    • by Kirill Lokshin ( 727524 ) * on Friday April 02, 2004 @01:48PM (#8748030)
      The proper way to improve security is invalidate all those EULA disclaimers.

      You've noticed the same kinds of disclaimers on the GPL, yes? If the warranty disclaimer on a Microsoft license is invalid, what makes the one on the GPL valid; and if it is not, then how would, say, the contributors to the Linux kernel fare if they were sued for a major security breach?
      • Microsoft either denies security problems or blames everybody else (device drivers, end users, 3rd party software). FLOSS developers fix the problems and don't point fingers.
      • Wrong Comparison (Score:5, Informative)

        by Anonymous Coward on Friday April 02, 2004 @02:14PM (#8748346)
        You've noticed how an EULA is typically attached to things you pay MONEY for? (And that you can get sued for using them if you have not.)

        Have you also noticed how GPL'ed products are free (as in speech, but also, often, as in beer).

        Notice how an EULA does NOT usually cover things for which you have access to the source code?

        The point is simple - when you BUY software, the software VENDOR should carry responsibility.

        GPL'ed software is given away - no money is charged. Thus, the GPL can say "we're just doing this for fun, use at your own risk"

        In contrast, paying money and accepting the license as part of the transaction makes it a contract. The contractor should be held responsible for his work.

        (I know, IANAL, playing fast/loose with the term ``contract'', etc.. But the chief distinction is MONEY)
      • When you get the source code for FREE, the author can disclaim responsibility because anybody in the world can audit/modify it etc. When you pay money for software from a proprietary vendor, you can't take that responsibility on yourself. If it's closed, it's not unreasonable to expect said vendor to shoulder the burden. That's the value (or at least should be) of proprietary software over open source. On one hand, you can get complete transparency and control in exchange for as much manpower as you are com
    • by globalar ( 669767 ) on Friday April 02, 2004 @01:53PM (#8748099) Homepage
      If MS, CA, and friends have perfect, 100% secure software, then I think they should stop hiding it and just sell it outright without the government's blessing. Since they do not, this buddy system might be an alternative to open source software. It could be good, but it could be abused. Considering only big players are involved right now (?), the latter seems more likely.

      From the report, I gather they want to define security and then they can make sure they meet that definition. Make the rules and play by them, at least in legal terms.

      The summary talks about a taskforce to develop "metrics", work with government agencies and get a thumbs-up, develop industry standards, have awards for secure software (can open-source software win?), create a security license accreditation program, and make "the security of one's software a job performance factor."
    • by Prince Vegeta SSJ4 ( 718736 ) on Friday April 02, 2004 @01:54PM (#8748108)

      Congressional Hearing

      Bill Frist Testimony...

      Now we will elect a new Security Head - a strong Chancellor. One who will not let this tragedy continue.

      Bill Gates: Mr. President - Members of Congress, if I am elected, I promise to put an end to this CyberTerrorism..."

      Later (to Steve Ballmer): I have the Senate bogged down in procedures. They will have no choice but to accept your control of the system.

      Much later, in Seat..(an undisclosed location)

      Steve Ballmer: I bring you good news, my Lord. The war has begun.

      Darth. . er Bill Gates: Excellent. Everything is going as planned.

    • There is no way even Microsoft could survive the lawsuits if you could not sell (or give away) software with a disclaimer of liability.

      I would think the only thing left would be freely-traded software where the original source is very carefully hidden so it is impossible to locate who to sue, and VERY expensive software from companies that buy VERY expensive insurance policies.

      The free software would certainly be almost 100% of what is run anywhere and would include source code, but it would be a very str
    • "Only the ignorant think regulation was imposed on AT it was their idea."

      Yes, it seems that a great many people are ignorant of the true effects of government regulation. Government regulation usually works to enshrine monopoly power by increasing the barriers to entry to competition. It is often sold politically as us against the big corporations, but fundamentally government regulation is designed to give people less choice. Established and wealthy companies can better handle regulations therefore
      • Also note that PARTIAL regulation biases towards the big businesses as well, by providing more subtle barriers to entry, and DEregulation after a sufficient period of regulation biases towards the big businesses as well, by opening up new niches to said big businesses immediately after the regulative die-off. In general, once regulation of any kind is imposed, the people are going to be screwed for a long time to come.

        Of course, in a complete laissez-faire system, dirty tricks and irrational consumer choic
  • interestingly (Score:3, Informative)

    by Anonymous Coward on Friday April 02, 2004 @01:33PM (#8747867)
    they propose that gov't should regulate security in specific industries, like banking or telecom, and not a blanket "one-size-fits-none"
    • Re:interestingly (Score:3, Interesting)

      by TykeClone ( 668449 )
      They do regulate security in banking. It has become a "safety and soundness" issue in the last couple of years.

      Nothing more fun than having a bank examiner talk to you about network security - when they don't know much about it.
  • by GomezAdams ( 679726 ) on Friday April 02, 2004 @01:34PM (#8747871)
    1) "This is your wife's divorce attorney". 2) "Hi. I'm from the government and I'm here to help".
  • by Anonymous Coward on Friday April 02, 2004 @01:34PM (#8747879)

    Business gets .gov to regulate security.

    Regulation and "Approved By.." nonsense costs money.

    MS, et al pay.

    Open Source can't pay.

    Non-approved things can't be used, ergo closed source wins.

    • Why couldn't non-approved things be used? If an ISP limited traffic to "approved" sites, you'd quickly find that the vast majority of the internet would be inaccessible (how many home pages, overseas sites, or "legacy" sites would go for it?). I doubt this would be very profitable (I suppose it could be sold as "safe" for children to a few parents). Additionally, they'd have to advertise with massive disclaimers (like drugs do now): "Does not provide access to the internet; only allows access to websites which have rec
      • Non-approved things can't be used, ergo closed source wins.
      Maybe, maybe not. Unapproved drivers/video codecs/etc. in Windows XP still get installed -- a lot. Most people just ignore the warning and continue, and it seems quite a few companies haven't bothered to get their drivers certified by MS. Don't forget the millions of idiots who keep clicking on attachments in E-mails from people they don't know either. I doubt those people would even NOTICE the approved by thing.
    • Yup (Score:5, Insightful)

      by 0x0d0a ( 568518 ) on Friday April 02, 2004 @02:16PM (#8748371) Journal
      Yup, that was pretty much my take on things (Rule 1: industry *never* asks for regulation without an ulterior motive), although I think that there's a bit more to it -- if any cronyism can be used by existing players, it might be a useful tool against challengers, forgetting about Open Source for a moment.

      I'm all for the government issuing advisories, but regulation of security is not feasible. I remember reading about older military software -- the government used to try to do much more comprehensive security reviews of all kinds of software it used, with tiger teams. Unfortunately, it turned out this kind of thing was too expensive to be feasible in the real world, and it still left holes.

      If I had to give a government recommendation, it would probably be along the lines of:

      * Issue advisories. There are organizations like CERT that do this. Unbiased (not from a vendor), trustworthy information is difficult to come by.

      * Issue best-practices papers. These are probably most useful to IT professionals, though it might even be a good idea to produce them for software developers. Microsoft recently collaborated with the Fed to produce a set of best security practices documents for Windows; it just tried to deal with a couple of common misconfigurations. This is an easy thing to add to a company security policy ("[] must comply with USG Document #135F3 Best Practices"). It's *hard* to get this kind of stuff directly from a vendor (which frequently wants to hand out information that will encourage you to buy more, or is more interested in putting a positive spin on its mistakes), or a consultant (who frequently wants you to buy more consulting services), or a security software (like a firewall) company, which is primarily interested in scaring companies into thinking that they need security software.

      * Government certification of software intended for non-government use is a bad idea. It takes a long time, allows cronyism, and can be used to attack some sections of the market (like most Open Source). It's perfectly reasonable for USG-use purchase requirements, but it's not reasonable for broader use.

      * Producing a classification system *could* be very useful, where the government writes documents describing particular classes of software, but is not responsible for ensuring that a particular version of a program fits into a class of software. For example, a hypothetical class-local/1 might require that:

      a) The software bounds-checks all memory accesses to data at the compiler level (free with some languages like Java, and can be done in C if necessary).

      b) The software does not access the network.

      c) The software does not write to any data files.

      Other useful requirements for various classes of software might be: "The software does not provide privilege escalation within the UNIX operating system's privilege system (as a suid/sgid program or a daemon running as a different user does...there would be an equivalent for the Windows security system)", or "All data that the software uses from the network is either exact-match checked or bounds-checked prior to use of any of that data, and a failure to pass checks results in that data not being used" (might be useful for simple network software, like clients of the daytime protocol). The government is great at writing requirements and making them publicly available--let's use that. Then, if a company guarantees in a contract that it is compliant with a particular document, there is a clear point on which it can be called for non-compliance. Finally, there would be a market for software that can check software for some elements of compliance. Automated security checking is a major issue -- it's neat, and it's more and more feasible (see CMU's Java proof-carrying compiler [cmu.edu] for some neat stuff). The problem is that there are currently no standards written by security folks who know what they're doing, so it's hard for businesses to ask for compliance to a particular level of security, and there are no tools that can certify programs to a particular level.

      There are probably a lot more suggestions that the government could use, but this is a start...
    • by gminks ( 734161 ) <gminks.ginaminks@com> on Friday April 02, 2004 @02:23PM (#8748465) Homepage Journal
      National Cyber Security Partnership was set up by ITAA [itaa.org]

      ITAA is the lobbying arm of high tech corporations.

      For insight on how ITAA sets up these "blue ribbon panels", read this article [thoughtcrimes.org] about a meeting of electronic voting manufacturers. They brought in Harris Miller, ITAA's president, to see how he could help them.

      Highlights from the article:

      • ITAA felt the industry should help create its own credibility by setting high standards.
      • ITAA suggested "re-engineering" the certification process to make the industry the "gold standard" so they can eliminate "side attacks you are subject to now from people who are not credible as well as people who are somewhat credible."
      • Harris Miller offers the following comments on how ITAA company partnerships would handle the public debate about electronic voting:
        "Similarly, when we get press calls and the press says 'Joe Academic says your industry's full of crap and doesn't know what it is doing.' What do you say Harris? The reporters always want to know what are the companies saying?.. And there can be two scenarios there: The companies may want to hide behind me, they don't want to say anything... frequently that happens in a trade association, you don't want to talk about the issues as individual companies. ...I take all the heat for them."

      How is any of that related to the topic at hand? These panels we see approaching the government are coalitions formed by a lobbying firm that is paid to protect the interests of its clients. The panels are made to look as if they are unbiased experts that are only looking out for the good of all Americans. The truth is they want to control the conversation so it seems as if they are the only ones with relevant information on the subject at hand.

      Harris Miller and the ITAA have been doing this for many years, and their MO is always the same. The National Cyber Security Partnership is nothing more than an extension of ITAA's lobbying efforts.

      displacedtechies.com [http]

    • I think you are unduly worried. In all matters, the government tries to be fair to all parties involved, as well as show concern to its constituents.

      For example: It costs $90 to register a corporation (in my state) and $15 annually to maintain that registration, no matter if you have earnings over $1B or just over $100. There is no favoritism, and concern is shown for the smaller low-income company.

      Even in the article the author of the report cites concern for open source: "We need to better unde
  • is it just me.... (Score:3, Interesting)

    by chrisopherpace ( 756918 ) <cpace@@@hnsg...net> on Friday April 02, 2004 @01:34PM (#8747881) Homepage
    or is it really hard to take this seriously when Microsoft's name is on it? On the other hand, pretty much anything that MS is involved in with the US (other than anti-trust lawsuits) is equally scary.
  • I can see it now (Score:5, Insightful)

    by Bull999999 ( 652264 ) on Friday April 02, 2004 @01:36PM (#8747893) Journal
    If it's true, MS and the BSA will argue that open-source software has to be stopped because it will let terrorists see the code and come up with exploits based on it.
    • Yeah, but you are forgetting about the time when MS said their software is only exploited after the fix for the exploit comes out.
    • by andih8u ( 639841 ) on Friday April 02, 2004 @02:05PM (#8748252)
      The only problem with that paranoid theory is that the government does indeed have quite a few Linux servers. They aren't going to shoot themselves in the foot.
  • Maybe... (Score:4, Interesting)

    by Guspaz ( 556486 ) on Friday April 02, 2004 @01:37PM (#8747905)
    NetForce isn't that far off :p
  • by Shirov ( 137794 ) on Friday April 02, 2004 @01:37PM (#8747909) Homepage
    The process sub-group will work with major software vendors and key critical infrastructure customer organizations to encourage and aid vendors in their adoption of the recommended low defect, higher security-oriented practices and processes.

    Wouldn't it just be easier to pass laws making software vendors responsible for the bugs they produce, instead of spending our tax money to provide a shelter for insecure code?

    I can see the next big M$ lawsuit...

    Plaintiff: Their buggy code cost us millions.

    M$: But we follow the homeland security software development model.

    Judge: So the software must be good. Perhaps the plaintiff was trying to do something illegal?

    Plaintiff: Shit... *sigh*
    • The auto industry has solved this problem. If you buy a car and find out it's "buggy", the shop will repair it and, in most states, if the bugs can't be worked out you get your money back or a different car (each state's lemon laws vary, but most states have 'em).

      If a critical flaw is discovered later in the car's life cycle, the company issues a recall, notifies car owners and fixes the bug at their expense. (I'm curious, does anybody know how old a car has to be before the manufacturer is absolved of h
      Wouldn't it just be easier to pass laws making software vendors responsible for the bugs they produce, instead of spending our tax money to provide a shelter for insecure code?

      Security is an engineering tradeoff, just like speed and usability. I don't want every software vendor to have to conform to the highest level of security out of fear of getting sued.

      The people who should worry about this sort of thing are the buyers of software. If your car mechanic can't fix your car in time because his PC g
      Wouldn't it just be easier to pass laws making software vendors responsible for the bugs they produce, instead of spending our tax money to provide a shelter for insecure code?

      That's half an acceptable idea, and half a horrible one.

      Not spending federal funds to protect insecure code: good.
      Spending federal funds to punish insecure code: bad.

      (Notice the pattern here? "Spending federal funds" should be considered a bad thing in general, unless specifically shown otherwise. Smaller government should be
  • From the summary (Score:5, Insightful)

    by sczimme ( 603413 ) on Friday April 02, 2004 @01:38PM (#8747912)

    Adopting a "top-ten" list detailing industry best practices. Patches should be well-tested, small, localized, reversible, and easy to install. Patches would also not require reboots, use consistent registration methods, include no new features, provide a consistent user experience, and support diverse deployment methods.

    I thought Microsoft was involved in the partnership. How is that going to work??

    This is not a troll. MS patches generally violate some or all of the goals stated above.
    • Adopting a "top-ten" list detailing industry best practices. Patches should be well-tested, small, localized, reversible, and easy to install. Patches would also not require reboots, use consistent registration methods, include no new features, provide a consistent user experience, and support diverse deployment methods.

      I thought Microsoft was involved in the partnership. How is that going to work??

      Presumably, they'd weasel out of it by calling their patches "enhancements", or by including new features.

    • MS patches generally violate some or all of the goals stated above.

      Maybe Microsoft intends to improve the quality of its patches?

      The company is out to make money; if they can't sell software without following these patch guidelines, then they will follow them.
  • Are you sure Microsoft is backing this?
  • Anyone smell pork? (Score:3, Insightful)

    by Anonymous Coward on Friday April 02, 2004 @01:38PM (#8747916)
    Big businesses ask the gov't to step in, because their processes are flawed and produce bad software.

    Gov't is expected, in turn, to mandate these measures. Mandating them, of course, requires that gov't money be spent 'fixing' the systems that were flawed.

    Hmm. I smell pork.
  • What's the fuss? (Score:5, Interesting)

    by Aardpig ( 622459 ) on Friday April 02, 2004 @01:39PM (#8747922)

    Sure, Microsoft and the BSA aren't the bosom buddies of most Slashdot readers. And for good reason. However, a quick look through the 3-page summary document [cyberpartnership.org] revealed what seemed to be a reasonable plan of action, rather than a scheme for total world domination.

    Of course, if it turns out that the outcome of the regulation process is Microsoft-controlled security protocols and procedures, then there's something to beef about. However, at this early stage I see nothing more than an attempt to codify a national stance on computer security. Accordingly, I'm going to leave my tinfoil hat in its box for the moment.

    • I think the fuss should be that it's a waste of time. Many of the recommendations seem to be

      1) Have some committee make up some security standards.
      2) Award gold stars to groups that take some security classes, or who create a "security culture" in their companies.

      In other words, this is completely useless, and gives the impression that progress is being made. An analogy would be the Academy Awards, where the group of insiders gives out awards to other people who are in the group of insiders, yet thousands
    • Re:What's the fuss? (Score:3, Interesting)

      by forand ( 530402 )
      While I usually don't see a reason to stop regulation of an already regulated market (because someone is already in the lead, and removing society's only way to force them to behave doesn't help), in cases where there is an emerging market I think that regulation, for things other than environmental impact and a few other things, should NOT be implemented. How is this going to help? As noted above, all this does is provide software providers with a way of saying: "We followed all the RULES so we didn't do
    • However, a quick look through the 3-page summary document revealed what seemed to be a reasonable plan of action, rather than a scheme for total world domination.

      You must have missed this line:
      Ensure that Software Assurance and other Information Technology Centers of Excellence include an information protection component.

      Isn't Microsoft working on information protection components? How coincidental.


    • Regarding the signature... Just so you know: racism against Indians does not consist of telling of the bad deeds of the Indian government, those of various Indian companies, or those of the US government.

    • by hak1du ( 761835 ) on Friday April 02, 2004 @02:41PM (#8748649) Journal
      rather than a scheme for total world domination.

      These companies are basically trying to erect additional barriers to entry into the software market: costly certification and training requirements, costly documentation requirements, etc. They know that they can satisfy them, but a small software vendor or an OSS project can't.

      And they make those recommendations knowing full well that they won't work. If they knew how to make more secure software, they'd already be doing it. A bit of training and certification just is not sufficient for making software more secure.

      what seemed to be a reasonable plan of action [...] However, at this early stage I see nothing more than an attempt to codify a national stance on computer security.

      What's there to "codify"? What's reasonable about it? There is not a shred of evidence that the "strategy" described in the report will do anything to improve security.

      At this point, we have to conclude that people continue to buy insecure software either (1) because they don't have a choice because of Microsoft's monopoly, or (2) because they don't care about security. If (1) applies, then the solution is to break up Microsoft's monopoly and give people a choice in software; then they can pick the level of security they like. If (2) applies, then what business does the government have to force a level of security into products that buyers don't want?
  • Not a surprise (Score:5, Insightful)

    by bnenning ( 58349 ) on Friday April 02, 2004 @01:41PM (#8747952)
    Big businesses like regulation. It costs them, but it costs their smaller competitors more in relative terms.
  • by LostCluster ( 625375 ) * on Friday April 02, 2004 @01:41PM (#8747956)
    The BSA isn't just in business to chase down pirates of commercial software; they're also in the business of getting people to buy more. Effectively, what the BSA wants is for companies that don't buy any information security products to get in trouble with the SEC... therefore practically mandating that everybody buy something from one of the BSA members.
  • Quote from the Washingtonpost.com article:
    "[It] is possible that national security or critical infrastructure protection may require a greater level of security than the market will provide," it said. "Any such gap should be filled by appropriate and tailored government action that interferes with market innovation on security as little as possible."

    In other words, "The legal climate is such that we are very likely to start getting sued for coding sloppy, insecure software. Rather than properly staffing to test our code, we'd rather have the taxpayers pay for this. This (a) saves us money and (b) puts the responsibility on someone other than us if there is a security problem."
  • by CygnusXII ( 324675 ) on Friday April 02, 2004 @01:42PM (#8747965)
    "The report says programmers should be held personally accountable for security holes in the software they write."

    Now we see a shift of responsibility to the programmers. Let's just put as many layers as possible between the corporate entity and responsibility, why don't we.

    "The report said industry groups should work with the Homeland Security Department to look at ways to reduce liability, as well as examining whether new rules are needed."

    And now we see a way to tie together the mass collection of data that the government is asking for and private industry.

    This is one small step further towards the corporate entity as government.
    • by Tenebrious1 ( 530949 ) on Friday April 02, 2004 @02:03PM (#8748221) Homepage
      "The report says programmers should be held personally accountable for security holes in the software they write." Now we see a shift of responsibility to the programmers.

      Ok, if they want to make me "accountable" for the code I write, then they better transfer ownership, legal rights, and any profits derived from that code back to me. If they say "it's our code" and "you get no extra cash for writing it" then they can damn well take responsibility for what the code does.

    • This is one small step further towards the corporate entity as government.

      Here's the next step [nytimes.com]. So very cyberpunk, isn't it?


  • "Industry organisations .. have asked the Department of Homeland Security to regulate what they call 'Cyber Security'. Representatives from Microsoft, Computer Associates, and the BSA."
  • The industry will quickly take care of things all on their own, without government dictation of the hows or wheres. All you have to do to solve this multi-billion dollar problem is get rid of the EULA's ability to bypass accountability.

    That's it. Problem solved.
  • by bl8n8r ( 649187 ) on Friday April 02, 2004 @01:46PM (#8748009)
    I find it fascinating that some of the parties involved are standing soap-box-high beating a cyber-security drum when they themselves have a myriad of security issues to take care of in their own backyard. Seems to me that if they can't handle the responsibility, or the action required, to make or maintain a reasonably secure software product, they have no credibility in a matter such as this.
  • by k3v0 ( 592611 ) on Friday April 02, 2004 @01:46PM (#8748020) Journal
    This is not a troll, but where were RMS and the others?
    It would seem that computer security would be important for the whole computing community, not just Microsoft, CA, and HP.
  • Simple....

    Make software vendors liable, for, say, the square of the purchase price.
  • Old headline: Tech Companies ask U.S. to Regulate Cyber Security

    > Representatives from Microsoft, Computer Associates, and the BSA

    New Headline: Lobbyists for companies that stand to make a lot of money if Open Source / Free Software is made illegal petition power-hungry politicians to protect their business model with taxpayer dollars.

  • by bogie ( 31020 )
    So let me guess? Microsoft will "help" representatives draft legislation with Security standards and goals that make it difficult if not impossible for OSS to compete.

    From the report

    "Task force co-chairman Ron Moritz said the report calls for a limited government role, such as helping to develop certification standards for software that runs in sensitive systems. "
  • by Fallen Kell ( 165468 ) on Friday April 02, 2004 @01:49PM (#8748045)
    Patches should be well-tested, small, localized, reversible, and easy to install. Patches would also not require reboots, use consistent registration methods, include no new features, provide a consistent user experience, and support diverse deployment methods. The world is falling apart!
    • > We were all warned a long time ago that MS products
      > sucked, remember the Magic 8 Ball said, "Outlook
      > not so good"

      Actually, the Magic 8 Ball said "Outlook Good." It always was full of crap, though. It told me the hottest girl in third grade had the hots for me, too.
  • But the report said the most sensitive computer networks -- such as those operating banks, telephone networks or water pipelines -- "may require a greater level of security than the market will provide."

    Looks like they'll stress that electric/water networks need *extra* security, and then sneak in computer networks, while everybody agrees on the issue.

    Pretty weird if you ask me, but this is a comment [slashdot.org] I posted a few days back:

    Overdependence on communications (Score:5, Insightful)
    by GillBates0 (66

  • Hidden agenda (Score:2, Insightful)

    by Anonymous Coward
    It shouldn't be surprising that the major software vendors are calling for government regulation and licensing. This is not unusual; the hidden agenda is that it protects the established players by making it harder for new players to gain entry to the market.

    Who's going to sit on the regulatory board? Why, the industry insiders, of course. And they're going to work in the best interests of the established players, which means keeping out the new guys by establishing, among other things, licensing and certification.
  • by lysium ( 644252 ) on Friday April 02, 2004 @01:53PM (#8748096)
    From the Summary pdf:
    Ensure that Software Assurance and other Information Technology Centers of Excellence include an information protection component (Emphasis mine).

    Is it any surprise that Microsoft's security recommendations would include Palladium?


  • Microsoft wants a handout from the Feds to clean up Windows bugs.

    That is all.
  • As the three page summary says, we need to teach security from the moment students START learning to program.
    Too often I hear that schools are not teaching security. Almost no high school teachers who teach programming even consider security (if they even understand the issues). In college, many schools offer an optional security class. What is up with that? At my school, the assembly language course doesn't even deal with security. New initiatives need to be taken to bring security out of the closet.
  • I'm preaching to the choir here, but:

    The government loves getting more and more power. More laws mean they get to grow bigger and spend more of our tax money.

    Once in place you get a really big, dumb organization that can't fire anyone and will use its power to try to grow even bigger.

    The only people whose opinions matter then are lobbyists with lots of cash and the people who make money from things staying the way they are.

    If the government starts regulating security, they will be even slower to respond.

  • Huh? (Score:5, Insightful)

    by cptgrudge ( 177113 ) on Friday April 02, 2004 @02:00PM (#8748179) Journal
    Know what this is like? It's like needing a certification from the government in order to publish a novel or article. Of course, it's only to make sure there are no grammatical errors, but if I can't pay the fee, my novel or article can't get published. Or it becomes a crime to read my novel because my grammatical errors might "damage" linguistic purity. And then the government has control over what you can read.

    Although, we all know from the DeCSS case that code "isn't free speech" when it's convenient. So the end result of this would be that the government can tell you what you can and can't code.

    I was fine with everything in the summary until I got to the "certification" part, but who knows, maybe my tinfoil hat is on too tight.

  • As a European, should I understand this US ambition as an attempt by one nation to make a land grab on cyberspace?
  • by Glamdrlng ( 654792 ) on Friday April 02, 2004 @02:13PM (#8748336)
    I hate it when corporate agendas are this obvious; it makes me think I'm missing something, but I can't discern it from the obvious scheming. The crafty and subtle plot gets obfuscated by the blatant one.

    Let's see if I got this right...

    1. Distribute a development platform called .NET that allegedly does away with insecure coding practices.

    2. Influence laws and regs such that any software not coded on a "secure platform" such as yours is illegal.

    3. Let the feds regulate your competition out of existence.

    4. Profit!

    If this comes about, the only way F/OSS software will survive in the US is if both a Linux distribution and a Linux development platform can be constructed that will meet the same requirements that the conglomerate is pushing for. Of course, we're screwed with a capital F if the regs call for technology that Microsoft (or one of the other member companies) has patented.

    So I guess now it's "If you can't innovate, litigate... unless of course you have political influence, in which case, regulate!"

  • by __aagmrb7289 ( 652113 ) on Friday April 02, 2004 @02:21PM (#8748431) Journal
    Did the poster read the summary? I mean, maybe the full report is scary, but this isn't. Unless you are scared by the clear inability of these things to change anything in the short term. But why would that be scary? It's not going to be fixed in the short term by anyone but you and me.

    Can someone who actually read at least the summary please tell me what's so scary? And leave the tinfoil hats off - it gets very tiring.
  • Puff Piece (Score:3, Interesting)

    by rnturn ( 11092 ) on Friday April 02, 2004 @02:24PM (#8748469)

    The report that is...

    So they propose that:

    • certifications
    • awards
    • educational programs
    and that these are going to result in secure software? So they still believe in Silver Bullets.

    Sounds like all these software houses -- who have been touting the superiority of the proprietary development model and decrying the open source development methodology for some years now -- cannot seem to figure out how to adapt their "superior" process to produce secure software. Oh, and let's get academia involved to educate future software developers in the proper way to create secure software. Which means, I take it, that the proprietary software houses have been unable to get their current developers to produce secure software. Following this plan will result in the first crop of (supposedly) secure software developers getting their first jobs in, oh, about 2015.

    So... I see this report and the suggestions contained in it as an indication that Microsoft (and others, but predominantly MS) has utterly failed in the attempt to introduce security into their product lines. Even after all of Bill Gates's pep talks and internal memos. Now they think that creating a bunch of undergraduate courses in secure programming, certifications, and awards to software companies will somehow result in a new breed of software that won't be susceptible to worms and viruses. To me that says: ``We, the proprietary software industry, have finally come to realize that writing secure software is quite beyond our capabilities, and we make these suggestions so that other people can figure this out for us so that we merely have to hire new people who are already trained to do this. And, of course, these programs should be paid for by the Government.'' No. Strike that. They'd be paid for by you and me. Twice. First in the taxes that would go to create these educational programs and the certification organizations. Then, again, when the price of the software goes up because, well, now it's secure software and that's worth paying extra for, isn't it?

    Funny that open source software -- and, to be fair, some proprietary software -- isn't anywhere nearly as vulnerable to the sorts of attacks that Microsoft's is. Because, it seems, those Neanderthal open source programmers didn't have the insight to include features that automatically run code by clicking on mail attachments, include scripting languages inside applications that have the ability to destroy user data or launch unrelated programs that damage the local and/or remote systems, or, ... (the list goes on).

    Wonder where all those open source programmers managed to learn about writing secure software (yes, yes, yes... I am aware even OSS can occasionally have bugs that affect security) without a college program, certifications, and industry awards? And how do they do it without a government subsidy? Oh, yeah. I forgot. They're able to do it because they don't have some pinhead from Marketing ranting and raving that seven new features need to be in the product in time for the next trade show and there is no time to waste with any discussions about how these features destroy the integrity of the software. Companies like Microsoft won't create more secure software once these programs are in place. Even if they are able to grab every straight-A, magna cum laude graduate of these programs in the country. Why? Because these poor folks are still going to have to answer to some pinhead from Marketing ranting and raving that all these new features need to be in the product in time for the next trade show.

    I sure as hell hope that some articulate luminaries in the open source development community have the opportunity to submit a report to the folks that are going to be reviewing this piece of tripe. The opposing viewpoint and an alternate plan needs to be heard.

    (Heh. If reading the summary got me this ticked off, imagine if I'd read the entire report!)

  • Don't worry! (Score:4, Insightful)

    by dasunt ( 249686 ) on Friday April 02, 2004 @02:26PM (#8748496)

    We have a Republican president and they control half of Congress.

    Since this proposal would extend the reach and powers of the Gov't, it will never pass. Republicans are for a smaller government, remember?

    Wait. Why are you laughing?

  • by Midnight Warrior ( 32619 ) on Friday April 02, 2004 @02:28PM (#8748519) Homepage
    When it comes to security, parts of the government do understand how to do it right. Take DCID 6/3 [fas.org], a Director of Central Intelligence Directive entitled "Protecting Sensitive Compartmented Information Within Information Systems." This thing really writes the book on quantifying security requirements and matching them against what is actually implemented.

    Look at it as a certification process. Each project tasked with protecting data on a computer (networked or not) has a security posture and a security officer responsible for ensuring that the declared posture is enforced.

    This is what a bunch of people at /. fear: they expect the government to try to make it all completely secure and fail. What they fail to see is that the government would only quantify and validate the level at which an information system is protected. This means it's not a black and white world; rather, the level of protection is paired against the threat of compromise.

    A bunch of you also think this has only to do with preventing network-based attacks. And while that is in play, don't forget corporate espionage. That foreign temp worker your boss hired could be walking out with all the spreadsheets the accounting department values. This problem, by the way, is addressed by trusted operating systems such as those discussed in this article [techtarget.com] asking about Trusted Linux vs. Trusted Irix or Trusted Solaris.

    DCID 6/3 works both sides of that problem and quantifies for management what kind of protection their dollars have bought them.

  • by Facekhan ( 445017 ) on Friday April 02, 2004 @02:44PM (#8748685)
    Regulations cost money and create hurdles. If they succeed in getting laws that require software to be certified as secure by some mixed public-private authority (read: the BSA, some universities, and the NSA), then free software will just have a complex process to go through before it can be used in government, and perhaps even before it can be distributed. Whatever the claims of Microsoft and the BSA, their ultimate goal is not security but to prevent the commoditization of software, which is going to destroy their business model. Big companies are already warming up to the idea that money should be spent on hardware and support, not on overpriced proprietary software that is not any better than what they can get for free.
  • by faust2097 ( 137829 ) on Friday April 02, 2004 @02:57PM (#8748835)
    Has there ever been a documented case of actual 'cyberterrorism' against the US? All the laws and hoopla around it seem to do is hand out extremely long prison sentences to script kiddies. Most of the criminal hacking I've ever heard of was for personal gain or just for reputation and attention. Has any actual group successfully launched anything that could be considered a terror attack?

    Even the fairly cohesive stuff like the long-running India vs Pakistan web site defacement battle is just a really annoying flame war.
    • You have a good point here. Besides, can anyone tell me the last time a hacker/cracker/script kiddie, or anyone using a computer, physically injured or killed anyone? I mean, come on: last I heard, the chance of getting struck by lightning while carrying the winning Powerball lottery ticket was higher than that of getting killed due to a computer error or so-called 'cyberterrorism'.
  • Tell you what.... (Score:3, Interesting)

    by irving47 ( 73147 ) on Friday April 02, 2004 @03:01PM (#8748872) Homepage
    I'll make you a deal. Pass ONE law about cybersecurity. Make it illegal to run an open relay mail server. See if you can enforce it. We'll know it works if spam decreases. If you can, and it does, you can pass another law. See if you can enforce that, too. Then we'll talk.

    (see you sometime in 2036)
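    The kind of enforcement the parent imagines could at least be spot-checked mechanically: the classic open-relay probe connects to port 25 and asks the server to carry mail from one outside domain to another. The sketch below is only illustrative; the hostnames and addresses are placeholders, it covers just the simplest relay pattern, and it should only ever be pointed at servers you administer:

```python
import smtplib

def relay_accepted(mail_code: int, rcpt_code: int) -> bool:
    # A server that answers 250 to both MAIL FROM and RCPT TO for
    # addresses in domains it does not serve is acting as an open relay.
    return mail_code == 250 and rcpt_code == 250

def probe_open_relay(host: str, port: int = 25, timeout: float = 10.0) -> bool:
    """Crude single-pattern open-relay probe (placeholder addresses).

    Returns True only if the server accepts an outside-to-outside
    envelope; connection failures count as "not an open relay".
    """
    try:
        with smtplib.SMTP(host, port, timeout=timeout) as smtp:
            smtp.ehlo()
            # Neither address belongs to the probed server's own domain.
            mail_code, _ = smtp.mail("probe@example.org")
            rcpt_code, _ = smtp.rcpt("victim@example.net")
            smtp.rset()  # abort the transaction; nothing is ever sent
            return relay_accepted(mail_code, rcpt_code)
    except (smtplib.SMTPException, OSError):
        return False
```

    Real relay testing uses dozens of envelope variations (percent hacks, source routing, bang paths), which hints at why even this one narrow law would be hard to enforce at scale.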

  • by Jerry ( 6400 ) on Friday April 02, 2004 @03:04PM (#8748909)
    It is appropriate that this 'report' was released on April 1st. Halloween would also have been appropriate. Here is what it will do:

    1) Give M$ a shield from responsibility for the massive insecurity of their software by making a 'security organization' the accountable party. "Software companies" (i.e., mainly M$) would fund the organization. The security organization would lay down rules about how bugs and holes are discovered (not a certified programmer? -- then you can't look for/report bugs. See the story of the French scientist who is being sued for pointing out vulnerabilities.), how they are reported (no public reports at all until the patch, if ever, is released, then no announcement as to how long the bug/hole has been open), and how they are released -- through 'special' sites, for a fee, of course, so that the consumer pays even more for M$ bugs.

    2) Require programmers to get "security certifications" from "accredited" schools. These are schools which have received funds (guess from whom) to finance/"reward" faculty members who establish such programs. Guess which OS will have certification programs, and which won't be allowed on campus. (Just ask yourself which platforms aren't allowed equal billing with Windows on Dell computers.) Programs written by "uncertified" programmers will not be allowed distribution through 'certified' channels. Uncertified channels will be made illegal.

    3) No answers as to which programmers get 'grandfathered' in, but the entire MS programming staff would be a good guess.

    4) Independent Software Vendors (ISVs, i.e., Open Source folks) will have to meet requirements which are, in effect, designed to keep them from developing software drivers for new hardware, effectively locking them out of future markets.

    Microsoft, the BSA (enforcement arm of MS licensing), and other companies with less than desirable security records would then use the courts to completely muzzle news of the vulnerabilities in their software. With that accomplished they can essentially shut down their repair operations and move the whole program into the public law enforcement arena, using local and national law enforcement agencies as their "security repair" division. Just remember that French scientist who was sued as a 'terrorist' for revealing security holes in software which the vendor claimed in their ads was "100% secure". This will be in no way different than what coal mine owners did in their efforts to keep slave labor trapped in their mines, but this time it will be consumers trapped into using buggy, insecure software with no alternatives. The end result is that the software will get worse because the incentive to repair is removed and will become more expensive because there will be no Open Source competition.

    The current crop of "Security Organizations", most of whom have already knuckled under to Microsoft, will not be needed in the "New Order", but I'll wager most of them haven't figured that out yet and are probably jumping on the bandwagon because they have, like so many companies Microsoft has deflowered and plundered, visions of increased revenues as Microsoft 'partners' in this new scam.

    The 'security problem' doesn't need a 123 page report to identify it and create solutions for it. The problem is Windows. The solution is for Bill Gates to spend some of his $50 billion to fix the code, not to buy off congressmen and judges and make their problem a law enforcement issue at the public's expense. Is there no end to this man's greed?
