Topics: Security, United States

A Need for Greater Cybersecurity 186

otterit writes "A story in the Washington Post discusses how chief executives of U.S. corporations and their boards of directors should assume direct responsibility for securing their computer networks from worms, viruses and other attacks, an industry task force working with the federal government said."
This discussion has been archived. No new comments can be posted.

  • Isn't it about time to really assess whether it is absolutely necessary to provide every employee with their own internet access?

    Restricting the internet to a single machine (or battery of machines) that only sends and receives external email and forwards it on to the internal network seems like the absolute maximum internet connection necessary for most businesses.

    Surely employees don't have to surf the web at work?
    • by Anonymous Coward on Monday April 12, 2004 @10:09AM (#8837181)
      Yea, but how will we post on slashdot then?

      Think about the slashdot. Think!
    • Exactly -- in my office, everyone has access to the internet, but really only needs access to our intranet. It causes more problems than it solves: viruses, users downloading strange apps, etc...
    • Surely employees don't have to surf the web at work?
      You're asking the wrong question.

      For the last 8 years, I would not have been able to do any of the work I've been paid to do if I didn't have timely access to the web. It's to the point that I now wonder how I got any work done 15-25 years ago! Granted, not all work **REQUIRES** it, but if you start discriminating between functions at work, you will generate more disgruntlement than good work; it has come to the point that web access is nothing less than telephone access.

      However, granting internet access to employees doesn't mean that the barest minimum security and/or monitoring should not be deployed. In fact, it would be quite foolish to grant unrestricted/unmonitored internet access to employees.

    • by Anonymous Coward
      Many research materials for the scientific industry rely on unfettered internet access. The heads of management want to see results, and they don't want to pay to maintain internal libraries. The IT department doesn't want to establish tunnels and VPNs for every available online resource and database. While more secure, that would bring availability to a grinding halt.

      The management heads who like to crack the whip need to make a choice: if they take sadistic joy in cracking the whip then they're either go
    • Isn't it about time to really assess whether it is absolutely necessary to provide every employee with their own internet access

      How then will we pore through picture after picture of celebrity assess then?

    • A relative of mine works for Oxford Health Insurance, where they have to 'apply for internet access'. This kind of scrutiny hurts company morale, especially if you are not one of the illuminati whose packets are permitted to pass.

      Internet access at a computer today is something that is taken for granted; it is assumed when you sit at a computer that you will be able to get online, especially at your office. I liken restricting internet access to the removal of Solitaire from office PCs. Sure, your empl
      • by Rikus ( 765448 ) on Monday April 12, 2004 @10:25AM (#8837301)
        > If your IT department doesn't know how to keep a network secure....

        How can they keep a network secure if their own users are working against them by installing crap on their PCs like Kazaa or whatever else they think looks fun? They can't really protect a network if the people inside the network are the problem.
        • How can IT keep users from installing software? Have you heard of restricting administrative access? This gets back to the point that IT needs to know about securing workstations, has the tools and plans to implement that security effectively, is given the time to implement the plan, and actually implements good security. Then there would be fewer problems directly related to bad security.

          Saying that IT cannot protect machines from their users is saying IT doesn't have a clue about security. Fortunately thi
    • by Tenebrious1 ( 530949 ) on Monday April 12, 2004 @10:21AM (#8837279) Homepage
      Surely employees don't have to surf the web at work?

      No, they don't need to surf at work. However, being a BOFH and cutting off internet access to the employees doesn't do much for employee morale.

      Sooner or later all your good employees will leave, and you'll be stuck with disgruntled employees who don't have the skills to get another job (and are underqualified for the one they have), or recent grads who have no other choice but will leave as fast as they can. You'll lose money in training and recruiting costs.

      Draconian measures might save money in the short run, but keeping employees happy does much more for employee retention.

    • by CrankyFool ( 680025 ) on Monday April 12, 2004 @10:22AM (#8837283)
      What definition of 'absolutely necessary' are we using here?

      Quick anecdote: I used to work for a large company that made web authoring tools. At some point we had to ask ourselves whether we still wanted NFR versions of our rather expensive software available to every employee on the intranet. Was it absolutely necessary for the receptionist to install an HTML editing environment? Creating HTML was not part of his job.

      Our decision was that if our receptionist takes an interest in our own products and wants to play with them, that's a Good Thing[tm Martha Stewart] and should be encouraged. It'll make him more interested in the company and a more committed employee; we might find out that he's actually a decent designer and can contribute more to the company in our web design group. Did the NFR products get 'pilfered' every once in a while? Sure. But I'll bet you that 95%+ of the pilfering that was going on with them was to people who wouldn't have purchased them anyway -- but now were using them, and talking about them (mostly positively, we hoped :) ).

      I work now for a company that doesn't allow general internet access for 90%+ of its employees. I think disallowing general internet access is symptomatic of a certain sort of relationship the company wishes to maintain with its employees and is indicative of how it thinks of them -- and it's not indicative of a particularly high level of trust in, or care for, the employees.

      Left to my own devices, I'd rather put in a robust anti-virus and anti-malicious-code system coupled with employee education and discipline for people who break the minimal rules and then let the employees loose. Will some of them surf during work hours and damage their productivity? Indubitably. I still think that the overall benefit in employee morale and easy access to information is going to be worth the occasional loss from someone who can't control his surfing.
    • by _Sprocket_ ( 42527 ) on Monday April 12, 2004 @10:27AM (#8837309)
      Isn't it about time to really assess whether it is absolutely necessary to provide every employee with their own telephone?

      Restricting telephone calls to a single secretary (or secretarial pool) that only make and receive calls and forwarded messages on to the internal workforce seems like the absolute maximum telephone usage necessary for most businesses.

      Surely employees don't have to make calls (especially personal) while at work?
      • Surely employees don't have to make calls (especially personal) while at work?

        Sure, and every computer system works magically out of the box? What if that "enroll in a health care plan here" site doesn't work correctly? What if I need tech support to come down and install a local administrator account on my machine? My staff assistant isn't necessarily the person that I would want to have to talk directly to our help desk on my behalf.
    • Yes, and what's up with all those red Swingline staplers? Let's take them away too.
    • by Anonymous Coward
      Surely employees don't have to surf the web at work?

      I am an embedded systems firmware engineer at a small (~20 employees) company. In addition, I manage the network here, maintain the workstations and purchase/setup any new computers required. I am going to state unequivocally that I simply could not do my job(s) without Internet access.

      Whether it is finding, downloading and installing the latest drivers for a new or existing system, researching new microcontrollers for new product development, chasing d
    • If you stop employees from using PC's to access the web, they'll use laptops/PDA's with wireless communications.

      If you stop employees from using telephones, they'll use mobile phones or use VoIP.

      Restrict all forms of communication, and you won't keep/recruit the best staff.

      In a research environment, having access to the Internet is essential. Not only are the IEEE and ACM portals useful for accessing papers, but a Google search can also root out other papers, and check to see where the commercial an
  • by Anonymous Coward on Monday April 12, 2004 @10:07AM (#8837174)
    Corporations announce they should be responsible for securing their own networks.

    (as opposed to relying on magical network security elves that secure your network while you sleep and provide freshly made footwear in the morning)
    • by Anonymous Coward
      magical network security elves

      Do they come with the Magical Server Pixie Dust?
      • I have it ... (Score:2, Interesting)

        by DikSeaCup ( 767041 )
        But I seem to run out Thursday afternoon or Friday morning.

        Seriously, yes, corporations *do* need to take better care of their systems, but I'd hazard a from-the-hip guess that the biggest problem these days as far as worm spreading is concerned is home machines and those in lesser "net developed" countries. In other words, ISP's need to become a little more responsible, and go about figuring out how/who/when to block certain ports from leaving their domain (like, say, 25).

    • (as opposed to relying on magical network security elves that secure your network while you sleep and provide freshly made footwear in the morning)
      Where I work, we have underpants gnomes in charge of network security.

      This is very effective: after three or four wedgies, people learn NOT to do some st00pid stuff with Internet Exploder...

    • by ThisIsFred ( 705426 ) on Monday April 12, 2004 @10:28AM (#8837316) Journal
      And the most common Linux executable binary format is named 'ELF'. Coincidence? I think not!


    • (as opposed to relying on magical network security elves that secure your network while you sleep and provide freshly made footwear in the morning)


      Wait. This isn't another dig at "offshoring" is it?
    • (as opposed to relying on magical network security elves that secure your network while you sleep and provide freshly made footwear in the morning)

      ... and a pony.

      -kgj
  • Open source (Score:3, Interesting)

    by Elektroschock ( 659467 ) on Monday April 12, 2004 @10:08AM (#8837176)
    Hoops?

    So they will finally migrate to open source technology?

    The German government's IT security agency, the BSI, published a nice migration guide. I would like to see that on the other side of the Atlantic.
  • What? (Score:5, Insightful)

    by Pinky ( 738 ) on Monday April 12, 2004 @10:09AM (#8837179) Homepage
    So the people that use the software should assume liability for not patching holes, but the manufacturer assumes no responsibility for leaving security holes in their product to begin with? This sounds very backwards to me.
    • Re:What? (Score:5, Insightful)

      by Smallpond ( 221300 ) on Monday April 12, 2004 @10:21AM (#8837280) Homepage Journal
      When a patch has been on the web for 6 months, it's not the software company's fault that the user company has no policy on updating software, insufficient IT staff, and no end-user training.

      Heck, a lot of companies don't even have a comprehensive software inventory.
      • Downstream Liability (Score:3, Informative)

        by sczimme ( 603413 )

        This paper [cert.org] addresses some of the issues you mentioned.

        ObDisclaimer: I am one of the authors (though no longer at CERT) and express some opinions in the paper re: patching schedules and general due care in this area.
        • Hey, is it worth studying CERT's OCTAVE method for survivability anymore, or is it too outdated now?

          • For those of you wondering about OCTAVE: it is the Operationally Critical Threat, Asset, and Vulnerability Evaluation [cert.org]. (It's not really about survivability as such.)

            Please understand that what follows is my opinion only.

            OCTAVE is interesting: it involves getting input from all levels of the organization to determine what is important to whom and why. This is a pretty effective way to figure out a) what would happen/be affected if $RESOURCE became unavailable, and b) how to best protect $RESOURCE. H
      • Re:What? (Pardon?) (Score:3, Insightful)

        by Pinky ( 738 )
        Well, I might believe that if there were fewer security issues and warnings.

        Shipping an OS with ports open is not a prudent security decision.

        Shipping an OS with ports open with no way to close them save installing an extra piece of software called a "firewall" is infuriating.

        An attitude of security through obscurity at a software firm whose software products run on 90% of all desktop computers is naive.

        Using an environment that allows the programmer to make an error that allows a hostile data packet to co
      • Assigning fault (Score:4, Informative)

        by Beryllium Sphere(tm) ( 193358 ) on Monday April 12, 2004 @02:02PM (#8839536) Journal
        >When a patch has been on the web for 6 months, its not the software company's fault that the user company has no policy on updating software, insufficient IT staff, and no end-user training.

        Yesbut.

        It is still the software company's fault that the bug existed in the first place. If the client company doesn't dare install patches because previous patches have crashed the production systems, that's the software company's fault. If the software company's salespeople showed a TCO study that didn't include monitoring for patches, building a regression lab to test patches before deploying them, rolling out patches, and doing this weekly or monthly, then the salespeople misled the client company.

        If your car blows up because you got a recall notice six months ago and you ignored it, your fault. If your car gets three recall notices a week, there's something wrong at the manufacturer.
    • Re:What? (Score:3, Interesting)

      That's the first thing I thought of too. However, how often are security efforts stonewalled by braindead executive types who say "I want security", then later chastise the people who bring it to them for the effect it has on convenience? I'm currently engaged in that exact battle. They said "we want a security system to secure our documents", and when I rolled it out with some basic requirements: you must change your password every thirty days, passwords must be a mix of letters and numbers, and passwords

      • by Pinky ( 738 )
        Yes, you're completely right. When I wrote my comment I was thinking primarily about flaws in software products or dumb system defaults on a typical home user's PC.

        For large software sites there can be a disconnect between security fantasy and security reality. On the wall of my cubicle is a Dilbert cartoon and a memo for IT. The subject of both is password policy. The reason they are on my wall is that the new security policy of IT and the fictional one proposed by the IT person in the Dilbert cartoon are ident
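The password rules described a few comments up (rotation every thirty days, a mix of letters and numbers) can be sketched as a minimal policy check. This is an illustrative sketch only: the function name, the minimum length, and the dates are assumptions, not details from the thread.

```python
import string
from datetime import date, timedelta

MAX_PASSWORD_AGE = timedelta(days=30)  # rotation interval from the policy above
MIN_LENGTH = 8                         # illustrative; the thread gives no number

def password_ok(password: str, last_changed: date, today: date) -> bool:
    """Check a password against a simple corporate policy:
    mixed letters and digits, minimum length, changed within 30 days."""
    has_letter = any(c in string.ascii_letters for c in password)
    has_digit = any(c in string.digits for c in password)
    fresh = (today - last_changed) <= MAX_PASSWORD_AGE
    return has_letter and has_digit and len(password) >= MIN_LENGTH and fresh

# A compliant password changed two weeks ago passes; an all-letter
# password fails the mixed-character rule.
today = date(2004, 4, 12)
print(password_ok("s3cur1ty", today - timedelta(days=14), today))  # True
print(password_ok("password", today - timedelta(days=14), today))  # False
```

In practice this kind of check lives in the directory service (e.g. password complexity and maximum-age settings) rather than application code; the sketch just makes the stated rules concrete.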
  • Weed Them Out (Score:5, Insightful)

    by MrNonchalant ( 767683 ) on Monday April 12, 2004 @10:09AM (#8837182)
    Let business Darwinism take its course: those that implement effective countermeasures survive and thrive in a competitive marketplace; those that don't...
  • Sarbanes-Oxley (Score:5, Interesting)

    by andy1307 ( 656570 ) on Monday April 12, 2004 @10:09AM (#8837187)
    As part of the Sarbanes-Oxley act, companies are required to conduct some internal security audits to get a 404 compliance certificate [pro2net.com]. Without this certification, the company stock can't be traded.

    Although the stiff penalties outlined in the Sarbanes-Oxley Act initially captured the attention of CFOs, they and their staffs are now scrambling to address the far-reaching but less-understood challenge of complying with the new law, and Section 404 in particular.
    Section 404 requires management to explicitly take responsibility for establishing and maintaining an adequate internal control structure.
    • "As part of the Sarbanes-Oxley act, companies are required to conduct some internal security audits to get a 404 compliance certificate. Without this certification, the company stock can't be traded."

      I'm having trouble finding that document. Every time I think I've located it I get a 404 response.
    • I don't want any legislation for this; but I will say that hitting the executives over the head with the security stick is the real way to improve things.

      They are the ones who trump the security team at my company. We had a nice small tight set of controls until the executives started chipping away at them.

      "But Mr. CEO needs to receive these passworded zip files by e-mail"

      "Mr. VP needs access to port xxx through the firewall."

      Or the best one:
      "We need Mr. Executive VP to have pcAnywhere through the fire
    • by mrnick ( 108356 ) on Monday April 12, 2004 @11:31AM (#8837845) Homepage
      I have been involved in several Sarbanes-Oxley 404 internal audits, and let me tell you, it's an uphill battle. First off, I find myself dealing with people working in the financial department. This sort of makes sense, since 99% of Sarbanes-Oxley focuses on financial responsibility, but when it comes to 404 specifically it doesn't make sense. I have been in the situation several times where the 404 internal audit was being funded from the finance department. This puts you in a situation where the IT department is at odds with you. They, the IT department, don't know who you are, and you need to assess all the security aspects of IT and physical security. So not only do you have to convince the financial types that doing this audit is not optional but mandated by law, you then have to convince the IT types of the same thing, and that you need access to all of their systems. Both are equally difficult, because the financial types have a completely different definition of what an audit is and don't understand that an IT audit requires someone to physically check the security of each device and run IDS and penetration testing. The IT people are just as hesitant. They understand quickly why you need to do this, but don't want a finance-funded person doing the poking around. They want it to be an IT project. Most of the time they have someone in IT who says "heck, I can do it" but doesn't understand the reasoning behind Sarbanes-Oxley's requirement for segregation of incompatible duties, which means in a nutshell that you cannot be involved in a production or support role for the affected systems.

      Being in the IT security field, I thought that this would be a big boon for my career, but I have not seen it yet. 404 clearly states that someone has to be responsible for reporting on the security readiness of the company. I don't see how the audits I have performed meet this requirement. Do the 20+ page audits that I produce make the CFO think he can report on security readiness? I don't think so, because security is something that changes on a day-to-day basis. Plus, I would bet that the CFO is an end user of some of those systems (badge reader, workstation, email, intranet, etc.) and that this would prohibit him from being in that role. If I had the resources, I would start a company and outsource the security audit and reporting responsibility. The major expense would be advertising and educating corporations about the need for such a service.

      Anyways, I could go on all day but in summary most corporations have no idea that they need this and the ones that do know don't understand it.

      Nick Powers
  • When you make demands like this, the next thing you know, you'll try to make them directly responsible for their corporate financial statements.
  • So ironic (Score:4, Informative)

    by Anonymous Coward on Monday April 12, 2004 @10:10AM (#8837197)
    Where I work, most of the massive system failures were caused by senior executives meddling at low levels. Not just operationally, but also at system specification time. (How many buzzwords can we put into this spec?)

    That's not to say that IT security and virii aren't devastating. Just that putting clueless buzzword-directive-issuers in charge, instead of those who understand the implications and directly deal with customers, doesn't solve anything.

  • Not likely (Score:5, Interesting)

    by nate1138 ( 325593 ) on Monday April 12, 2004 @10:10AM (#8837200)
    It's hard enough to make them take responsibility for things like overstating earnings and embezzlement. How exactly are they going to be forced to be accountable for this?
    • "It's hard enough to make them take responsibility for things like overstating earnings and embezzlement. How exactly are they going to be forced to be accountable for this?"

      In the good old US of A you are no longer responsible for your actions. You were forced to do it, tricked into doing it, didn't know better, or to you it was ok. Any which way, we have lawyers that manipulate everyone to freedom.
  • by blankmange ( 571591 ) on Monday April 12, 2004 @10:12AM (#8837210)
    Corporations taking responsibility for their own actions -- there is something new. Strangely enough, it is the feds telling them to do this. Do as I say, not as I do...

    Move along /., nothing new to see here...

    • Re:Duh... (Score:2, Funny)

      by Ytsejam-03 ( 720340 )
      Strangely enough, it is the feds telling them to do this.
      I suspect that this group [slashdot.org] is responsible. Microsoft does not want to take the blame when corporations fail to patch the next RPC bug in a timely manner.
  • by shoppa ( 464619 ) on Monday April 12, 2004 @10:12AM (#8837212)
    I know of one large government agency that recently had to turn off all linux machines. Why? There was no anti-virus software installed on them, and the "security czar" required such software on all servers.
    • by ThisIsFred ( 705426 ) on Monday April 12, 2004 @10:32AM (#8837342) Journal
      I know you were modded funny, but, why would servers need anti-virus software, even if they were Windows servers? Do we have sysops that configure servers to execute binaries off of their own shares?
      • By running anti-virus software on fileservers you can avoid problems caused by clients with misconfigured or obsolete AV software. I run AV software on my company's Linux based fileservers for exactly that reason.
      • I know you were modded funny, but, why would servers need anti-virus software, even if they were Windows servers? Do we have sysops that configure servers to execute binaries off of their own shares?

        Servers (especially mail/ftp/file servers, but not so much database/app servers) are a good spot to catch viruses that got past the anti-virus software that is installed on users' desktops. Especially since users have a bad habit of disabling or simply breaking their anti-virus software.

        A good ad
  • by Captain McCrank ( 583414 ) on Monday April 12, 2004 @10:13AM (#8837215)
    If worms, viruses and other attacks can alter or remove financial accounting data, then the execs currently are accountable thanks to Sarbanes Oxley 404. This legislation creates work like Y2k did. If you haven't been impacted by it at your job yet, start reading up now.
    • This legislation creates work like Y2k did.

      True, but unlike Y2K, this one has no expiry date. Each change to the IT infrastructure of a company means that the CEO/CFO who is now accountable under Sarbanes-Oxley is sticking their neck out a little further. Sooner or later that person is going to want (and probably get) another audit to cover their ass. Assuming they haven't already factored this into the business strategy of course; security tests on odd-numbered years, PAT te

  • Cybersecurity? (Score:5, Insightful)

    by cybermace5 ( 446439 ) <g.ryan@macetech.com> on Monday April 12, 2004 @10:14AM (#8837224) Homepage Journal
    This is typical. Focus on just one part of a greater problem. The issue is security overall. Your computers can have the most advanced security possible, but it can become useless with a few misplaced words from one of thousands of employees, or a document that missed an appointment with the shredder. When I worked in tech support, I can't count the number of times I found usernames and passwords in plain view on post-it notes...the "security conscious" employees would put them under the keyboard. Outside vendors could see any of this at will.

    The internal network can also be destroyed by a simple click on an email attachment. The real issue here is educating people about computers, and expecting a certain level of competency. Too many employees are using something they don't understand; it would be like giving company cars to people who don't know how to remove the keys from the ignition and lock the doors.
  • suggestion (Score:3, Interesting)

    by abrotman ( 323016 ) on Monday April 12, 2004 @10:15AM (#8837232)
    Perhaps some level of legislation would be good. How about a law (US-only) that would outlaw open relays, requiring each mail server to be configured correctly? Or perhaps something that says an ISP like AOL or Comcast should not permit port 25 traffic beyond its router unless it comes from their own SMTP server.

    I realize lots of spam comes from overseas, but a lot also comes from aol.com, rr.com, comcast.net, etc.

    Or we could just make commercial software vendors responsible for the quality of their software.
    • Re:suggestion (Score:3, Interesting)

      by millahtime ( 710421 )
      "Or perhaps something that says an ISP like AOL or Comcast should not permit port 25 traffic beyond its router unless it comes from their own SMTP server."

      Many of the major ISPs won't receive email from an IP that belongs to residential cable/DSL service. Most of this is already being blocked. I know from personal experience that Comcast is already blocking port 25 in some areas.

      "Or we could just make commercial software vendors responsible for the quality of their software."

      Just commercial? What about
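The egress rule discussed in this sub-thread, forwarding a customer's outbound port 25 traffic only when it targets the ISP's own SMTP relay, can be sketched as a simple forwarding predicate. The relay addresses here are hypothetical placeholders; a real ISP would enforce this in its edge routers or firewall, not in application code.

```python
# Minimal sketch of the port 25 egress policy described above: customer
# connections to TCP port 25 are forwarded only if they target one of the
# ISP's own mail relays. Addresses are made-up placeholders.
ISP_SMTP_RELAYS = {"10.0.0.25", "10.0.0.26"}

def allow_outbound(dst_ip: str, dst_port: int) -> bool:
    """Return True if a customer connection should be forwarded."""
    if dst_port != 25:
        return True                   # only SMTP is restricted
    return dst_ip in ISP_SMTP_RELAYS  # port 25 only to the ISP's relays

print(allow_outbound("10.0.0.25", 25))    # ISP's own relay: allowed
print(allow_outbound("203.0.113.7", 25))  # direct-to-MX spam path: blocked
print(allow_outbound("203.0.113.7", 80))  # web traffic unaffected
```

The same shape later became standard practice: block direct port 25 from residential ranges and have subscribers submit mail through the ISP's relay instead.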
    • Not to get sidetracked, but actually -- not a lot of spam comes from aol.com. Check out the headers. aol.com has lately been a damn good corporate citizen on this front.

      Oh, and discounting asymmetric routing tricks, good luck establishing an outbound port 25 connection from inside the aol.com network.

      -roy
  • responsibility (Score:3, Interesting)

    by dj245 ( 732906 ) on Monday April 12, 2004 @10:15AM (#8837239) Homepage
    chief executives of U.S. corporations and their boards of directors should assume direct responsibility for securing their computer networks from worms, viruses and other attacks

    In other words, Homeland Security and the FBI blew all their money on booze, cigarettes, and hookers, so now someone else must pay to take care of threats like internet insecurity before they become problems.

    But is it really that simple? Can all security threats be stopped before they start, or should the government be held accountable for part of it? Seems to me like they are trying to lay some responsibility on the big corporations (not a horribly bad thing), but the reasons behind this are not good. I think their attention is focused in the wrong places. Their attitude is that creating colored alert systems and issuing duct tape warnings is more important than securing the global internet infrastructure.

    I guess keeping people focused on the T word (Terrorism) is key to keeping them from realizing that the executive branch really sucks right now.

    • Re:responsibility (Score:3, Interesting)

      by millahtime ( 710421 )
      "Seems to me like they are trying to lay some responsibility on the big corporations (not a horribly bad thing)"

      So, are you saying that Homeland Security or the FBI should come in and handle security on their network? Isn't it up to a private company to handle its own security? Or should the US put up one big firewall around the nation, block us off from the rest of the world, and manage security that way? Kind of like an old castle moat for cyberspace.
      • So, are you saying that Homeland Security or the FBI should come in and handle security on their network? Isn't it up to a private company to handle its own security?

        No, I'm saying that they need to spend more money going after the writers of internet worms. They give these threats little attention, but I would wager that the associated costs of network downtime and lost productivity have cost the average company far more time and money than any terrorism. I am not one to blindly believe the ant

        • "They give these threats little attention, but I would wager that the associated costs of network downtime and lost productivity have cost the average company far more time and money than any terrorism."

          Terrorists go around killing people. Internet worms are an inconvenience and cost money. There is a big difference.
  • by stecoop ( 759508 ) on Monday April 12, 2004 @10:15AM (#8837240) Journal
    I think it's great that attention is being drawn to security. I think that there should be triple damages for a company releasing data defined as private, or released against any agreement you had pre-arranged. Yet how are you going to protect your data when you outsource your transactions to some place that doesn't live by these rules? You can't, except by recognizing that certain corporations outsource and using this information in your decision on whom to use. Evaluate it, and if you feel that this type of outsourcing isn't protecting your data and interests, then don't use said corporation.
  • Blame the users (Score:5, Insightful)

    by heironymouscoward ( 683461 ) <heironymouscowar ... .com minus punct> on Monday April 12, 2004 @10:17AM (#8837251) Journal
    1. Allow insecure software to become entrenched with monopoly power
    2. Watch while a global industry in wormware develops to take advantage of this
    3. Blame the users for not preventing it.

    Excellent strategy, which will help enormously. While we're at it, let's stick a large label on new PCs saying "Warning: this PC is likely to be infected within 5 minutes of connecting to the Internet, but that's your fault."

    Why... why are companies allowed to sell software that has known defects? Surely it's technically possible to ensure that every installation of Windows XP leaves the shop with all necessary patches?
    • not quite (Score:2, Interesting)

      Surely it's technically possible to ensure that every installation of Windows XP leaves the shop with all necessary patches?

      They probably couldn't find every possible flaw and patch it before it leaves Redmond, not due to technical reasons, but because at some point they must keep income flowing in (please no flames here).

      A 100% bugless windows would probably take a very long time (increase cost, increased consumer price), this is not necessarily a bad thing, but may drive the price of the Windows compu

      • They probably couldn't find every possible flaw and patch it before it leaves Redmond

        No arguments here, but I don't think that's what the OP meant. I read it as "why can't PC manufacturers and retailers ensure that no PC is shipped until all the latest (security) patches have been applied?"

        Seriously - if you buy a new PC off the shelf from a store, what's to stop them from plugging it in and patching it? Not only would it help to reduce problems in the short term, it would demonstrate that it worked! Fro
    • Re:Blame the users (Score:3, Insightful)

      by bruthasj ( 175228 )
      Why... why are companies allowed to sell software that has known defects? Surely it's technically possible to ensure that every installation of Windows XP leaves the shop with all necessary patches?

      If it's that easy, why don't you get back to us once you've got it complete.

      This is not meant to be a Troll, but think about the question and think about politics, bureaucracy, red tape, etc. Oh, and you might want to start your own biz too, that helps put things in perspective.
  • by Jameth ( 664111 ) on Monday April 12, 2004 @10:18AM (#8837258)
    For too long, the 37-member task force said, senior executives have ignored computer security or left it to their technology officers, who might not have the clout or inclination to make necessary changes.

    The problem isn't the lack of CEO involvement, it's the lack of clout technology officers have. People seem to ignore the advice of technology advisers of all sorts. If a system administrator says something is insecure, one would think the people who hired them would listen, but they don't.

    This is brilliantly demonstrated by electronic voting. Almost all security experts say it is a bad idea. Almost all technology websites trash the idea. When all the experts in a field say not to do it, the politicians still think it's a good idea. Thus, they are truly fools, for they do not know that they are fools.

    The report is the latest in a series produced as part of an industry partnership [...] Members of the task force included representatives from technology companies

    One of the main flaws to all this: they used representatives from technology companies. Did they never consider talking to security experts? Despite recent changes, the American higher education system has some of the best research institutes in the world, and amazingly enough, there are experts at those institutes! Even better, those experts are relatively unbiased! Oh, the possibilities!

    ...after heavy lobbying from technology companies, the initiative recommended no mandates on the private sector and left it up to the companies to work with the government to devise self-regulatory steps for improvement.

    Strangely enough, that's not the problem. The problem is that there are too many governmental enablers. The government gives all sorts of help to companies that suffer losses from cybersecurity, so they have no motivation to secure themselves. What idiocy.


    I guess that, in general, I would have to say most of these problems are caused by governmental stupidity and corporate vileness, but there is still hope for the future, as there are proposals to force businesses to have regular cyber-security audits, as well as other measures.

    • "Gentlemen, we've got to protect our phoney-baloney jobs." Mel Brooks
    • Look...

      If the janitor[1] comes up to you and says "The front door isn't secure, we need to put a lock on it," he gets ignored. He's only a janitor, gets paid peanuts; what could he possibly know?

      If he puts on a suit and becomes a $200/hour security consultant and charges $15,000 for a security audit coming to the conclusion that, damn, those doors should really have locks on them, he will be listened to. That advice is worth $15,000, after all... Isn't it...

      [1] And yes, this *is* how systems administrators
    • People seem to ignore the advice of technology advisers of all sorts. If a system administrator says something is insecure, one would think the people who hired them would listen, but they don't. This is brilliantly demonstrated by electronic voting. Almost all security experts say it is a bad idea. Almost all technology websites trash the idea. When all the experts in a field so not to do it, the politicians still think it's a good idea. Thus, they are truly fools, for they do not know that they are foo

  • by CajunArson ( 465943 ) on Monday April 12, 2004 @10:18AM (#8837260) Journal
    Right now the current level of technology in commercial OSes (I mean Linux/BSD/etc. too) is not enough to stop worms before they can spread.
    You can (try) to patch all your services and stay ahead of vulnerabilities, but in a very large organization unpatched machines can fall through the cracks, and in a small organization there may not be enough skilled staff to keep everything patched.
    User edjimukation (sic) is all well and good, but unfortunately there will always be a population of Darl's who will willfully ignore best practices and try to do stupid things with viruses and whatnot.
    IMHO there are solutions to at least some of the more stupid problems with security. I think the best ones are through least-privilege enforcement with Mandatory Access Controls (see SELinux as one very good commercially available example; I also like Domain & Type Enforcement for Linux!). With MAC systems root is no longer a god, and you have a much richer ability to limit what users can do with things like email attachments. Worms can also be contained much better, since you define a policy of what a server is supposed to do instead of trying to pattern-match every possible type of malware (an impossible job in the long run).
    So why is this rambling post not entirely OT? Well, a bigger organization like a corporation will have a greater incentive and a greater ability to start experimenting with MAC systems that are both secure and usable in an office environment. Bigger companies have more resources to work with software vendors to iron out bugs and kinks in the system, and then the refined products can start to filter down to consumer-grade products, where security is usually almost non-existent. It is a slow process, but we desperately need better methods and technologies than the standard-issue patch & pray employed in today's networks.
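    The MAC/least-privilege idea above can be sketched concretely. Below is a hypothetical SELinux type-enforcement module; the module name, types, and reference-policy interfaces (init_daemon_domain, corenet_tcp_bind_smtp_port) are illustrative, written in the style of the SELinux reference policy rather than taken from any shipping configuration. The point is that the policy whitelists what the daemon may do; everything else is denied by default:

    ```
    # mailrelay.te -- hypothetical policy confining a mail relay daemon
    policy_module(mailrelay, 1.0)

    # A private domain for the daemon and a type for its executable;
    # init transitions the process into mailrelay_t when it starts.
    type mailrelay_t;
    type mailrelay_exec_t;
    init_daemon_domain(mailrelay_t, mailrelay_exec_t)

    # The daemon may create TCP sockets and bind the SMTP port, and nothing else.
    allow mailrelay_t self:tcp_socket { create bind listen accept read write };
    corenet_tcp_bind_smtp_port(mailrelay_t)

    # Its spool directory is the only place it may write files.
    type mailrelay_spool_t;
    files_type(mailrelay_spool_t)
    allow mailrelay_t mailrelay_spool_t:dir  { search read write add_name remove_name };
    allow mailrelay_t mailrelay_spool_t:file { create read write append unlink };
    ```

    Even if a worm compromises the relay, the kernel still refuses to let anything running in mailrelay_t read home directories, spawn shells in other domains, or bind other ports, regardless of whether the process runs as root.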
  • Security is mayhem (Score:5, Interesting)

    by archonit.net ( 762880 ) on Monday April 12, 2004 @10:20AM (#8837268)
    This is gonna land me in deep water, but it's definitely a two-way affair -
    if the CEOs spend the required money hiring people to take on the responsibility of securing a network, then why is it the CEO's fault?

    If the people being hired are not competent, but played the 'I know what I'm doing' role, then it is still their fault.

    The only time I see it as acceptable that the CEO gets the blame is when the CEO him/herself directly contributes to the lack of security or employee laxness.


    The article, imho, is hinting that if a company were to go down due to security problems, then it's the CEO who gets the blame if, and when, they are led to believe their networks are (or were, in this case) secured by an (incompetent) tech-support guy.

    I say it truthfully AND before I become flamebait: I have the utmost confidence in *most* IT people; it's usually the users who contribute to the problem, not IT departments. But I truly do, in this case, feel sorry for the CEO (with their huge paychecks and massive perks) when they get the blame for something that they did honestly have a go at fixing/preventing.


    Worms/virii are designed to be destructive and disruptive, and there is little to no way that most users will ever learn that they need to be more cautious about security without having their credit card details exposed by a black-hat, or their personal PC brought to a halt by the world's least advanced virus, because the user hadn't patched their virus scanner.

    It's a case of once bitten, twice afraid - and if it's kept that way by the community, as long as it doesn't affect me, then I'm all for it. I just hate cleaning up after one has hit.

    New rule for virii - release a strain to the public and release a quick-repair tool at the same time to slashdot!
    • CEOs get massively compensated if their company's fortune rises because they are considered responsible for that fortune through their hiring decisions.

      It makes perfect sense to hold them responsible for the decisions of their underlings if their hiring decisions prove unfortunate. You'd have a hard time convincing me to feel bad for them if they hired some schmuck to do their internal security and then didn't bother to audit that person independently -- we expect them to do it with the accountants, so w
  • Call Me Crazy... (Score:5, Insightful)

    by nherc ( 530930 ) on Monday April 12, 2004 @10:23AM (#8837286) Journal
    I have always believed that the company creating the software should be held responsible for security holes, bad code, backdoors, etc., in their own damn code.

    Given a way to easily update applications (which virtually every useful and enterprise program has in some form), the only way the end-user should be held responsible is if they haven't stayed on top of these updates.

    I can see gray areas where exploits are unknown to the software creators; however, once made aware, either via direct communication or one of the many vuln/exploit websites, they should be required to fix the vulnerability in a timely manner.

    What really gets me is that MS for example clearly knows that probably 1/2 of the Windows installs are pirated versions and they purposefully disallow the Windows Update feature on these copies. I'm willing to bet a good portion if not most of the trojaned and wormed zombie boxes out there are of this class. Perhaps if MS just sucked it up and turned on Windows Update by DEFAULT and allowed pirated versions to download AT LEAST the critical security updates the Internet would indeed be a much happier place.

    BTW, I'm a predominantly Windows user most of the time, so don't just file this under 'hating'.
    • by Osrin ( 599427 )
      While it's always good fun to craft an argument that you can quickly turn on Microsoft... what you're suggesting would make it near impossible for a startup to get going, and would probably put most of the smaller software companies out of business in a matter of weeks.
      • I disagree. Writing secure code is not going to significantly change the difficulty of starting a software company. In fact, if you can't write fairly secure code in the first place, or patch once a vulnerability is found, then you probably shouldn't be writing software for the masses.

        As for supporting pirated software, yeah, it sucks. But, patch the security holes at least. Put the usability bug fixes and new features in a patch that will check for a legit version beforehand.
    • by WildThing ( 143539 )
      What really gets me is that MS for example clearly knows that probably 1/2 of the Windows installs are pirated versions and they purposefully disallow the Windows Update feature on these copies.

      I seriously hope you are joking! Don't get me wrong, I hate Microshaft just as much, if not more, than the next person; however, what you are saying is that a company that produces a commercial product should support and update that product for any and all persons who steal that product. I, and I hope most others
      • Following that logic, if you steal a car from an auto dealer you should still be able to get service on that vehicle. Or, presuming there is a recall on that vehicle, you should be allowed to get the repair taken care of?!? NO F'n WAY!

        The question is whether the theft removes all the manufacturer liability for defective products. I don't believe it does.

        Let's assume that I own a 2004 Ford Exploder and Ford issues a recall for faulty master cylinders that could cause a total loss of braking ability. No
  • by Anonymous Coward

    If executives these days were accountable for anything... It seems if you wear a suit and grovel enough, you can more or less do whatever you want! Just read the newspaper for examples.

  • NIST, NIACAP, DITSCAP, ITSCAP, DCID, LMNOPCAP .. UGH!! Heck, the government needs to look in-house first. They can't even establish a true "STANDARD" security process for the entire federal government, intel community, and defense department. Everyone wants to work off their own sheet of music. At least a CEO/CIO has to report to the trustees or shareholders if something goes wrong.
  • Chief executives of U.S. corporations and their boards of directors should assume direct responsibility for securing their computer networks from worms, viruses and other attacks...

    This is flat out impossible to achieve without Free and/or Open Source Software. For someone to assume responsibility for their software, they need to be able to proactively deal with defects.

    How can this be done with closed-source software? It can't. Closed-source software (CSS) vendors assume no liability and no responsib
  • Should corporate officers take responsibility for security, including the cyber variety? Of course! One wonders about the logistics of measuring their success, but that's not my point.

    The real day-to-day security problem is not in the CEO's office, at least not exclusively. We've all seen or had passwords on monitors, and under keyboards. We've all seen or used a birthday, family member, or pet as a "secure" password. We've all telneted when we should have SSH'ed, or HTTP'ed when we should have HTTPS
  • new theory (Score:3, Funny)

    by MasTRE ( 588396 ) on Monday April 12, 2004 @11:03AM (#8837575)
    Whenever you hear the term "cybersecurity," don't read the article! It's gov't-related, or some other BS. No non-BS sources use it.
  • by DrugCheese ( 266151 ) on Monday April 12, 2004 @11:11AM (#8837657)
    So the U.S. Government points the finger at all the corporations and says:

    'Because everyone here uses Microsoft, and Microsoft can't get their shit straight, we're gonna have everyone here pay out more money to Microsoft.'

  • by zogger ( 617870 ) on Monday April 12, 2004 @11:18AM (#8837732) Homepage Journal
    In meat world, when a "patch" is needed (a recall of a consumer product), the physical object goes back to the shop, then gets returned with the fix in place. With software, even when it is provided on disk, this doesn't happen; the old physical media, the CD, is allowed to stay around.

    I think if it's a tangible PROFIT they want, then it's the companies duty to provide a patched TANGIBLE product. They should be required to provide a PATCHED install CD, not just skate on saying "there's a downloadable patch available".

    Example in meatworld: Last year I found out two of my small cordless drills were recalled. The company paid to mail the old drills back to them, and they sent me new drills, "patched" (they were basically brand-new drills of a newer "release" style). They DIDN'T just send me, via snail mail or email, a set of instructions on how to "fix" the drills. I WASN'T required to show where I had bought the drills, nor that I had a "license to drill with them" or anything of the sort. I shipped the b0rked drills off to them on their nickel; I got patched drills back.

    I say apply the SAME rules to software on CDs that is produced and sold for a tangible profit. If they want real money, they need to provide real, normal warranties. Make them take your old CD back at their expense and send you a new CD with the patches applied. Lather, rinse, repeat until they realize it's a much better idea to do it *right* in the first place.

    IF they were forced by law to provide a replacement of their industry-alleged "tangible" product that they tangibly "profit" from, it would cost them and wake them up. It would cause one of those "paradigm" shifts in the software world, BUT, in the long run, I would be willing to bet that software would be much more intensely audited and tested before it shipped in the future.

    That, and there REALLY needs to be a law that eliminates the "nothing is our fault, neener neener neener EULA" crap. If they want a tangible profit, they need to have a law applied to them similar to the ones tangible products elsewhere are forced to conform to. It's called normal consumer product warranties.

    A long time ago I could see the need for software to be given a time frame to get up to speed on development. But it is a mature, sophisticated, entrenched, and profitable industry now; these companies can be forced to be treated as competent adults in the marketplace if they are selling a product, no different from other industries. And there should be an actual legal time limit for products that are recallable, and it needs to be MANY years. In some cases, forever.
    FORCE them to provide FREE replacement CDs on a one to one basis, no questions asked, that have all the same functionality of the original product, but have had the patches applied.

    As many times as it takes.

    Yes, "recalls" can be expensive to the company. THAT'S THE POINT. It has been shown in every other industry that it works; it is making for much better products in the marketplace (safer, more functional, better), and these companies are still profitable.

    "Caveat emptor" is NOT the law of the land with other products, because we as a society decided that that sucked, bigtime, and passed laws about it.

    The software companies want it both ways, to be treated as if all their product is a tangible when it comes to profits and income, but they want no responsibility for their "products". Seriously insecure and malfunctioning products everyplace else get recalled. You aren't forced to become your own mechanic and just told how to fix stuff, even if the part is offered.
  • So, which is /.?

    Developers are responsible for secure code? Or is it the Users?

    Remember the legislation that might have forced open source projects to be responsible for the security of their code? Remember the uproar that caused?

    Or is this another friendly gray-area-it-depends-if-it's-convenient-for-my-current-political-agenda issue?
  • by Eric_Cartman_South_P ( 594330 ) on Monday April 12, 2004 @11:27AM (#8837811)
    Going away from "enterprise"-sized businesses and looking at small businesses with 5 or fewer employees (such as myself): I have everything Mac OS X, and I do not care about the 80,000+ windoze virii or trojans. Just... don't... care. I could replace the Macs with lovely Linux and continue to not care.

    The problem is not end users. The problem is not the people writing the virii. The problem is so easy to see and so vanilla that most people have such a hard time seeing something so simple.

    Windows is shit. It's Swiss cheese for virii. It is an all-around horrible OS. I'm not thinking about far earlier versions and where they got us; that part of MS history was rather nice. But where we are... uh... going today (lol) is to hell in a handbasket.

    Security is not a product, it's a process. And step 1 is to get Windoze off of your servers.

    I await the fan-boys who will scream how Win2K with Service Pack 69 is perfect. Jesus help them...

    • The problem is not end users. The problem is not the people writing the virii. The problem is so easy to see and so vanilla that most people have such a hard time seeing something so simple.

      Windows is shit.


      This is so wrongheaded -- not the Windows evaluation, but the rest.

      Yes, OS X is a great, infinitely more secure, OS. Yes, Linux is cool too.

      And YES, the problem is End Users too, and Operators, and Developers, and Blackhats, and, well... us.

      Windows sucks, and it deserves criticism for its security implementati
  • ... is to make a switch to Mac OS X. It'd be costly to buy all the new hardware and software; however, consider that 99% of security problems would evaporate in one swift move. That would certainly lessen the cost of security in the long run.
  • by SuperBigGulp ( 177180 ) on Monday April 12, 2004 @12:15PM (#8838262)

    If you thought PHBs were bad, just wait until your CEO (or even better), board of directors, starts telling you how to secure your/their computer networks from worms, viruses and other attacks.

    The system you get will be the worst melange of marketing-driven products with all the right buzzwords.

  • Corporate CEOs always have put, and will continue to put, security budgets at the bottom of the priority list until, of course, their internal networks are compromised. :)
  • by mabu ( 178417 ) on Monday April 12, 2004 @01:05PM (#8838910)
    I think the situation with "cybersecurity" is part of the much larger problem that (at least in America) people these days are reactive as opposed to proactive.

    Our idea of addressing crime is stiffer sentences and more prisons. Reactive, not proactive.

    Our idea of fighting the spam problem is to pass more laws. Reactive, not proactive.

    Most corporations don't really take security seriously until they have a serious security situation (say that 3 times fast). Reactive, not proactive.

    The same thing goes for users. Nobody worries about viruses or worms until the third time they have to re-install Windows. Reactive, not proactive.

    I have clients who know MS Outlook is a bad program, but they're too lazy to "learn something new"; same thing with IE alternatives. They'll spend 2 minutes installing Firefox and if one web site they use doesn't come up right, then they switch back to IE and blame it on the software.

    Our idea of planning seems to involve reaching our hand out to stick a CD in our CD drive, a CD which promises to be proactive for us.

    It seems, for the majority, our society as a whole always seeks the "solution" to a problem which offers the most instant gratification. We use the adage "If it ain't broke, don't fix it" as an excuse, even when we know something is broken but it hasn't fallen on our heads yet. The new adage should be, "If it doesn't explode in OUR face, then don't fix it."

    I suspect the true solution to this problem lies in reprogramming the mainstream to appreciate the value of planning ahead and the not-always-obvious cause-and-effect relationship therein.
  • It is recommended that everyone reading this update to the latest version of their anti-virus software, and keep their operating system up to date by downloading and installing the latest patches from windowsupdate.com.
