Security IT

The Security Industry Is Failing Miserably At Fixing Underlying Dangers

cgriffin21 writes: The security industry is adding layers of defensive technologies to protect systems rather than addressing the most substantial, underlying problems that sustain a sprawling cybercrime syndicate, according to an industry luminary who painted a bleak picture of the future of information security at a conference of hundreds of incident responders in Boston Tuesday. Eugene Spafford, a noted computer security expert and professor of computer science at Purdue University, said software makers continue to churn out products riddled with vulnerabilities, creating an incessant patching cycle for IT administrators that siphons resources from more critical areas.
  • by jandrese ( 485 ) <kensama@vt.edu> on Wednesday June 25, 2014 @03:16PM (#47318319) Homepage Journal
    It seems like his solution is: Simply don't release code that has bugs in it. Which is kind of like saying that the airline industry would be so much more efficient if we could just get rid of wind resistance.
    • by Lazere ( 2809091 )
      Well, it would.
    • I think the airline industry should concentrate on avoiding airline crashes.

    • More like saying that the airline industry would be much more efficient without human error... in fact, it's pretty much the same thing. Wouldn't it work better if planes didn't need safety equipment or redundant safety checks, and all the passengers and crew moved with perfect timing like they were in some kind of dance routine?

      Human error will always exist. Deal with it.

      • by preaction ( 1526109 ) on Wednesday June 25, 2014 @04:11PM (#47318789)

        I'd say the aerospace industry is dealing with it a lot better than the software industry. Perhaps we should be held to the same standards; maybe then we could earn the title of "(Software) Engineer".

        • I'd say the aerospace industry is dealing with it a lot better than the software industry.

          This is somewhat because the airline industry has been around for far longer, but mostly because their screw-ups usually generate large numbers of dead people.

          • This is somewhat because the airline industry has been around for far longer, but mostly because their screw-ups usually generate large numbers of dead people.

            Or because the FAA holds the airplane manufacturers to an extremely high standard for their software.
            There's no one holding Microsoft or the creator of Flappy Bird to any standard of security.

            /I know /. has some programmers who are familiar with airline standards, so maybe they'll chime in.

            • by penix1 ( 722987 )

              Or because the FAA holds the airplane manufacturers to an extremely high standard for their software.

              Although that may be true, the FAA also requires the backups to software-driven indicators to be mechanical. So, for example, the flight level indicator is duplicated as a mechanical instrument in case the electronic one fails. Same thing with the airspeed indicator, fuel gauges and other critical gauges, especially if you are talking about passenger aircraft. Many even have mechanical backups for hydraulic controls.

        • I'd say the aerospace industry is dealing with it a lot better than the software industry. Perhaps we should be held to the same standards; maybe then we could earn the title of "(Software) Engineer".

          The problem is that subsystems on an aircraft can be transparently seen to be critical or non-critical. A loose latch on the door to the garbage bin in the galley is not likely to take the entire plane down.

          The same can't be said of a computer system. Any program that breaks security breaks it for the entire system.

          • Only if your OS doesn't do proper sandboxing and memory protection. If a bug in your browser couldn't result in your OS getting hacked, or a bug in Steam didn't mean you got a boot virus, wouldn't that be nice? (A rough sketch of the idea follows below.)
            • by Bert64 ( 520050 )

              And what about a bug in the sandboxing?
              Combined with the presence of the sandbox giving the user a false sense of security...

            • The article would classify sandboxing as one of the many layers that the industry has added on instead of fixing the fundamental problem with software development culture that values minimizing time-to-market significantly over security.

              Or maybe I'm putting words in their mouth.
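
            Since this sub-thread keeps circling OS-enforced boundaries, here is a deliberately crude sketch of the principle in Python, using POSIX resource limits to confine a child process. The "./untrusted_plugin" path is a placeholder, and this is nowhere near a real sandbox (no seccomp, no namespaces, no filesystem isolation); it only illustrates the idea that the OS, not the application, enforces the boundary:

              import resource
              import subprocess

              def _confine():
                  # Runs in the child just before exec (POSIX only):
                  # cap CPU time at 5 seconds and address space at 256 MB,
                  # so a runaway or compromised tool can't drag the host down.
                  resource.setrlimit(resource.RLIMIT_CPU, (5, 5))
                  resource.setrlimit(resource.RLIMIT_AS,
                                     (256 * 1024 * 1024, 256 * 1024 * 1024))

              subprocess.run(["./untrusted_plugin"], preexec_fn=_confine, check=False)

            And Bert64's objection above still stands: the confinement code itself can have bugs, so it's a mitigation layer, not a substitute for fixing the software running inside it.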
          • by mysidia ( 191772 )

            A loose latch on the door to the garbage bin in the galley is not likely to take the entire plane down.

            No cell phones on board!

            And don't even think about having passenger/pilot-accessible Ethernet ports on board connected to your flight control system's LAN.

          • by Bert64 ( 520050 )

            More important is the fact that aircraft are operated by trained pilots and maintained by trained maintenance staff - both of whom have to undergo rigorous tests to ensure they are capable of doing the job and have a very good understanding of the aircraft they're working on.

        • All that would result in is software that no one will buy or want. You want to pay $5000 for your new smartphone because it was held to the same engineering standards? The reality is that in a consumer world people get what they pay for, and the vast majority are not willing to pay what it would cost to have the software they use engineered to those standards. If you had a choice between a Samsung Galaxy S5 for $500 and a Brand X with the same features at $5000 because its software was designed to those standards, which would you buy?

          • The cost to the consumer depends on the number of people using the software. If you spend $100 billion on securing Android, that's only a few dollars per phone.
            • Who do you think is going to spend $100 billion on securing it? And that will only be for the current release; what about the $100 billion needed next year, or the year after? Software is evolving and changing so rapidly that the investment isn't a one-off, and at those costs a single failed product becomes enough to bankrupt a company.

      • Human error may always exist, but I think the point is that people aren't learning from their errors. With software, you can find a problem, fix it, and then iterate until all the problems that can be encountered are handled. If you build in robust modules, there is a point where you start to see fewer and fewer errors being introduced into the code. That isn't currently happening. If we really want to, we can build truly bulletproof code modules, but it would take a substantial change in the way things are done.
    • by jellomizer ( 103300 ) on Wednesday June 25, 2014 @03:27PM (#47318399)

      Well companies can do much more to improve on that front though.
      1. Architect the product, not just build it. All too often the focus is on meeting business objectives and security is added later. A product that was well thought out and designed handles security as part of the core design, alongside the business objectives.

      2. No back doors. Design the program so the programmers can't get in without having rights to do so. The password DB should be managed only by the computer; humans shouldn't be able to figure it out.

      3. Infrastructure planning. The web server shouldn't also be the database server. The database should only allow access from select sources, and grant each user only the permissions appropriate to them.

      4. Plan for failure. Assume someone will break into the system, and find ways to minimize the impact. Make sure the salts for your hashes are hard to find, etc. (A minimal hashing sketch follows below.)
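
      To make point 4 concrete, here is a minimal sketch of salted password hashing using only the Python standard library; the function names and parameter choices are illustrative, not taken from any particular product:

        import hashlib
        import hmac
        import os

        def hash_password(password):
            # A fresh random salt per user means identical passwords
            # still produce different hashes, defeating precomputed tables.
            salt = os.urandom(16)
            digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            return salt, digest

        def verify_password(password, salt, digest):
            candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
            # Constant-time comparison avoids leaking how many bytes matched.
            return hmac.compare_digest(candidate, digest)

      One quibble with the wording above: in the usual design the salt is stored right next to the hash and isn't secret at all; its job is to force an attacker who steals the table to attack each hash individually.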

      • by jhol13 ( 1087781 )

        But companies exist solely to make a profit for their owners. Which means "time to market", which means "security is not an option - until it is really needed".

        For example, I am certain that 99% of Facebook/Twitter/... users don't give a shit how secure it is - especially as they know the NSA has unlimited and unaccountable access into it.

      • by Bert64 ( 520050 )

        Not being able to figure anything out is a bad thing. The more complex your system is, the greater the chance of there being bugs, and if your system is important or widespread enough then *someone* will take the effort to figure it out, and will probably understand it a lot better than the people tasked with running it.

        Having a complete understanding of how a system works should not allow that system to be compromised if it's well designed. Never rely on obscurity.

    • It seems like his solution is: Simply don't release code that has bugs in it. Which is kind of like saying that the airline industry would be so much more efficient if we could just get rid of wind resistance.

      You could posit that but the actual quote is:

      Without an investment in computer programming education and a major move by software manufacturers to embed software security concepts early into the development process, the problems will continue to get worse, Spafford said.

      which seems fairly reasonable.

      • by DarkOx ( 621550 )

        Honestly, I think the problem is that the universities don't actually teach CS. They don't even teach programming; they teach C++, C#, or Java.

        Frankly, we would be better off if students were taught in their professor's boutique language that exists nowhere in industry. That would at least move the emphasis toward general theory and patterns. As it stands today, most grads spent all their time memorizing what's in the standard library for whatever language they were taught and don't have any clue how to architect a solution.

        • by tibit ( 1762298 )

          CS is a subfield of mathematics. It's useful in software and computer engineering, but it's the engineering field you should be talking about, not a subfield of what is, in essence, an art [worrydream.com]. And yes, I do agree with Lockhart. Wholeheartedly.

        • Oh please! A CS degree is a license to get a coding job and nothing more (any more).
          No employer is going to hire a coder who doesn't have at least 2 years in the currently fashionable language in the dominant ecosystem.

          The geeks you're talking about are Computer Engineers, but if you're not a top-ranked grad from one of the top-12 schools, you're going to wind up as a codemonkey working for an accountant.

        • So the next thing you know, unsanitized input is being concatenated onto some string and fed to some cousin of eval() in the language du jour.

          After that, we wait for the user keypress with a system("pause").
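
          For what it's worth, the fix for that class of bug has been understood for a long time: keep untrusted data out of the code channel. Here is a minimal sketch using Python's standard sqlite3 module; the table and the hostile input are made up for illustration:

            import sqlite3

            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
            conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

            user_input = "nobody' OR '1'='1"  # hostile input

            # Vulnerable: concatenating input into the query string.
            # The OR clause is parsed as SQL, so this returns every row.
            rows = conn.execute(
                "SELECT role FROM users WHERE name = '" + user_input + "'").fetchall()

            # Safe: a parameterized query treats the input purely as data,
            # so the hostile string matches nothing.
            rows = conn.execute(
                "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

          The same principle applies to eval(), shell commands and HTML templating: pass untrusted input through an interface that cannot reinterpret it as code.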

      • The smarter approach would be to have third-party auditors and certification bodies give particular programs a rating based on their code and processes.

        Excellent idea. Not sure that the insurance is really needed; the trick is simply to market the certification or auditor groups properly. IT PHBs just love Gartner. They'll quote their releases, follow their reports, and buy everything they say without question. So you need an organization like that on the software or software-developer auditing side - Gartner does nothing like that. A similarly positioned organization could easily affect the stock prices or VC funding availability of any software seller.

      • by Bert64 ( 520050 )

        And how would these rating agencies select the code they were going to audit?
        They can't audit everything, so they would prioritise... Vendors would pay to have their code audited, and perhaps try to corrupt the process to get a better rating. OSS code would not be able to pay to get audited, and thus would never have a rating at all.

        There are already various governments operating such schemes; they are extremely expensive and slow, with the final result being a small cartel of incumbent suppliers where the cost of certification keeps everyone else out.

    • Which is kind of like saying that the airline industry would be so much more efficient if we could just get rid of wind resistance.

      Because of my contrary nature, I immediately started wondering if that was actually true. As speed increases, I imagine that fighting drag does get to be harder than fighting gravity, but I don't actually know. But a bigger question is, what about falling out of the sky when your propulsion system fails? No parachutes... you need an active recovery system.

      I think we'd have stuck with trains and boats...

      What would have to happen to physics to eliminate wind resistance?
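
      For what it's worth, a back-of-the-envelope check supports the hunch. Using the standard drag model:

        D = (1/2) * rho * v^2 * Cd * A          (drag force: grows with the square of speed)
        P = D * v = (1/2) * rho * v^3 * Cd * A  (power spent fighting drag: grows with the cube)
        W = m * g                               (weight to be supported: constant)

      So at cruise speed the engines are mostly paying for drag, not gravity. Whether lift could survive at all in a drag-free fluid is exactly the kind of question for the fluid dynamics people below.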

      • High speed maglev in an evacuated tunnel is a better long range mass transit system. You can power it off of solar panels and windmills, it doesn't generate CO or CO2, and if something breaks you just stop.
      • What would have to happen to physics to eliminate wind resistance?

        Not certain here, but I suspect that lift might also be a zero-wind-resistance issue. Any fluid dynamics ppl here?

    • Can we please just stick to car analogies?
  • TL;DR version (Score:2, Insightful)

    by Anonymous Coward

    "We have no consequences for sloppy design and we don't hold organizations accountable for bad things."

  • Clearly Eugene Spafford must be put in charge immediately, since none of the rest of us have figured any of this out!

  • by swb ( 14022 ) on Wednesday June 25, 2014 @03:29PM (#47318413)

    But there sure is a lot of money in selling threat paranoia.

    Plus software vendors are apparently immune from product liability, so they never bear any costs for defects that lead to poor security or for implementing security poorly. If they had liability for this I think you'd see a lot fewer security defects, but probably a lot fewer features as well.

    • Also, programmers would start getting paid like doctors, so costs would rise. (Doctors whose patients were undergoing targeted attacks.)
      • Re: (Score:2, Interesting)

        by Anonymous Coward

        Hah. This is too rich. I'm an engineer. An actual engineer of the traditional, licensed variety. I design physical structures that are used by the general population and have to ensure that they are safe for the next 50 to 100 years. Oh, and that they will survive the next 1-in-5000-year earthquake event, etc. I have a whole lot of product liability for what I put out, and I can assure you, I do not make the same amount of money as a doctor. Hell, I don't even make the same amount of money as most software developers.

    • by AmiMoJo ( 196126 ) *

      I don't think liability would help. For example, car manufacturers are only liable if some design or manufacturing defect causes an accident, not if a third-party attacker cuts your brakes. You could try to argue that they should armour-plate the brake lines, but I don't think you would get very far.

      That's the problem with security. If you put the weakest, most puny and ineffective lock on a door, then hang the key next to it with a sign saying "authorized personnel only", it's still breaking and entering if someone lets themselves in.

  • how will you find time to do it twice?
  • by rujasu ( 3450319 ) on Wednesday June 25, 2014 @03:33PM (#47318449)

    ... substantial, underlying problems that sustain a sprawling cybercrime syndicate, according to an industry luminary who painted a bleak picture of the future of information security at a conference of hundreds of incident responders in Boston Tuesday.

  • by Sleeping Kirby ( 919817 ) on Wednesday June 25, 2014 @03:33PM (#47318463)
    I do have to agree that the current development style/strategy (agile development) is less geared toward solid development and more toward features and getting stuff out there. I think the article is just saying that they should push features and new things less, and spend more on good programming and fixing known bugs. Of course, putting out a bugless program is near impossible, but there's a difference between better prevention and better clean-up.
  • Underlying dangers: the user?

    What we should do is research safer alternative languages (http://www.rust-lang.org/), more sandboxing of who can access what (SELinux, AppArmor), and better and simpler libraries (LibreSSL). No plugin auto-run for untrusted sites.

    Antivirus is cool and all, but it's not as good as fixing the bugs. Unfortunately, it is more profitable.
    • by penix1 ( 722987 )

      No plugin auto-run for untrusted sites.

      Well, you have 2 flaws right there. First, the verification method for "trusted sites" and second, the trust and verification of the trust authority. So you should have stopped at "no plugin auto-runs."

  • Anti-virus is not a solution to the real problem!? Whaat? How can this be?
    • Just because this thread needs a car analogy, too: Antivirus is no solution for crappy software any more than safety belts are a solution for faulty brakes.

  • Working in this industry at several giant companies, the view is simple: the company works for the stockholders, the stockholders demand ever-higher returns, and NOTHING the company does is nearly as important as increasing the short-term stock price. So what money is spent on R&D will be spent chasing shiny new features and the absolute bare minimum level of security and bug fixes required to "continue leveraging the brand". In the meantime, the business will focus on increasing the productivity of the staff it already has.

    • by tibit ( 1762298 )

      The company doesn't work for the stockholders. The company has a mission, and the stockholders who don't agree with it are simply not your stockholders in the first place. They don't bother. The founders of a company are free to set the mission as they see fit. The mission doesn't have to be 100% profit- or ROI-oriented. It's perfectly possible to have a public corporation that's after greater things than money. Just because, for example, Microsoft isn't set up this way doesn't mean it's a law of nature. Far from it.

  • by johnnys ( 592333 ) on Wednesday June 25, 2014 @04:06PM (#47318755)

    The "Security Industry" makes money for the shareholders selling "stuff". Any time they see a problem, they will treat it as an opportunity to sell more stuff, since that is how they make money. If the problem is because the customer has already bought too much stuff, they will still try to sell the customer more stuff since THAT IS WHAT THEY DO.

    So if you want to be secure, what do you do? We all know: You get rid of crappy software, simplify your systems, remove unnecessary cruft and hire developers, network systems people and architects who can build you what you need securely. You do NOT hire the cheapest meat puppets who can find the company website and spell "javascript" and you don't outsource your security to the lowest bidder.

    This requires real effort on the part of the company paying for all this: They need to recognize that the "Security Industry" and their shiny, happy sales droids are just parasites ripping off the public with the "latest and greatest security stuff that will really protect you this time I promise not like all the other times, I really really mean it THIS time!".

    They really need to understand that the RIGHT way to GET Security is to design it in, have the right people building and managing it and proper oversight over all of it. To do that you have to treat it as a profession and a core part of what the company does, not as a "service" or "product" that can be "bought in" or "outsourced" to a low bidder.

    Security needs to be treated as a profession in any company with a significant cyber presence, just like the accounting team, the legal team and the core business functions. Pretending it's "just something that we can buy from a vendor" is short-sighted and ignorant.

    • The solution: make laws that hit board members where it hurts when they can be held responsible for security breaches and the loss of data.

      Fines are a matter of risk management and cost accounting. Jail time is what turns heads.

      • Systems these days are so hopelessly complex, because for convenience they run full-blown OSes (mainly Linux derivatives like Android these days), that guaranteeing security is practically impossible most of the time. Nobody ever knows the system inside-out, so everyone relies on everyone else making their own part of the source tree work properly, without unforeseen, unexpected interactions between software components, or with the hardware.

        Most developers and companies do not have the time and resources to audit the whole stack they depend on.

  • by mrflash818 ( 226638 ) on Wednesday June 25, 2014 @04:17PM (#47318843) Homepage Journal

    Thanks to all of this, and the NSA/GCHQ Orwellian Internet world, I no longer do any commerce online.

    Online for me now is chatting, posting, blogging, /., emailing, sharing source code.

    I no longer do any purchases, or access any online systems that deal with money (banks, credit unions, etc), via the Internet.
    Even in the real world, I try to only get my cash via walk-up to a bank teller. No more ATM use. No more credit card/debit card use, if I can at all help it.

    Is trying to do a cash-only lifestyle a total time suck, and inconvenient? Yep.

    I am certain I can still be a victim, but I am doing what little I can to not be an easier target.

    "Always look on the bright, side of life..." -- Monty Python

  • The title (of both the slashdot post and the original article) is misleading.

    The article cites one Eugene Spafford, who observes that "software makers churn out products riddled with vulnerabilities." That's not the security industry's fault.

    He goes on to tell us that law enforcement is inadequately equipped and that criminals protect themselves by bribing government officials. That's not the security industry's fault either.

    Of the tools the security industry does use regularly he says that, "We’re u

  • by Opportunist ( 166417 ) on Wednesday June 25, 2014 @04:41PM (#47319047)

    Sorry, and I know I'll be very unpopular for this, but the blame is on YOU. Yes, YOU. You there who always have to buy the latest and greatest turd that someone puts into a shiny, sleek piece of plastic and calls it the NEW $whatevergadget. As long as you buy buggy, crappy, spyware-attracting, insecure shit just because OHHHH! SHINY! you get what you deserve.

    Welcome to capitalism. If I can sell you a piece of turd that stinks, why should I waste money on perfume?

  • by EMG at MU ( 1194965 ) on Wednesday June 25, 2014 @04:43PM (#47319067)
    I used to have a retirement account with a certain financial services company. They stored my password in plain text. To recover your password, they would physically mail it to you. This kind of stupidity should be illegal. It should be criminal, and the company should have to pay fines for being asshats.

    Companies don't fix underlying problems because management doesn't see any value in doing so. They also see no risk in having insecure products. Until there are real financial penalties for blatant incompetence regarding security nothing will improve.
  • I've got over a decade of working on networked, embedded devices. With the exception of content security, I have never in my recollection been on a project where a significant effort was devoted to the security of the system.

    I've worked for a company who made devices which process electronic payments. I asked them about security and whether they ever did an audit. The SW veep's response was "We use SSL."

    No one wants to think about it. Security is a hard problem and it blows budgets. Forgetting about security is cheaper, at least in the short term.

  • Up until about 1985, phone-sales thieves were more than welcome in Florida as long as they did not make sales within the state. Local politicians were only concerned with money being brought into town and had no concern about losses by people in other states or nations. Although there was a bit of a crackdown, it really remains somewhat true today. Cybercrime on an international level may well benefit towns in other nations. After all, the thieves buy pizzas at local restaurants and cars at local dealerships.
  • People will run malware for pennies [theregister.co.uk].

    The programmers, sysadmins, and netadmins can only do so much. If you completely lock them down, the users can't do their jobs effectively and/or whine and complain and not buy your software or use your service.

    People do pay more for bulletproof software and systems, but most people aren't buying airliners.

  • The problem is that basically all software is connected to the Internet in some way these days and a lot of the makers of software do not qualify as part of the "security industry" and really have no clue and no interest in making things secure.

  • Systems today are too complex for the users, and even the supposed administrators to understand... And all these added layers of extra "security product" just compound the problem. Many organisations are simply unaware of all the risks because they have no idea how most of these things actually work.

  • Of course, if some morons decide not to fix problems but to exploit them, and to create a market for them, the problem is sure to grow even more.

    "Yes, this car may tip over very easily, but we might need that to assassinate some foreign dignitaries, so we won't tell the manufacturer."

  • It's simple: whenever you hear a developer pass up C for something stupidly overloaded and abstracted like Java, C++, C# or Python, you lose security. Whenever you put an IT "professional" in place who doesn't understand how operating systems work and thinks that Windows is suitable for the server, you lose security. The fact is, whenever you take the easy way out, chances are you're introducing security flaws. This is a two-step issue, first at the development level and then at the administration level.
  • This has nothing to do with the security industry, and everything to do with people who prefer to buy the cheapest product rather than a better quality product.

    Further, this will continue to happen as long as the software industry maintains its ageist view that 'younger is better'. Younger people are not going to have the experience level of older people, which means they will be much more likely to make all sorts of mistakes that older people made when they were younger, but have since learned from.
