
Serious Web Vulnerabilities Dropped In 2011

wiredmikey writes "It's refreshing to see a security report from a security vendor that isn't all doom-and-gloom and loaded with FUD. Web application security firm WhiteHat Security released a report this week (PDF) showing that the number of major vulnerabilities has fallen dramatically. Based on the raw data gathered from scans of over 7,000 sites, there were only 79 substantial vulnerabilities discovered on average in 2011. To compare, there were 230 vulnerabilities on average discovered in 2010, 480 in 2009, 795 in 2008, and 1,111 in 2007. As for the types of flaws discovered, Cross-Site Scripting (XSS) remained the number one problem, followed by Information Leakage, Content Spoofing, Insufficient Authorization, and Cross-Site Request Forgery (CSRF) flaws. SQL Injection, an oft-mentioned attack vector online, was eighth in the top ten."
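As an illustration of the two flaw classes bookending that list, here is a minimal Python sketch, not taken from the report itself; the table name, column names, and sample values are hypothetical. It shows how XSS and SQL injection typically arise and how they are usually closed: escape untrusted text before it reaches HTML, and bind untrusted input as query parameters instead of splicing it into SQL.

import html
import sqlite3

def render_comment_unsafe(comment: str) -> str:
    # XSS: untrusted text goes straight into HTML, so a value like
    # "<script>...</script>" executes in the reader's browser.
    return "<p>" + comment + "</p>"

def render_comment_safe(comment: str) -> str:
    # Usual fix: encode the untrusted text for the HTML context first.
    return "<p>" + html.escape(comment) + "</p>"

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # SQL injection: input such as "x' OR '1'='1" rewrites the query itself.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # Usual fix: a parameterized query keeps the input as data, not as SQL.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")
    print(render_comment_safe("<script>alert(1)</script>"))
    print(find_user_unsafe(conn, "x' OR '1'='1"))  # every row leaks
    print(find_user_safe(conn, "x' OR '1'='1"))    # returns [] -- input stays data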
  • "It's refreshing to see a security report from a security vendor that isn't all doom-and-gloom and loaded with FUD."

    They're doing it wrong. Don't assume that if you can't see it, it isn't there.

    • by Anonymous Coward

      Exactly.
      Read this article for more on that: http://www.schneier.com/blog/archives/2012/06/the_vulnerabili.html
      And this EFF essay: https://www.eff.org/deeplinks/2012/03/zero-day-exploit-sales-should-be-key-point-cybersecurity-debate

      Bottom line:
      "We've always expected the NSA, and those like them, to keep the vulnerabilities they discover secret. We have been counting on the public community to find and publicize vulnerabilities, forcing vendors to fix them. With the rise of these new pressures to keep zero-da

  • Give me the address there.

  • there were only 79 substantial vulnerabilities discovered on average in 2011.

    It's one data point, isn't it? What exactly are they averaging here?

  • by Anonymous Coward

    Vulnerability statistics for all CVE data are available here: http://www.cvedetails.com/vulnerabilities-by-types.php
    The statistics for all CVE data show a trend similar to the WhiteHat report.

  • by fuzzyfuzzyfungus ( 1223518 ) on Saturday June 30, 2012 @04:42PM (#40506857) Journal
    Unfortunately, 'Mark Zuckerberg', 'The Nation State', and 'Google' remain on the list of outstanding serious web vulnerabilities, leading some to wonder whether it would be necessary to introduce a system weighting the seriousness of vulnerabilities as well as merely enumerating them...
  • by Pf0tzenpfritz ( 1402005 ) on Saturday June 30, 2012 @04:56PM (#40506911) Journal

    a security report from a security vendor that isn't all doom-and-gloom and loaded with FUD?!

    OMG, what next? A calf with two heads? We're doomed!

    • A calf with two heads? Do you have any idea how awesome that brand of head cheese would be? They could probably charge double per lb.
  • The most serious web vulnerability sits in the chair.

    --
    BMO

    • by gweihir ( 88907 )

      Indeed. And fixing it can take up to 45 years...

    • That is a false perception.

      The story describes XSS and Flash vulnerabilities, not people who click "DOWNLOAD HERE". Almost all Windows users who say they do not run AV software and are clean are in fact heavily infected. Mainly it is just a bad ad that uses Flash to root the system; even Slashdot had one a few months ago that I reported to them.

      Worse are the idiots who feel XP is superior and run their account as a full administrator.

      Most users know better today, but I have had my system hosed by a Flash ad.

      • by bmo ( 77928 )

        >Mainly it is just a bad ad that uses Flash to root the system

        Oh I know all too well. We had that on Investor Village once.

        >That (user error is the biggest part of malware propagation) is a false perception.

        The user is not always to blame, and drive-by installs exist. There is a caveat to this: the vast majority of web-based malware comes from pages designed to trick the user into downloading and installing something, i.e. social engineering.

        We can call this the "dumb user problem" since there are no oth

        • Users are just that: users. They are not pedantic wannabe security gurus who think they actually know what they are doing; they just want to run their applications. Most users have better things to do with their time than sit around nitpicking obscure security issues, most of which can only be duplicated in a controlled lab environment using specifically defined steps. Those who talk about nothing but OS security vulnerabilities never seem to realize that the purpose of an OS is to run applications.

    • And more importantly, there tends to be confusion between the part in the chair and the part approximately 30 inches above it.

  • Websites are so god-awful now, packed with ten dozen scripts, Flash, and embedded garbage, that they are practically their own viruses.

  • by gweihir ( 88907 ) on Saturday June 30, 2012 @05:29PM (#40507119)

    As I see no technical reason for web applications to be less vulnerable, my guess is that black hats who find vulnerabilities are just more careful with them in order to be able to exploit them longer.

    The other reason I can see is that the metric is wrong. It may just be that the vulnerability types have changed and the metric used by this report has not kept up.

    Anyway, no reason to celebrate. Practical IT security is still in a very sad state, and I do not see this changing anytime soon. By now I believe that the currently active developer generations will have to retire and be replaced by ones with security awareness. As this "new" generation is still not being educated, the problem will be with us for at least several decades.

  • It seems the crackers are now using dirty sites and SEO to lure ignorant users to them, instead of targeting legit sites and injecting them with malware for drive-bys like before.

    Anyone else notice that when searching for something technical in Google you will see comments which are identical across like 5 sites, where 4 are just copied from the 5th? Some do not even have domain names, since AV software can detect and block these. The comments are copied to make the site hit SEO numbers, and have tons of ads that play vi

    • I always go back and block those domains in Google. I don't know if they use that information for ranking, but at least my own results are cleaner.

      • I clicked them before. They mostly just throw ads that do click fraud and, of course, a "download this here to fix it!" link, which of course is malware.

        I didn't know you could report those domains. I should. I never click on the ones with IP addresses only. The point is that the bad guys are now using this approach because AV software and newer versions of Windows are more protected and improved. No one uses IE 6 to browse the web anymore, and most prefer Chrome now, so these kinds of infections are harder to pull off as zero-day exploits get fixed.

  • It seems like in the last year or so there have been far more companies reporting their data being compromised than in past years.

    In any case, I'd say that between LulzSec and Anonymous, the hunt for and the arrests of these asshats might just be causing them to lie low for a while.

  • by MtViewGuy ( 197597 ) on Sunday July 01, 2012 @08:05AM (#40509889)

    I think the vulnerabilities are dropping because the three most commonly used browsers, Internet Explorer, Chrome and Firefox, have all been patched and/or upgraded on a fairly frequent basis for a couple of years now. Besides Microsoft's once-a-month (sometimes more) patches for IE, Chrome and Firefox are now on much faster update/patch cycles, and I think that has cut down on the number of browser-based malware attacks.
