Microsoft Makes Major Shift In Disclosure Policy
Trailrunner7 writes "Microsoft is changing the way in which it handles vulnerability disclosures, now moving to a model it calls coordinated vulnerability disclosure, in which the researcher and the vendor work together to verify a vulnerability and allow ample time for a patch. However, the new philosophy also recognizes that if there are attacks already happening, it may be necessary to release details of the flaw even before a patch is ready. The new CVD strategy relies on researchers to report vulnerabilities either directly to a vendor or to a trusted third party, such as a CERT coordination center, which will then report it to the vendor. The finder and the vendor would then try to agree on a disclosure timeline and work from there." Here's Microsoft's announcement of the new strategy.
Paging Tavis Ormandy, Paging Tavis Ormandy! (Score:5, Insightful)
Mr. Ormandy, I think you know what to do. I really found it amusing that they called the blog posting "Bringing Balance to the Force" when it looks to be completely defined by Microsoft with little or no input from the community.
In other words... (Score:2)
"Same old sh_t, different day."
Re: (Score:1)
None, they just redefine darkness as the new standard...
I love that one...
Re: (Score:2)
How many Microsoft technicians does it take to change a light bulb?
None, they just redefine darkness as the new standard...
I love that one...
The day Microsoft builds a product that doesn't suck is the day they build a vacuum cleaner.
I love that one more...
Re: (Score:2)
Re: (Score:2)
On the internet? That'd be a first.
Re: (Score:1, Insightful)
Right... so that is motivation NOT to help M$...
what is the motivation to report to them?
Microsoft has an obligation to protect their customers from security vulnerabilities by responding to them, one they abdicate constantly.
Security researchers have the obligation that ANY academics have. Tell the truth, show your work.
Re: (Score:2)
I'm just saying I feel it IS the public's responsibility not to make potentially dangerous information available to people with malicious intent.
I have no love for MS. I just feel everyone is better off with "Hey you morons, look at the latest exploit" instead of "Hey, general public including innumerable black hats, look at the latest exploit"
Re: (Score:3, Insightful)
Re: (Score:1, Insightful)
Re:I don't get it? (Score:4, Interesting)
Switching the majority OS to GNU/Linux would have one immediate and obvious benefit: the source is widely available and widely modifiable. If we find a vulnerability, it can be diagnosed and patched immediately, without having to wait for a corporation's blessing. Hell, you don't even have to wait for the kernel team's blessing, or any other governing entity. Just post the patch and tell people about it!
It used to be clear that *nix systems were more secure, because they were actual multi-user systems. Nowadays, it's less clear. I'm certain a properly set up SELinux system is still miles more secure than Windows 7, but it's unlikely a common user will have that. However, even if there is no security advantage, I know this: Linux may not be more secure, but it is certainly easier to keep secure.
Re: (Score:3, Insightful)
I fear that you are a troll. Nonetheless...
first off the majority of people wouldn't be able to immediately diagnose and patch because they have no idea how to do that.
Yes, but this does not negate the fact that there are many more eyes looking for flaws. A minority of a ton of people can still be a ton of people. The fact that anybody could diagnose and patch immediately is the important part.
second because linux is open source you would be less secure because it is easier to find flaws and backdoors in a system that you can view its source code.
Yes, and not all of those who find these flaws would exploit them. Many would fix them. Also, as pointed out many times on Slashdot, security through obscurity is not security at all.
and since linux uses a general public License if they request to see your source you have to give it to them because it requires that derivative works also fall under GNU's general public license.
This is a misinformed statement. The GPL requires that any derivative work you distribute be offered under the GPL; it does not let arbitrary third parties demand your source on request, and it imposes no obligation at all if you never distribute your changes.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Nooo not this shit again! (Score:2)
Linux machines are often the servers that have everyone's credit card numbers, trade/military/government secrets, massive processing power and commercial-grade Internet connections, VoIP servers, and all the other real goodies. Each Linux machine is a potential Fort Knox in a world of 7/11s.
And even though these are the minority these days, with most Linux machines being home PCs and geek tinker toys, any Linux machine that is accessible from the Internet on port 22 will be hit with SSH brute-force attempts.
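For anyone who wants to see how much of that brute-force noise their own machine actually gets, here's a minimal sketch in C, assuming a Debian-style /var/log/auth.log and sshd's usual "Failed password" log lines (both assumptions; other distros log to different files and in different formats):

```c
/* count_ssh_failures.c - rough sketch: count failed SSH login attempts.
 * Assumes /var/log/auth.log exists and that sshd logs failures with the
 * string "Failed password"; adjust for your distro's logging setup. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    FILE *log = fopen("/var/log/auth.log", "r");
    if (!log) {
        perror("fopen /var/log/auth.log");
        return 1;
    }

    char line[4096];
    long failures = 0;
    while (fgets(line, sizeof line, log)) {
        /* A line mentioning both sshd and a failed password is counted. */
        if (strstr(line, "sshd") && strstr(line, "Failed password"))
            failures++;
    }
    fclose(log);

    printf("Failed SSH login attempts in this log: %ld\n", failures);
    return 0;
}
```

On a box that has sat on a public IP for any length of time, the count is rarely zero.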
Re: (Score:2)
Re: (Score:2)
This is true only in the same sense that the surest way to world peace is to kill everyone who threatens world peace.
Re: (Score:3, Interesting)
I'm not saying it's the public's job to troubleshoot their shoddy code and develop fixes.
I'm just saying I feel it IS the public's responsibility not to make potentially dangerous information available to people with malicious intent.
I have no love for MS. I just feel everyone is better off with "Hey you morons, look at the latest exploit" instead of "Hey, general public including innumerable black hats, look at the latest exploit"
That does kind of depend quite heavily on the researcher being the first to find the vulnerability, and the vendor allocating enough people to adequately deal with fixing it in a timely manner.
Can you say with any real supportable evidence that either statement is a safe assumption? Because I know I can't. And to be honest, I doubt any researcher worth their title can either. Including the guy who, I imagine, kicked this new policy off by disclosing one he discovered when Microsoft were palming him off with vague responses.
Re: (Score:2)
A general sense of moral obligation not to aid and abet criminal activity?
Fuck that.
How about "Oh shit, this affects us. OH SHIT!"?
Microsoft-Spurned Researcher Collective (Score:2)
I guess they achieved their ends and I wonder if Microsoft will be collaborating with the MSRC in the future. :rolleyes
Following Google (Score:4, Insightful)
Looks like Google's policy announcement from July 20 [slashdot.org] rattled some MS cages.
Re: (Score:2)
Still no apology to Tavis Ormandy. Even though they basically admitted he was right.
Re: (Score:1, Insightful)
How does giving a company 5 days to fix an exploit make him right? If anything this looks like an effort by MS to get researchers to agree to work with MS so that the details aren't released before a patch is ready. What possible reason is there for releasing this stuff anyway? Does it make anyone safer? Unlikely. Most people don't care enough about security in the first place. All the early release of the exploit does is give lazy hackers more ammunition. Because let's face it, even if MS fixed these within 24 hours, most users wouldn't have the patch installed anyway.
Re: (Score:1)
What possible reason is there for releasing this stuff anyway? Does it make anyone safer? Unlikely.
On the contrary, the answer is "possibly". If I know the nature of a security hole in Program X, I might be able to find a way to substitute, sandbox or discontinue Program X in my own workflow and thereby become safer.
Re: (Score:2)
motivation (Score:5, Insightful)
What is the researcher's motivation to spend the extra time working with Microsoft? They certainly have no obligation to do anything Microsoft asks...
Personally, I prefer the Google and Mozilla method whereby researchers are paid a bounty of a few thousand dollars for reporting vulnerabilities in the manner the vendor prefers. Microsoft would be wise to follow the leaders rather than invent their own convoluted process.
Re: (Score:3, Funny)
Even with $40+ billion in the bank, MS would go broke really quickly with that model...
[/snarky]
Apples to Oranges (Score:5, Funny)
Personally, I prefer the Google and Mozilla method whereby researchers are paid a bounty of a few thousand dollars for reporting vulnerabilities in the manner the vendor prefers. Microsoft would be wise to follow the leaders rather than invent their own convoluted process.
There's a fundamental problem with your comparisons. When a security bug is released in Firefox, you see the Mozilla Foundation marvel at the cleverness of the attack. Then a distributed net of individuals quickly works together in an agile way to get the hotfix out, and then some time is spent testing and hardening that fix. When a security bug is released targeting Chrome or any of Google's products, you see Google developers who are comfortable on their campuses pull long hours and work together to push out a fix as quickly as possible. These are all sensible approaches to security bugs.
...
...
With Microsoft, however, you see the heavy thudding of a big corporation. You see a complex inner working of management slow things down. Somebody might ask for an estimate on how much money this is going to cost, and that estimate comes back a week later. Senior management starts shredding documents. Engineers start falling from helicopters in Redmond. A tornado of chairs leaves several injured. Microsoft's campus looks like the Superdome following Katrina. People are chained to their desks. The reason they ask for 60 days is because that's how long it takes FEMA aid to reach Microsoft.
You just can't compare the two.
Re: (Score:2)
funny + insightful = +1 funful
Re: (Score:2)
IOW: MS is too big to turn on a dime.
MS has become what they were striving to replace: IBM.
Re: (Score:1)
They've done what they've set out to do, then?
Or did you mean to throw in that they didn't want to be like IBM?
Re: (Score:3, Insightful)
More like they can't. A problem may be a simple fix inside a problem module, but it's also got to go through rounds of testing to make sure that simple fix actually doesn't break anything. After all, even doing stuff like implementing LUA showed how badly things broke (see Vista).
The problem when you're the giant is you attract all the developers. The problem is, most developers write crap for code, and do things they shouldn't.
Re: (Score:3, Insightful)
Except that scale is not the fundamental problem, organizational culture is.
Re: (Score:1)
IOW: MS is too big to turn on a dime.
That's right. MS sells software FOR the agile business, not software WRITTEN BY an agile business.
Re: (Score:2)
With Microsoft, however, you see the heavy thudding of a big corporation. You see a complex inner working of management slow things down. Somebody might ask for an estimate on how much money this is going to cost and that estimate comes back a week later. Senior management starts shredding documents
Honestly? Really? You don't think they have high/critical priority bugs, which get instant visibility right up the escalation tree, with managers pushing the rest of the people to get a fix out quickly? I've worked for some "big corporations", and when the shit hits the fan, the pressure from above increases immensely. Everyone mucks in, works long hours, gets stuff done.
Big companies can sometimes take a long time to change direction, or "get it" - but when it's something as fundamental as a very large security hole, they can and do move quickly.
Re: (Score:2)
This video may provide some insight:
YouTube - Clay Shirky: How cognitive surplus will change the world [youtube.com]
Bug finders are both producers and consumers of the actions and consequences in the process. Finding and reporting security bugs is a civic action (as opposed to a communal one). Having the bargain for this action be based on economics instead of social rewards and punishments may have an adverse effect. So it may be possible that people who get paid for reporting bugs may feel that they have no obligation beyond the paycheck.
Sudden outbreak of common sense (Score:4, Insightful)
So they are formalizing common sense into a policy.
It is a lot better than the previous formal policy of bat-shit crazy.
Anything's fine, as long as they communicate (Score:2, Insightful)
Re: (Score:1)
amen. Ahem, why is this flamebait?
Re: (Score:2)
You know your bank, your hospital, your tax center has it.
You know that there is an option to deactivate as a workaround.
You know that many people are actively searching for this kind of vulnerability, and it may be exploited right now.
And you see Microsoft claiming their product is the best and the most secure everywhere.
You can wait, yes, but I am not sure which is the more responsible way of acting.
Good luck getting Apple to agree (Score:5, Informative)
Re: (Score:3, Insightful)
I will clarify this for you.
Apple is an insular and paranoid company. They are built upon the myth that the Mac/iPhone/iPad/iPod platform is "safe". They are selling an image of computing platforms that are safe and secure for the end user. Reality does not agree with Apple.
Most responsible researchers will play Apple's game, and part of that game is sending out inaccurate and vague responses as to when they may (or may not) fix what vulnerabilities have been found. I think it's helpful for people to know that going in.
My first thought (Score:1)
Assuming I care (Score:2)
If I happened to run across a vulnerability tomorrow, I would be inclined to publish it that very day. Microsoft assumes I care about the well-being of the company and its customers, when really I don't. I know this is aimed more at security researchers, but then again they may very well feel the same way.
Here's a radical idea: (Score:3, Funny)
Here's a radical idea: How's about they don't release tons of fresh code every cycle, and instead maybe check the code over first for buffer overflows, NULL pointer abuse, heap munging, and all the other obvious ways of executing code?
Just sayin'
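For anyone who hasn't seen what "check the code over for buffer overflows" looks like in practice, here's a minimal illustrative sketch in C (made-up function names, not taken from any Microsoft code): the unbounded strcpy is the classic bug class, and the snprintf version is one bounded way to write the same thing.

```c
/* overflow_demo.c - sketch of the classic stack buffer overflow and one
 * bounded alternative. Purely illustrative. */
#include <stdio.h>
#include <string.h>

/* Unsafe: no length check, so any input of 16 or more characters
 * writes past the end of buf and corrupts the stack. */
void greet_unsafe(const char *name)
{
    char buf[16];
    strcpy(buf, name);               /* overflow if strlen(name) >= 16 */
    printf("hello %s\n", buf);
}

/* Safer: bound the copy to the buffer size; snprintf always
 * NUL-terminates and silently truncates overly long input. */
void greet_safe(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);
    printf("hello %s\n", buf);
}

int main(void)
{
    const char *attacker_input = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";
    greet_safe(attacker_input);      /* truncated, no memory corruption */
    /* greet_unsafe(attacker_input);    would smash the stack */
    return 0;
}
```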
OSS vs CSS vulnerability reporting (Score:4, Insightful)
CSS (closed-source software): find a bug, see a lawyer, contact a CERT, wait several weeks for a response, sign an NDA, share the vulnerability information, wait 2 months, ask for status, wait 4 more months for an answer, realize that the vendor will do squat about the vulnerability as long as its customers don't know how threatened they are, release the info to the public to put pressure on the vendor, be threatened by the vendor's lawyers, be called a criminal by the vendor's customers, the press, and politicians, have your house searched, wait 2 more months, get a patch, realize that it doesn't fix the problem, rinse and repeat.
Please define "ample time" (Score:2, Troll)
I am very curious how Microsoft defines "ample time" especially considering some of their vulnerabilities (like the one recently "patched" in the DOS subsystem) have existed for years or decades.
This isn't a slam at Microsoft; it's a hope that someone has some clarification that can be used as context to determine whether this statement means anything. Even when the terms of their statements are less ambiguous, they seem to find ways of backpedalling, thus greater clarity on something so very ambiguous is welcome.
Re: (Score:2)
LoL, someone who doesn't know much about computers got mod points. One can choose not to like the truth, but, as even Microsoft themselves admitted, this is NOT a change in policy - it's a change in NAME only.
Re: (Score:2)
Why is this modded "troll"?
"Insightful" is more appropriate. Near as I can tell, this post is dead on.
REN "responsible disclosure" "CVD" (Score:1)
"[CVD] is the same thing as responsible disclosure, just renamed," repeated Reavey. "When folks use charged words, a lot of the focus then is on the disclosure, and not on the problem at hand, which is to make sure customers are protected, and that attacks are not amplified."
http://www.computerworld.com/s/article/9179546/Drop_responsible_from_bug_disclosures_Microsoft_urges [computerworld.com]