Microsoft: As of October, 1024-Bit Certs Are the New Minimum
way2trivial writes with this snippet from Information Week about a warning from Microsoft reminding Windows administrators that an update scheduled for October 9th will require a higher standard for digital certificates. "That warning comes as Microsoft prepares to release an automatic security update for Windows on Oct. 9, 2012, that will make longer key lengths mandatory for all digital certificates that touch Windows systems. ... Internet Explorer won't be able to access any website secured using an RSA digital certificate with a key length of less than 1,024 bits. ActiveX controls might be blocked, users might not be able to install applications, and Outlook 2010 won't be able to encrypt or digitally sign emails, or communicate with an Exchange server for SSL/TLS communications."
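For admins who need to re-key ahead of the update, here's a minimal sketch of generating a key that clears the new floor. It assumes Python's third-party cryptography package, and the 2048-bit size is just the commonly recommended choice; neither is something the article or the update itself specifies.

    # Minimal sketch (assumptions: Python "cryptography" package; 2048 bits is
    # the common recommendation, not a value the Microsoft update mandates).
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Serialize to PEM so the key can back a new CSR or certificate.
    pem = key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.TraditionalOpenSSL,
        encryption_algorithm=serialization.NoEncryption(),
    )

    print(key.key_size >= 1024)  # True: this key clears the new minimum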
Why 1024? (Score:5, Interesting)
Systems have the ability to go further, so why not make 2048 the minimum? Does anyone know why 1024 was selected? I would guess it has to do with backwards compatibility with something. Some of the issuers already make it next to impossible to get anything below 2048.
Re:Why 1024? (Score:5, Interesting)
Thinking much the same thing here as well. Even a CA like GoDaddy won't take anything smaller than a 2k cert key.
Most SSL certs we cook up have a 2048 minimum anyway, and some certs we use have a standard of at least 4096 (I work in the banking/financial industry, so we're used to using the bigger keys).
I'm thinking they stuck with 1024 because IIS 7.x (Win2k8 Server) allows a minimum 1024-bit key size when generating CSRs, and (maybe? can't remember) the really old crap (IIS 5 or 4?) won't handle anything bigger, which means enterprises with those old installs will scream bloody murder if they have to re-key but can't meet the minimum length.
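To see which side of that problem you're on, a rough audit along these lines can flag certificates that fall under the new floor. The file name, function name, and use of Python's cryptography package are illustrative assumptions; only the 1024-bit threshold comes from the article.

    # Illustrative audit helper (hypothetical names; only the 1024-bit
    # threshold comes from the article). Flags a PEM certificate whose RSA
    # public key is shorter than the new minimum.
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import rsa

    def key_too_short(pem_path, minimum_bits=1024):
        with open(pem_path, "rb") as f:
            cert = x509.load_pem_x509_certificate(f.read())
        pub = cert.public_key()
        # Only RSA certificates are affected by the update described above.
        return isinstance(pub, rsa.RSAPublicKey) and pub.key_size < minimum_bits

    print(key_too_short("server.pem"))  # True means the cert would be rejected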
Re:open source (Score:3, Interesting)
Do you oversee Red Hat's build servers? Did you oversee Debian's SSH build when they fucked it up?
Thanks for so clearly spelling out one of the great advantages of the Linux ecosystem. Namely, that a vulnerability in RedHat isn't necessarily a vulnerability in Debian so the damage doesn't propagate to the overall community of users. That's one of the great things about there being so much diversity and unique approaches to Linux. Again, thank you and I commend you on your evangelism of Linux. People need to know!
Re:open source (Score:5, Interesting)
Not true when kernel.org itself gets hacked.
On the contrary. Which distros actually compiled and released a kernel built from code downloaded during the window this attack was in effect? If you're running Debian, your kernel is anywhere from brand new to two years old on the stable branch. And if you're doing the right thing and using Ubuntu LTS releases instead of the interim beta stuff, it's the same deal. With Windows, there are only two mainstream releases: the server and the desktop versions. So whatever kernel MS builds, that's the one everybody uses. With Linux, even with kernel.org getting hacked you have a fighting chance; with Windows, you're done.
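For what it's worth, the kind of check that limits the damage from a compromised download site is easy to do by hand: compare a source tarball against a checksum published through a separate, trusted channel. A rough sketch follows; the file name and the expected digest are placeholders, not real kernel.org values.

    # Rough sketch: verify a downloaded tarball against a checksum obtained
    # out of band. The file name and digest below are placeholders.
    import hashlib

    EXPECTED_SHA256 = "<digest published through a separate channel>"

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of("linux-3.x.tar.gz") != EXPECTED_SHA256:
        raise SystemExit("tarball does not match the published checksum")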
Re:Why 1024? (Score:5, Interesting)
With Win2000, the US lifted export restrictions only a month after it RTMed in Dec 1999, so MS had to ship the high encryption pack on a floppy disk inside the Win2000 package in addition to making it available for download. SP2 finally built it in.
Key length is the least of concerns for SSL (Score:5, Interesting)
Re:open source (Score:4, Interesting)
Certainly no one would accuse Google of hiring slouches.
No, but I would accuse them of having hiring practices that discourage creativity (even if their employment practices promote it).
I interviewed with Google a little while back. Right at the start I told them I was not interested in the job they were offering, as it's somewhat "below" what I currently do (and would require moving to a more expensive city for about the same pay I'm on now). They said they'd like to interview me anyway and perhaps after that offer me a job that would better fit my skills.
The short version is that after going through their rather long and drawn out process, involving mind-numbingly boring "solve this well known algorithm problem" questions, they offered me the job that I said I didn't want. After I turned them down, they then sent me a letter saying that "after consideration, we don't think you're a good match for Google".
Personally, I would've really liked to work there, but NOT as a code-monkey on their generic sites. I'm a pretty good developer (although by no means brilliant), but where I really shine is creating new things from scratch. I'm an ideas person with the technical aptitude to put those ideas into practice. Their hiring process showed me very clearly that they had no interest in my creativity and only wanted someone who can churn out code, find bugs, and patch systems to keep them running (all important, but not the only things in the world, and definitely not for me).
Fragmentation is a good thing (Score:4, Interesting)
Up to a point fragmentation or variety is a good thing. And not just in software. In agriculture, if your field consists of only one crop, your goose is cooked if there's an outbreak of a plant disease. A country whose GDP comes from a single source, say oil or a single cash crop, is also more vulnerable to price fluctuations in the global market. A crash in the prices of that product would lead to a crash in the country's economy as well.
Too much fragmentation, of course, is bad. But as far as Linux goes, the major distros are quite few: Ubuntu, Red Hat, Fedora, Debian, and possibly SUSE. It's their derivatives that give the impression of excessive fragmentation. Derivatives tend to be compatible with the parent distro, at least as far as installing third-party programs not in the main repository. A binary-only printer driver that runs on Ubuntu would be compatible with Linux Mint, for example.