FSF Responds To Microsoft's Privacy and Encryption Announcement
An anonymous reader writes "Microsoft announced yesterday their plans to encrypt customer data to prevent government snooping. Free Software Foundation executive director John Sullivan questions the logic of trusting non-free software, regardless of promises or even intent. He says, 'Microsoft has made renewed security promises before. In the end, these promises are meaningless. Proprietary software like Windows is fundamentally insecure not because of Microsoft's privacy policies but because its code is hidden from the very users whose interests it is supposed to secure. A lock on your own house to which you do not have the master key is not a security system, it is a jail. ... If the NSA revelations have taught us anything, it is that journalists, governments, schools, advocacy organizations, companies, and individuals, must be using operating systems whose code can be reviewed and modified without Microsoft or any other third party's blessing. When we don't have that, back doors and privacy violations are inevitable.'"
PR Stunt at best (Score:5, Interesting)
Re:PR Stunt at best (Score:5, Insightful)
Re: (Score:3)
In fairness, it would not require "free software" to accomplish the openness. It would however require the source code for the encrypting software to be freely available to review, inspect, compile, and compare to what is installed.
"Free" software does this for you by nature, but a company could do the same thing. Microsoft "won't", but absolutely "could". Sun did it, HP has done it, IBM has done it, Cisco has done it, etc.. etc...
Microsoft would not do this however, because it would open up the nasty cr
Re: (Score:2)
And here I am assuming no backdoor in hardware or firmware, which in 2013 is quite a leap of faith.
Open Source has a hard time providing the minimum trustable stack (the BIOS is the current obvious weak link), and I don't see Microsoft doing that any time soon.
Re: (Score:2)
Re: (Score:2)
Even with access to the source, we're talking about running services rather than code you run on your own hardware. There is no reason to believe that the source they provide is the same as they're running, and there's no way to tell who else has access to their systems.
Most other big providers such as Google and Yahoo run most of their stuff on open source software, so while we have the code we have no way to tell what they're doing with it.
Silly question (Score:5, Insightful)
How would I find out, personally, that Linux Mint is sharing keys with the NSA? The likelihood that I would personally discover that secret is somewhere between slim to none. I can't read code well enough, nor am I likely to spend the time necessary to read every line of code in the programs.
My assurance stems from:
1. Thousands (at least) of other end users actually do peruse the code, looking for errors, back doors, exploits, etc.
2. My OS comes from a "trusted source" - one which I personally trust.
Yes, there is a weakness in there. That weakness is, I have to trust someone. At the same time, there is a strength hidden right beside the weakness. I get to CHOOSE who I trust.
What, exactly, has convinced you that you can actually trust Microsoft? Has MS invited you to personally examine their code, to satisfy yourself that there are no exploits in their system? No? I didn't think so.
Linux, on the other hand, invites me to read any or all of their source.
You choose what you want, I'll choose what I want, thank you very much.
Re: (Score:2)
What, exactly, has convinced you that you can actually trust Microsoft? Has MS invited you to personally examine their code, to satisfy yourself that there are no exploits in their system? No? I didn't think so
Microsoft has many thousands of former employees who once had access to the source, with little to lose from anonymous whistleblowing. There are likely as many eyes on any important bit of MS code as open source code, given the size of the company. The backlash for getting caught lying is huge. That's why all the "big lie" companies are so pissed at the gag orders that accompanied their demands for information.
Let's not forget that open source vendors are just as vulnerable to this sort of arm twisting - a
Re: (Score:3)
Microsoft has many thousands of former employees who once had access to the source, with little to lose from anonymous whistleblowing. There are likely as many eyes on any important bit of MS code as open source code, given the size of the company. The backlash for getting caught lying is huge. That's why all the "big lie" companies are so pissed at the gag orders that accompanied their demands for information.
Let's see. So what you're claiming here is that every employee at Microsoft not only has access to every piece of code but that they've actually gone through that code in enough detail to spot any NSA Easter eggs.
First, have you ever worked on a large proprietary software project? From my experience it's lucky if three people even look at any given piece of code much less take the time to really understand what it does. Even in support mode they're typically only going to look close enough to fix the bug the
Re: (Score:2)
I'd say the better analogy would be a school that doesn't allow parents on the campus -- ever, and you send your kid there and one day they come back with an education. You ask them how it went: "I don't know. OK, maybe." A typical answer from a kid, just as you expected, right?
Another school allows parents to come in and tour the campus, and even visit classes any day they like as long as they aren't interrupting a class. You can meet with teachers and talk to them off hours and get a phone number to call if th
Re: (Score:3)
Are you going to trust that 1 person in that million to do it for you?
Proportionally, very few programmers write Open Source software. And yet here we are, with the Linux kernel, Firefox, GCC, etc. It's always a minority that get things done. The fact that most users of Open Source don't read the source-code doesn't render irrelevant the (proportionally) few that do.
You're delusional if you think it's better than the alternative to trusting MS.
It's substantially better than trusting MS. In the closed-source model, they barely even have to bother hiding the backdoors. You are deliberately prevented from vetting the program.
If I build you a car, and I ref
Re: (Score:2)
And how are you expecting to find out, even if you have access to the source, whether a Linux distro is sharing keys with the NSA? Or has built-in exploitable vulnerabilities? It's not like there's going to be a commented subroutine that stands out. A series of unrelated conditions that are hard or impossible to spot can be enough. Widely used OSS software has had undiscovered critical vulnerabilities for decades.
I'm no crypto expert. I really know very little about it except that there are keys that are used to encrypt the information. But would it be possible for the OS vendor/maker to simply allow the user to enter another key when installing the OS, such that without that key it would be very hard to decrypt the information? If I'm not completely bolloxed on this, the NSA would have to get the owner's key to make any sense of the data. I am assuming/trusting that the OS does apply the owner's key to all user generated data. T
Re: (Score:2)
There's something like that for drive encryption (in case someone gets your hard drive), but programs that are running are intentionally past that restriction.
Programs communicate with one another, and with servers on the Internet. If one sends out more data than it should, you can't do anything about it if it's a closed-source client. If that server shares more than you'd like, you can't do anything about it either. The closed-source client (& client OS) prevents the verify stage of "trust, but verify".
Re: (Score:2)
Re: (Score:2)
Also, specifically in regards to a "Linux distro sharing keys with the NSA," if you're that worried about it, fork it and take care of security yourself. Use your own keys. One of the major benefits of free software is that you're n
Re: (Score:3)
FUD! Sure, Debian "could" provide keys to a default service at installation time. The number of eyes watching what happens in the distro would ensure that the community knew of such a cookie-cutter key. It would be announced, and patched to generate a new key dynamically. The beauty of open source is that it's not just "The Bobs" who know what happens. There are thousands of people who test, because they enjoy testing. There are thousands who patch, because they enjoy patching. All of this process
Re: (Score:3)
http://techcrunch.com/2013/09/27/microsoft-received-37196-gov-requests-for-data-impacting-66539-accounts-in-first-half-of-2013/ [techcrunch.com]
Re: (Score:2)
Look, we are all worried over nothing; this encryption means that only one 256-bit key will unlock your data, or a paperclip -- but it's merely coincidence that the encryption has two locks on the door and one of them is always the same.
I'm waiting to see Clippie on an upcoming episode of "VH1; Where are they now?"
Re:PR Stunt at best (Score:5, Insightful)
we are going to do everything we can within current technical and legal bounds to address this for them
My point is that they are not doing everything they can; they are instead pursuing a cosmetic measure that will make no real difference to what customers are concerned about. How about, for example, providing me with the ability to use my own keys that are never stored on an MS system?
Re: (Score:3)
If the fundamental core of your argument is that MS* will just give the Private keys to the government ... then what is to stop any company, open software stack or otherwise from being forced to do the same??
If key management and use is shifted to you (the end user), by virtue of using open SW and trustworthy HW, then at least you know that someone is after you when a letter demanding that you give up the keys shows up in your mailbox.
Who cares? (Score:2, Insightful)
Who cares if the software is non-free? That's not even the issue.
"Microsoft announced yesterday their plans to encrypt customer data to prevent government snooping. "
And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.
Re:Who cares? (Score:5, Insightful)
Who cares if the software is non-free? That's not even the issue.
You are correct, the issue is that it must be open source and buildable from source.
Re: (Score:2, Insightful)
Who cares if the software is non-free? That's not even the issue.
You are correct, the issue is that it must be free software and buildable from source.
FTFY.
Re: (Score:2, Troll)
Re: (Score:2)
Re:Who cares? (Score:5, Insightful)
Right. Because No Such Agency would never be able to find a way to read data encrypted by an open source program. Why, that's a magical band-aid for everything!
It makes things more difficult for them. Instead of having a neat backdoor they either have to insert obfuscated code, which could be detected or replaced at any time, or convince people to use weak algorithms. Being open source, people can select any algorithm they want - AES, Twofish, Serpent, Elliptic Curve, or rot13. The chances are that not all of them will be compromised. (If they all are, then open or closed source doesn't matter - you're screwed either way.)
algorithms (Score:3)
Not only can the end user choose which algorithm, they can come up with their own. The right to read and modify the source code ensures that the truly paranoid can modify that source code, in whatever way they choose, to actually ensure that their stuff is secure.
Little Joey Nerd decides that he really, really, REALLY doesn't want anyone to read his stuff. Three pass encryption results - first with Blowfish, then with his own home brewed encryption, and finally with AES. So, the attacker understands AES
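For the layering idea, here is a minimal Python sketch (assuming the third-party cryptography package, and using two independently keyed Fernet passes rather than the specific Blowfish/home-brew/AES combination above):

    # Illustrative cascade encryption: each layer has its own key, so
    # recovering one key alone still leaves another layer of ciphertext.
    from cryptography.fernet import Fernet

    inner_key = Fernet.generate_key()   # first pass
    outer_key = Fernet.generate_key()   # second pass

    plaintext = b"Little Joey Nerd's secrets"
    layer1 = Fernet(inner_key).encrypt(plaintext)
    layer2 = Fernet(outer_key).encrypt(layer1)

    # Decryption peels the layers in reverse order and needs both keys.
    assert Fernet(inner_key).decrypt(Fernet(outer_key).decrypt(layer2)) == plaintext

Whether cascading actually adds strength is debatable; the point is only that the user is free to choose and combine algorithms as they like.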
Re:Who cares? (Score:5, Insightful)
And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.
Things like these are still a step forward, as the NSA has to actually ask for the keys from companies, instead of just passively snooping everywhere it wants to.
Re: (Score:3)
Though it's worth noting that Microsoft has a history of being particularly inept at implementing encryption. Best intentions, sadly, do not make for secure code.
Re: (Score:2)
And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.
Why does Microsoft even need my private key? Take e-mail, for example. I have a private key locally and a public key that I share with those needing to correspond with me. Someone needs to send me a message, they look up my pubkey, encrypt their message, and send it through the tubes. I decrypt it upon receipt using my privkey. Why is Microsoft not in the business of managing public keys for its users and forwarding messages? That's basically all we need. It's the founding principle of the Internet. Push all
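The workflow described above looks roughly like this in Python (a hedged sketch using RSA-OAEP from the third-party cryptography package; real mail systems such as PGP encrypt a per-message session key this way rather than the message itself):

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # The recipient generates a keypair; the private key never leaves their machine.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )  # the only thing a provider would need to publish on the recipient's behalf

    # The sender looks up the public key and encrypts; only the private key decrypts.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    recipient_pub = serialization.load_pem_public_key(public_pem)
    ciphertext = recipient_pub.encrypt(b"meet at noon", oaep)
    assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"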
Re: (Score:2)
Re:Who cares? (Score:4, Informative)
Bad security uses "security through obscurity". Those types of systems become useless once you know how they work. Examples of this would include puzzle locks, ROT13 encryption etc.
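ROT13 is a handy illustration of why obscurity alone fails: there is no key, so anyone who knows the scheme reverses it instantly. A throwaway Python example:

    import codecs

    secret = codecs.encode("attack at dawn", "rot_13")   # 'nggnpx ng qnja'
    # Nothing to brute-force: knowing the method *is* the break.
    print(codecs.decode(secret, "rot_13"))               # 'attack at dawn'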
Re: (Score:2)
The problem with Windows is not that you're in jail; the problem is that you don't know that you're in jail, because you have no way to inspect the source code and form your own understanding of it.
Re: (Score:3)
Must be in a jail when I use Firefox, too, since I have no way to inspect that source code and have an understanding of it either (I'm not a trained software dev qualified to analyze several million lines of code).
Yes, all non-programmers are in a jail, at all times.
Re: (Score:2)
There's a world of difference between a locked door and an open doorway.
Re: (Score:2)
I'm not sure there's a single person out there who has an understanding of the entire Firefox codebase. What you propose is a practical impossibility.
Re: (Score:2)
Predictable (Score:3, Insightful)
So, Microsoft finally does something no geek could object to and the FSF's response is "even if this looks like a good thing, this can't be a good thing because it's proprietary". It just makes me wonder why they bother making a statement; it's proprietary, it always is and it always has been.
Re:Predictable (Score:4, Insightful)
"Without access, you can only take them on trust" would seem to be the FSF's actual argument. I don't honestly believe that people would actually compile all their tools from source code they've reviewed personally to check for security holes, but at least represent their argument accurately.
Re:Predictable (Score:5, Interesting)
>> I don't honestly believe that people would actually compile all their tools from source code they've reviewed personally to check for security holes
We do use some open source in our aviation products. We are required to heavily review literally every line of source code (both ours and open source) in order to get our product certified for aircraft use.
Re: (Score:2)
Some industries and experts can do this, but for the great majority of users, trusting open source is quite similar to trusting a commercial vendor. The NSA has the resources to flood discussion groups and review sites with posts that generate a false sense of security if they choose to do so.
In the end I think the LAW is the only thing that can provide protection.
Re: (Score:3)
>> but for the great majority of users trusting open source is quite similar to trusting a commercial vendor.
Not at all. The point with open source is you at least have the freedom to look at the code (whether you choose to invest the time required or not is up to you). Also, chances are that if something nefarious is in the code, someone working on the project will spot it and it will be outed.
None of the above is true with commercial closed code, especially from big companies like Microsoft who have alre
Re: (Score:2)
Re: (Score:2)
The fact you know about those vulnerabilities in the first place exactly makes my point for me.
Re: (Score:2)
The fact you know about those vulnerabilities in the first place exactly makes my point for me.
No it doesn't, not at all. It actually proves the point that you can review it all you want and there will still be bugs, likely some that exist right now that won't even be discovered for another decade.
Re: (Score:2)
I think you're confusing bugs and backdoors explicitly put in place.
Re: (Score:2)
I think you're confusing bugs and backdoors explicitly put in place.
Nope, the discussion is about security holes [slashdot.org]. If you think that "security holes" means explicit backdoors and that when I said Linux kernel vulnerabilities [slashdot.org] that I meant explicit backdoors then you are the one who is confused.
Re: (Score:3)
If I had any reason to distrust some software, then I could always pay someone to perform an audit/code review and see what's going on (e.g. TrueCrypt has been inspected since the NSA revelations to see if the binaries are different to the publi
Re: (Score:3)
In addition to your points, the option for people to look at your code makes your code better because it makes you more diligent when you write it.
I suspect everyone has had a conversation like this :
Bob : check out my awesome-sauce application. it's bad ass
Boss : cool. give Jeff access to the source code. i'd like him to integrate it into our Fabulosity suite.
Bob : er, ok. just give me a couple of days to clean up the code so it is ready for integration.
(translation, give me a couple days to fix all the fu
Re: (Score:3)
> So, Microsoft finally does something no geek could object to and the FSF's response is "even if this looks like a good thing, this can't be a good thing because it's proprietary".
Ah, I finally get to use a car analogy!
Your car has broken down and you can't fix it, because you don't have a machine that will interpret the failure codes. The manufacturer will only provide those codes to their own shops.
After complaints, the manufacturer offers free roadside assistance.
That's laudable. Give them snaps for
Re: (Score:3)
"Ah, I finally get to use a car analogy!"
Umm, why is the car in your analogy *used*? At no point is this a requirement.
"Your car has broken down and you can't fix it"
Apparently you haven't *actually read* what MS is doing.
MS is securing their communications infrastructure. This has nothing to do with their products or software. The FSF complaint is *completely bogus*.
A somewhat better analogy might be "My neighbour's house was broken into because they had poor quality locks on the door, so I'm going to chang
Re: (Score:2)
A somewhat better analogy might be "My neighbour's house was broken into because they had poor quality locks on the door, so I'm going to change my locks for better models." The quality of your silverware is unrelated to the actions being taken.
To go with your analogy, it'd be more like :
The company that built the houses in your subdivision put shitty locks on the houses and installed them improperly.
Your neighbor's house just got broken into because of this.
The construction company is now going through the subdivision and replacing all of the locks with a new, better lock.
The FSF's position is this :
That's nice and all, but we don't trust you to pick a good lock and put it on correctly this time.
If we cannot inspect the job you did and the lock y
Re: (Score:2)
If we cannot inspect the job you did and the lock you chose, there's no way for us to know if the house is actually secure to our satisfaction.
So do you feel the same way about say, Google? Have you inspected the locks on their infrastructure? Up until recently they weren't encrypting traffic between their datacenters at all. Actually I'd be interested to know which company's communications infrastructure you have inspected the security implementation of.
Re: (Score:2)
If we cannot inspect the job you did and the lock you chose, there's no way for us to know if the house is actually secure to our satisfaction.
So do you feel the same way about say, Google?
i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?
Have you inspected the locks on their infrastructure? Up until recently they weren't encrypting traffic between their datacenters at all. Actually I'd be interested to know which company's communications infrastructure you have inspected the security implementation of.
well, the only one you would have heard of would be apple.
Re: (Score:2)
i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?
My point is that if that is indeed an accurate representation of their point, then they are advocating the position that you should not use anything that you don't own and control, because to do so would require you to trust that the owner/maintainer has installed the locks properly and that they haven't been tampered with.
well, the only one you would have heard of would be apple.
And what exactly did you inspect and verify?
Re: (Score:2)
i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?
My point is that if that is indeed an accurate representation of their point, then they are advocating the position that you should not use anything that you don't own and control, because to do so would require you to trust that the owner/maintainer has installed the locks properly and that they haven't been tampered with.
yeah. i'd say that is probably a fairly accurate description of the FSF position.
well, the only one you would have heard of would be apple.
And what exactly did you inspect and verify?
well, i worked at apple. i had servers in the data centers so i had to know and adhere to all of the security policies relating to the data centers.
Re:Predictable (Score:5, Insightful)
No, Microsoft *claims* to do something nobody could object to -- you're missing the whole point of the statement.
If Microsoft told you they were implementing security and it turned out they were using DES with a key hashed from the word 'Scroogled', would you be pleased? What if they're using good encryption but the keys never rotate? What if the keys rotate but they're on a fixed loop of 16 keys? How would you know?
As an everyday non-programmer, a casual user wouldn't know the difference either way. If however that user is on a fully open source operating system, they at least know that -some- others using that system have had a peek under the hood and still trusted it.
Re: (Score:2)
In order of increasing goodness:
1) Microsoft makes no promise about encrypting data whatsoever
2) Microsoft encrypts data weakly, keeps code proprietary
3) Microsoft encrypts data strongly, keeps code proprietary
4) Microsoft encrypts data strongly, open sources relevant code so community can validate it
So Microsoft announces they're going from 1 to 3. You're paranoid and saying maybe they're going from 1 to 2. Fine.
But here's the thing: it's still IMPROVEMENT. Maybe it's not as much improvement as you want, b
Re: (Score:2)
Microsoft says they're doing 3 but you have no way of knowing that they are. Why do you believe them? We have good data (recent NSA leaks, etc.) that companies suck at strong encryption. Sometimes on purpose.
Re: (Score:2)
Well, the only way that Microsoft could be doing that server-side would be to pass the key to the server from the client, which you could see in the source code. It doesn't tell you what they are doing with it on the server, but there's no reason to store the encrypted data and the key on the server unless you intend to decrypt it without the involvement of the client, and that'd be a big red flag that something questionable was happening.
Re: (Score:2)
Having access to the source means you *know* how your data is stored and you *know* if the data even can be shared.
The source code can show you that the keys are handled client-side only, that they're not leaked in any way, and that the server has no way of giving away your data. You can also see that the keys are generated properly, not from a list or with reduced complexity. These are all valuable assurances that you've otherwise chosen either to trust someone else to provide or to fatalistically believe don't matter.
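As a sketch of what "keys handled client-side only" means in practice (Python with the third-party cryptography package; upload() is a hypothetical stand-in for whatever ships data to the provider):

    from cryptography.fernet import Fernet

    def upload(blob: bytes) -> None:
        """Hypothetical placeholder for sending data to the provider."""
        ...

    key = Fernet.generate_key()        # generated locally, never transmitted
    token = Fernet(key).encrypt(b"my private notes")
    upload(token)                      # the server only ever stores ciphertext

    # Reviewing the client source is what lets you confirm that `key`
    # never appears in anything handed to upload().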
Re: (Score:2)
For this very reason we enforce real IPSec for VPN traffic and GPG/PGP for file transfers with clients and service providers whenever possible.
Sadly, banks seem to be the ones who aren't paying attention, but several of our largest customers only handle data GPG encrypted.
It's worth noting that after sending a user a GPG-encrypted file (symmetric) with the instructions to install and run GPG (including a phone call for me to read them the key I used out of band), they often end up adopting it themselves beca
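For reference, a rough Python analogue of that symmetric, passphrase-shared-out-of-band workflow (PBKDF2 plus Fernet from the third-party cryptography package; this is not GPG's actual OpenPGP format, just the same idea):

    import base64, os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    passphrase = b"read-to-them-over-the-phone"   # shared out of band, as above
    salt = os.urandom(16)                         # stored next to the ciphertext

    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(passphrase))

    ciphertext = Fernet(key).encrypt(b"contents of the report")
    # The recipient re-derives the key from the same passphrase and salt to decrypt.
    assert Fernet(key).decrypt(ciphertext) == b"contents of the report"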
Re: (Score:3, Insightful)
Good thing that you don't actually need to be particularly pro-open to see that they have a point. No closed software can be considered secure, ever; no steps to assure more security "regardless of promises or even intent" can change that.
"Even if this looks like a good thing, this can't be a good thing because it's proprietary". How can you disagree? They bother making the statement, because it's their mission, and to warn off no
Re:Predictable (Score:4, Insightful)
> So, Microsoft finally does something no geek could object to...
A PR exercise, you mean?
Did I get it wrong, or can the NSA or some other agency force a business to reveal its customers' data AND keep silent about it?
If so, every privacy and encryption statement should include this fact. It doesn't? Then it's a PR exercise.
Do you NOT object to a PR exercise about something as delicate as online security? I do.
Re:Predictable (Score:5, Insightful)
So, Microsoft finally does something no geek could object to...
I see what you did there. You tried to insert a faulty premise to support your argument. Any geek worth the title understands that any encryption technology that cannot be vetted is, by definition, not trustworthy. So this latest PR stunt by Microsoft is just that, a PR stunt.
Re: (Score:3)
How a datacenter encrypts its data is never going to be something the average user can vet, ever. No user should even have access to that data, which is why it wasn't encrypted to begin with: you need to have some pretty solid connections to manage getting access to that stuff.
There's also no way to vet whether the keys are being provided to a third party, whether or not the backend software is FOSS. If Red Hat made the same announcement, there would be no reason the same "objection" couldn't apply.
Re: (Score:2)
Re: (Score:2)
Actually - that their software is open is irrelevant to the problem. Are they running their own servers with openssl/openvpn/??? or using third party appliances? Did THEY create and build the hardware from the ground up or purchase it from a third party? The balance of probabilities may say their inter-DC encryption is done on a secure, up-to-date and built-and-operated-to-best-practices RH server, but it's not a guarantee.
And just like this scenario with Microsoft, how is anyone going to audit the deployme
Re: (Score:2)
Do you understand the lock on your house's front door? Have you inspected every mechanical element to ensure it cannot be compromised? Do you routinely check it for tampering?
Yes. No. And no. It's a dopey comparison, really, but alright, let's play along. A common lockset works on centuries-old, well-known principles. It's easy to understand. I know exactly how it can be compromised while appearing to function normally. I don't bother to take it apart to see if it's been set up for a second key (commonly called a master key) because the risk is not worth the hassle. Now, if I had something that I wanted to hide from those who would like to see it, copy it, whatever, wi
Re: (Score:2)
So, Microsoft finally does something no geek could object to
It's a good thing on its own, and I applaud MS for taking this step. It will stop all kinds of potential snooping on our data from malicious 3rd parties.
However, in the context of the NSA being the big snoop that's triggering all this, it's worthless. We can safely presume the NSA gets whatever it wants from Microsoft whether it's encrypted or not.
Microsoft's ability to provide its users any security versus "legal" searches by the NSA is nil.
There is
Re: (Score:2)
I pay my taxes because I benefit from things like roads and schools and fire departments and such.
Do you get zero benefit from the things your taxes pay for?
Re: (Score:2)
How do they know the code they've been given is the actual code used to generate the shipped binaries?
Can those Enterprise partners compile the code they've been given in order to compare the binaries with the binaries that MS ships?
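Comparing a locally built binary against the shipped one only proves anything if the build is bit-for-bit reproducible (pinned toolchain, no embedded timestamps, and so on), but the check itself is trivial. A small Python sketch, with hypothetical file paths:

    import hashlib

    def sha256(path: str) -> str:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical paths: a binary we compiled ourselves vs. the one MS ships.
    built = sha256("./my-build/component.dll")
    shipped = sha256("./shipped/component.dll")
    print("MATCH" if built == shipped else "MISMATCH")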
NSA (Score:2)
If the NSA revelations have taught us anything, it is that journalists, governments, schools, advocacy organizations, companies, and individuals, must be using operating systems whose code can be reviewed and modified without Microsoft or any other third party's blessing. When we don't have that, back doors and privacy violations are inevitable.
No, they have not taught us that. Most of the NSA revelations have been about snooping telecommunications networks. Using open source software would not have made it any different.
Why is free software immune? (Score:4, Insightful)
Though I agree that a corporation can be forced by an authoritarian government to put a backdoor into their product, I don't believe open-source software is immune against backdoors either.
There are scores [stackexchange.com] of people with commit access to the Linux kernel, for example. If the NSA — or its counterpart from any other rich country in the world — put their mind to it, they could use any one (or more) of them to weaken the security functionality in there.
It does not need to be obvious — making /dev/random's output slightly less random, for example, may reduce the time it takes to tap an ssh or ssl connection with this host from many years down to days. Same goes for PGP keys generated on the affected host... Nor does it need to involve blatant coercion — the committer may simply receive a patch by e-mail with a fix to some other bug or an improvement, and fail to spot the weakening.
It could, in fact, have already been done years ago for all we know. Who knows, if this little problem [slashdot.org] was not deliberately introduced? And even if it was not — who knows, whether various security agencies exploited it from 2006 to 2013 the way Alan Turing et al exploited mistakes of the German radio-operators during WW2 [wikipedia.org]?
Is it easier to plant a backdoor into an open-source project than a closed-source one — and keep it there for a useful period of time? I'm not at all sure what I'd bet on, to be perfectly honest. Both can be done and, by all appearances, both have been done...
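To put a number on the /dev/random point above, here is back-of-the-envelope Python arithmetic on brute-force time versus effective key entropy (the guess rate is an arbitrary assumption, purely for illustration):

    GUESSES_PER_SECOND = 1e12      # assumed rate for a well-funded attacker
    SECONDS_PER_YEAR = 3.15e7

    for bits in (128, 80, 64, 40):
        years = 2 ** (bits - 1) / GUESSES_PER_SECOND / SECONDS_PER_YEAR
        print(f"{bits:3d} effective bits -> about {years:.3g} years on average")

    # 128 bits is out of reach; a "random" source quietly reduced to ~40
    # effective bits falls in well under a second, which is why sabotaging
    # the entropy source is more attractive than attacking the cipher.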
Re: (Score:2)
Is it easier to plant a backdoor into an open-source project than a closed-source one — and keep it there for a useful period of time?
That's a good question. The methods used would necessarily be different. Keeping it there would depend on delaying its discovery and inhibiting its repair, once found. Leaving the discovery issue aside for the moment (number of eyes on the code, etc.), it is much easier to prevent the removal of a back door when the code base is owned by a private organization with identifiable representatives. Should the NSA lean on both the Microsoft and Linux communities to maintain an exploit, Microsoft can be pressured
Re:Why is free software immune? (Score:4, Interesting)
Linux (and BSD) committers are just as identifiable. Although the codebase is open to all, very few people go through it. If it follows the documented coding style, compiles, and "works", there is simply no reason to keep reviewing it — for most people. The Debian hole [slashdot.org] I cited earlier remained open from 2006 to 2013 — more years than Turing spent working on Enigma.
Maybe, but I would not count on it. Which country would you consider unlikely to cooperate with the US on such a matter — without itself being an even greater threat to liberty (like China or Cuba)? The entire Western world's spooks cooperate with the US. As does Russia [whitehouse.gov] — to some extent [dailymail.co.uk], at least. Who would not help their American colleagues in exchange for the Americans helping them — a little? Someone like Sweden? Well, they did hit Assange with rape [wikipedia.org] charges when he made himself an overly tiresome nuisance to the Americans...
In other words, Microsoft, probably, was coerced into it. A similar coercion — or conviction, or fooling — can be applied to an open-source project's participant. Whether it is easier or harder to do, I would not know.
Re: (Score:2)
Re: (Score:2)
People seem to miss that there are employees, in particular field service employees, at all the major vendors who earn a nice second paycheck from 3-letter agencies, and their employer is none the wiser.
My dad spent a 30 year career in the finance and accounting area of one of the big defense contractors. His areas dealt with a lot of high security clearance stuff and there were always FBI and spooks in their office. They knew they were there, but they had no idea who the spook or agent was. They paid th
Lock argument doesn't hold (Score:4, Insightful)
Let's face it: as far as we know, the door lock manufacturers also have a master key to all our houses. The schematics and design of the lock are not publicly available, and most people lack the skills to know if the schematics they are looking at are secure or not. It's the same with an OS. And while I *could* take the lock apart and figure out how it works, I still wouldn't know if my particular lock were secure or not, because I have not seen enough locks to know if this particular one is good or not.
Anytime this condition arises, we replace our own lack of knowledge with trust in experts. We have to defer the judgement of security-worthiness to an expert we trust, in which case we are again disintermediated from knowing if the lock is actually secure or not. We all trust *someone* with very specific knowledge to help us make decisions, whether that be medical, scientific, security or otherwise, and in each of those cases we can find examples of where the expert has let us down.
Re:Lock argument doesn't hold (Score:4, Insightful)
Flawed comparison. In fact, locks are much more like open-source software.
Locks can be disassembled and people can review the design. Much like open source software, most people would not be able to tell if a lock design was secure, but enough independent experts can disassemble a lock and review its security.
Yes, you are reliant on experts for the truth about lock security, but you are not reliant solely on the manufacturer's assertions, which is the case with closed-source software.
Re: (Score:2)
Why would one trust a lock manufacturer? It's because earning and maintaining our trust serves the manufacturer's commercial interest.
In that vein, I recently dumped TrueCrypt for a similar commercial product. I don't honestly know which of the two is more trustworthy. I suppose I could audit the code for TrueCrypt myself, but I'm not qualified to do that. Or I could trust auditors with the upcoming TrueCrypt audit, whenever that happens. Or I could buy a commercial product and trust the vendor. The l
Re: (Score:2)
FSF is full of themself (Score:2, Insightful)
It really is arrogant of the FSF to imply that a user trusting one or a small group of individuals running an open-source project is somehow better off and more secure than with Microsoft.
Unless a user audits the code, compiles the code (with a known to be good compiler) and manages all elements of the server and routing, there is NO assurance of security or privacy. And never mind the fact that few users even compile from source anymore.
Offtopic: why am I being sent to the beta site to post comments? Very annoyin
Re: (Score:2)
Arrogant, but accurate. Sounds like the FSF to me.
Re:FSF is full of themself (Score:5, Insightful)
Security isn't a binary function. Open source is more secure than closed source because many independent people can download the source and review it, many people can build binaries, etc.
Re: (Score:2)
Yes, and I can eat ice cream every day too, but then I would get fat. Just because code may be available does not mean it has been fully audited, or even that the auditors understand both the code and programming in language X with system calls Y well enough to know what is a security risk. How many people working on a project even know the codebase beyond their particular chunk? How many are capable e
A horrible analogy (Score:3)
A lock on your own house to which you do not have the master key is not a security system, it is a jail.
I get his overall point regarding source, I do, and I agree; but it would help his case if he didn't use such broken analogies. If I have a key, and the landlord has a master key, it does not mean I'm in "jail"; he's not going to lock me into my own home because I have a key of my own, just not a master key. It's just that the landlord can get into my home too. It's more like easy-peasy burglary, but "jail" was a rather stupid way to put it.
Re: (Score:2)
It's more like a safe deposit box than a home. And it can be accessed at any time without your consent.
Re: (Score:2)
30 years of "I told you so". (Score:4, Insightful)
If this NSA kerfuffle has amounted to anything, it is a validation of the idea that "security through obscurity" is as invalid as we've all been told - since the 1980s.
Re: (Score:2)
Re:EFF is tilting at a tank here. (Score:5, Insightful)
Welcome to the good fight -- the FSF has been at it for a long time, and now the EFF realizes that you can't have freedom without knowledge. That is after all why we believe in a free press in the west, right? Whether the press lives up to its obligations or not, the idea is that without full disclosure, people cannot make good decisions.
Re: (Score:2)
"they're basically pissing on the entire box-package software development industry"
Based on an underlying argument that is entirely unrelated.
If MS decided to install encryption on their own telephone system, I'm sure the FSF would put out the same, equally unrelated, press release.
And /. would slavishly run it for them.
Trust (Score:3)
I don't see what's unrelated about the FSF's argument. The debate is pretty simple and it goes more or less like this:
MS: Trust us! We're good guys! We'll start using encryption, we promise.
EFF: People should trust what they can verify. Until you have the full details of MS's implementation in front of you, there is no way to be sure they've done it right. And until you have the right to modify the code for yourself, there is no way to be sure that security holes will get patched promptly and correctly.
As fa
Re:Trust (Score:4, Insightful)
As far as I can tell, the counter-arguments against FSF's position boil down to "well I trust {Microsoft, Google, Apple, Oracle} anyway, so there!" and "who cares if you can trust your computing infrastructure anyway, get over it!" If you have something more to add to those illuminating arguments, please do so.
In fairness I think the counter-argument is a little more nuanced than you're representing it. It's more along the lines of: non-programmers are in no position to verify that things have been done correctly even if the program is open source. And even experienced programmers can't, as a practical matter, be expected to meticulously review the millions of lines of code that go into the various programs they use, nor are they likely to build all of their own software from source all the time. So realistically, even if the software is open source you still have to trust someone else to verify it. All open source does is change who the person is that you're trusting, from Microsoft to $YOUR_FAVOURITE_FREE_SOFTWARE_GROUP.
Now perhaps you trust the general open source community more than you trust Microsoft (or Google or Apple or whoever). That's perfectly fine. But I can certainly see how a reasonable person could look at that position and go "why should I trust random strangers on the internet if I'm not willing to trust Microsoft?". Now perhaps that's not a good argument. But I think it's at least a little bit more substantive than the strawman you've presented.
Re: (Score:3)
Thank you for the insight into what until now seemed a baffling and unreasonable position.
I think the FSF (and my) argument would carry more weight, then, if we were to replace the phrase "random strangers on the Internet" with "independent experts." Everyone can appreciate the value of having independent experts review a system; and, the refusal of a company to expose its software to independent review should be grounds for suspicion.
Re: (Score:3)
Agreed. But then there are the follow up considerations of
a) Is it the case that open source software is in fact being subjected to scrutiny by independent experts? I would say that certainly some of it is, but I would hazard a guess that not all of it is.
b) How does an uninformed layman differentiate between an "independent expert" and a "random stranger on the internet"? In the absence of doing actual research it's much easier for people outside the field to simply trust the blue chip for
Re: (Score:3)
I think the FSF's concern is much more about what level of review is possible than about what level of review is actual. The idea is that if the software has a lot of users and/or has a very important function, then it will attract a lot of scrutiny from its corporate users, from college professors, from hobbyists, governments, etc. If the vendor controls access to the source code, the scope of review is limited to whomever the vendor grants access. It does not take computer expertise to realize that the
Re: (Score:3)
Gutsy, they're basically pissing on the entire box-package software development industry, and no small number of hardware/firmware companies, when they say you can't trust closed-source.
That's not gutsy, that's being Captain Obvious. I won't shed any tears for the "box-package software development industry", though; that's never been the major part of the SW industry, unlike custom-built systems. It's not like there would be massive unemployment if that went away.
Re: (Score:2)