FSF Responds To Microsoft's Privacy and Encryption Announcement

An anonymous reader writes "Microsoft announced yesterday their plans to encrypt customer data to prevent government snooping. Free Software Foundation executive director John Sullivan questions the logic of trusting non-free software, regardless of promises or even intent. He says, 'Microsoft has made renewed security promises before. In the end, these promises are meaningless. Proprietary software like Windows is fundamentally insecure not because of Microsoft's privacy policies but because its code is hidden from the very users whose interests it is supposed to secure. A lock on your own house to which you do not have the master key is not a security system, it is a jail. ... If the NSA revelations have taught us anything, it is that journalists, governments, schools, advocacy organizations, companies, and individuals, must be using operating systems whose code can be reviewed and modified without Microsoft or any other third party's blessing. When we don't have that, back doors and privacy violations are inevitable.'"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • PR Stunt at best (Score:5, Interesting)

    by jbmartin6 ( 1232050 ) on Friday December 06, 2013 @09:56AM (#45618001)
How is encrypting data in motion going to help when they will simply provide the NSA the keys or otherwise provide access to the data? They are just another participant in the 'we never provided direct access' lie: when you simply provide everything on demand, they don't need direct access, nor do they need to decrypt data off the wire.
    • by twocows ( 1216842 ) on Friday December 06, 2013 @10:21AM (#45618167)
      Not just that, but what the FSF spokesman is saying here is essentially right (though I think they could do with a bit less imagery, it makes it seem like they're just pushing their agenda, not that I disagree with it). How are we supposed to verify that Microsoft is even keeping its promise if we don't have access to the source? They could just be paying it lip service and not really doing anything about it. Or, they could be incompetent (MS, incompetent? what a novel idea). Or they might just make a token attempt at getting things "kinda sorta" secure (or at least looking secure). Again, how can we trust that they're following through? If it was free software, there's the capacity for anyone to audit it and make sure it's secure (and if it's not, there are more ways to deal with it than "annoy MS until they fix it").
      • by s.petry ( 762400 )

        In fairness, it would not require "free software" to accomplish this openness. It would, however, require the source code for the encrypting software to be freely available to review, inspect, compile, and compare against what is installed.

        "Free" software does this for you by nature, but a company could do the same thing. Microsoft "won't", but absolutely "could". Sun did it, HP has done it, IBM has done it, Cisco has done it, etc.

        Microsoft would not do this however, because it would open up the nasty cr
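
        The "compile and compare to what is installed" check described above can be sketched as a hash comparison, assuming the build is bit-for-bit reproducible (the file names and contents below are hypothetical stand-ins):

        ```python
        import hashlib
        import os
        import tempfile

        def sha256_of(path):
            """Return the SHA-256 hex digest of a file, read in chunks."""
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    h.update(chunk)
            return h.hexdigest()

        # Stand-ins for a binary you rebuilt from the published source and the
        # binary actually installed on the machine.
        with tempfile.TemporaryDirectory() as d:
            built = os.path.join(d, "built.bin")
            installed = os.path.join(d, "installed.bin")
            for name in (built, installed):
                with open(name, "wb") as f:
                    f.write(b"\x7fELF...identical bytes")
            print("match" if sha256_of(built) == sha256_of(installed) else "MISMATCH")
            # prints "match"; a binary diverging from the source build would print MISMATCH
        ```

        In practice the comparison only becomes meaningful once the build itself is deterministic: compiler version, flags, and embedded timestamps all have to be pinned, which is what reproducible-build efforts work on.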

        • by Yvanhoe ( 564877 )
          To be certain of not being snooped, you would also need to use an open-source tool to generate the content you want to send, and run it on an open-source OS which guarantees that other processes won't have access to the cleartext message you wrote.

          And here I am assuming no backdoor in hardware or firmware, which in 2013 is quite a leap of faith.

          Open source has a hard time providing the minimum trustable stack (the BIOS is the current obvious weak link), and I don't see Microsoft doing that any time soon.
          • by s.petry ( 762400 )
            You made clearer what I intended: being able to inspect what is compiled versus what is installed.
      • by Bert64 ( 520050 )

        Even with access to the source, we're talking about running services rather than code you run on your own hardware. There is no reason to believe that the source they provide is the same as they're running, and there's no way to tell who else has access to their systems.

        Most other big providers such as Google and Yahoo run most of their stuff on open-source software, so while we have the code, we have no way to tell what they're doing with it.

    • Look, we are all worried over nothing; this encryption means that only one 256-bit key will unlock your data, or a paperclip -- but it's merely coincidence that the encryption has two locks on the door and one of them is always the same.

      I'm waiting to see Clippy on an upcoming episode of "VH1: Where Are They Now?"

  • Who cares? (Score:2, Insightful)

    by Anonymous Coward

    Who cares if the software is non-free? That's not even the issue.

    "Microsoft announced yesterday their plans to encrypt customer data to prevent government snooping. "

    And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.

    • Re:Who cares? (Score:5, Insightful)

      by Chrisq ( 894406 ) on Friday December 06, 2013 @10:00AM (#45618023)

      Who cares if the software is non-free? That's not even the issue.

      You are correct; the issue is that it must be open source and buildable from source.

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        Who cares if the software is non-free? That's not even the issue.

        You are correct; the issue is that it must be free software and buildable from source.

        FTFY.

        • Re: (Score:2, Troll)

          by unixisc ( 2429386 )
          Stupid Stallmanesque semantics, as usual. In this case, the idea behind making the source code available is better software, not more liberated software. I mean, the NSA is only too happy to share backdoors w/ everybody.
      • No, this is not enough. Not if you want an optimal level of privacy and security. If the software is open-source but non-free, then you can fix it all you want, but you cannot share your fix with others. So this is as good as closed source for almost everyone, including you, since you cannot fix all the bugs by yourself.
    • Re:Who cares? (Score:5, Insightful)

      by jones_supa ( 887896 ) on Friday December 06, 2013 @10:30AM (#45618265)

      And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.

      Things like these are still a step forward, as the NSA has to actually ask companies for the keys, instead of just passively snooping everywhere it wants to.

      • Though it's worth noting that Microsoft has a history of being particularly inept at implementing encryption. Best intentions, sadly, do not make for secure code.

    • by PPH ( 736903 )

      And I bet Microsoft will just hand over the encryption keys / passwords to the NSA.

      Why does Microsoft even need my private key? Take e-mail, for example. I have a private key locally and a public key that I share with those needing to correspond with me. Someone needs to send me a message, they look up my pubkey, encrypt their message and send it through the tubes. I decrypt it upon receipt using my privkey. Why is Microsoft not in the business of managing public keys for its users and forwarding messages? That's basically all we need. It's the founding principle of the Internet. Push all
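
      The flow described above is ordinary public-key cryptography. A textbook-RSA toy with tiny primes (wildly insecure, purely to illustrate the pubkey/privkey split; real mail would use something like GnuPG):

      ```python
      # Textbook RSA with toy primes -- for illustration only, never for real use.
      p, q = 61, 53
      n = p * q                # public modulus
      phi = (p - 1) * (q - 1)  # totient
      e = 17                   # public exponent; (n, e) is the published pubkey
      d = pow(e, -1, phi)      # private exponent (Python 3.8+ modular inverse)

      # A sender looks up my public key (n, e) and encrypts a small message m < n.
      m = 42
      ciphertext = pow(m, e, n)

      # Only I hold d, so only I can decrypt -- the forwarding service never needs it.
      assert pow(ciphertext, d, n) == m
      ```

      The point the parent makes falls out directly: a mail provider only ever needs the public half to route encrypted messages; holding users' private keys is a design choice, not a necessity.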

    • How is a FOSS encryption program the answer? Since it's open source, even NSA personnel will be able to read the encryption algorithms and design software to decrypt it, and snoop. Given that, the cure seems worse than the disease.
      • Re:Who cares? (Score:4, Informative)

        by hawkinspeter ( 831501 ) on Friday December 06, 2013 @12:17PM (#45619261)
        You seem to be confusing good security design with security through obscurity. A good encryption algorithm is still a good encryption algorithm when it's generally known how it works, as it relies on a separate "secret" or "key". Like a house door: I can know how it works, but without the key it's not going to be easy to open.

        Bad security uses "security through obscurity". Those types of systems become useless once you know how they work. Examples of this would include puzzle locks, ROT13 encryption etc.
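
        The distinction drawn above is Kerckhoffs's principle: a sound cipher stays secure even when everyone knows the algorithm, because the security lives in the key. A minimal contrast (the XOR cipher below is itself a toy, standing in for a real keyed cipher like AES):

        ```python
        import codecs

        # ROT13: the algorithm IS the whole secret, so anyone who knows it reverses it.
        msg = "attack at dawn"
        obscured = codecs.encode(msg, "rot13")
        assert codecs.decode(obscured, "rot13") == msg  # no key needed at all

        # A keyed cipher: knowing the algorithm is useless without the key.
        def xor_cipher(data: bytes, key: bytes) -> bytes:
            """Toy repeating-key XOR; real systems would use AES instead."""
            return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

        key = b"s3cr3t"
        ct = xor_cipher(msg.encode(), key)
        assert xor_cipher(ct, key) == msg.encode()       # right key recovers it
        assert xor_cipher(ct, b"wrong") != msg.encode()  # wrong key does not
        ```

        This is why publishing an algorithm's source doesn't hand the NSA a decryption tool: without the per-user key, knowing the code gains an attacker nothing against a well-designed cipher.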
  • Predictable (Score:3, Insightful)

    by donscarletti ( 569232 ) on Friday December 06, 2013 @10:01AM (#45618031)

    So, Microsoft finally does something no geek could object to and the FSF's response is "even if this looks like a good thing, this can't be a good thing because it's proprietary". It just makes me wonder why they bother making a statement; it's proprietary, it always is and it always has been.

    • Re:Predictable (Score:4, Insightful)

      by Sockatume ( 732728 ) on Friday December 06, 2013 @10:09AM (#45618081)

      "Without access, you can only take them on trust" would seem to be the FSF's actual argument. I don't honestly believe that people would actually compile all their tools from source code they've reviewed personally to check for security holes, but at least represent their argument accurately.

      • Re:Predictable (Score:5, Interesting)

        by JustNiz ( 692889 ) on Friday December 06, 2013 @10:45AM (#45618413)

        >> I don't honestly believe that people would actually compile all their tools from source code they've reviewed personally to check for security holes

        We do use some open source in our aviation products. We are required to heavily review literally every line of source code (both ours and open source) in order to get our product certified for aircraft use.

        • Some industries and experts can do this, but for the great majority of users, trusting open source is quite similar to trusting a commercial vendor. The NSA has the resources to flood discussion groups and review sites with posts that generate a false sense of security, if they choose to do so.

          In the end I think the LAW is the only thing that can provide protection.

          • by JustNiz ( 692889 )

            >> but for the great majority of users trusting open source is quite similar to trusting a commercial vendor.

            Not at all. The point with open source is you at least have the freedom to look at the code (whether you choose to invest the time required or not is up to you). Also, chances are that if something nefarious is in the code, someone working on the project will spot it and it will be outed.

            None of the above is true with commercial closed code, especially from big companies like Microsoft who have alre

        • You can review it all you like; that doesn't mean you understand it. For example, nearly decade-old vulnerabilities have been found in the Linux kernel.
          • by JustNiz ( 692889 )

            The fact you know about those vulnerabilities in the first place exactly makes my point for me.

            • The fact you know about those vulnerabilities in the first place exactly makes my point for me.

              No it doesn't, not at all. It actually proves the point that you can review it all you want and there will still be bugs, likely some that exist right now that won't even be discovered for another decade.

      • Although not everyone has time to inspect all the software they use, it's important that people have that option available. I'd rather trust independent security researchers and open-source code reviewers than just trust Microsoft with no other option available.

        If I had any reason to distrust some software, then I could always pay someone to perform an audit/code review and see what's going on (e.g. TrueCrypt has been inspected since the NSA revelations to see if the binaries are different to the publi
        • by Darth ( 29071 )

          In addition to your points, the option for people to look at your code makes your code better because it makes you more diligent when you write it.

          I suspect everyone has had a conversation like this :
          Bob : check out my awesome-sauce application. it's bad ass
          Boss : cool. give Jeff access to the source code. i'd like him to integrate it into our Fabulosity suite.
          Bob : er, ok. just give me a couple of days to clean up the code so it is ready for integration.
          (translation, give me a couple days to fix all the fu

    • > So, Microsoft finally does something no geek could object to and the FSF's response is "even if this looks like a good thing, this can't be a good thing because it's proprietary".

      Ah, I finally get to use a car analogy!

      Your car has broken down and you can't fix it, because you don't have a machine that will interpret the failure codes. The manufacturer will only provide those codes to their own shops.

      After complaints, the manufacturer offers free roadside assistance.

      That's laudable. Give them snaps for

      • "Ah, I finally get to use a car analogy!"

        Umm, why is the car in your analogy *used*? At no point is this a requirement.

        "Your car has broken down and you can't fix it"

        Apparently you haven't *actually read* what MS is doing.

        MS is securing their communications infrastructure. This has nothing to do with their products or software. The FSF complaint is *completely bogus*.

        A somewhat better analogy might be "My neighbour's house was broken into because they had poor quality locks on the door, so I'm going to chang

        • by Darth ( 29071 )

          A somewhat better analogy might be "My neighbour's house was broken into because they had poor quality locks on the door, so I'm going to change my locks for better models." The quality of your silverware is unrelated to the actions being taken.

          To go with your analogy, it'd be more like :

          The company that built the houses in your subdivision put shitty locks on the houses and installed them improperly.
          Your neighbor's house just got broken into because of this.
          The construction company is now going through the subdivision and replacing all of the locks with a new, better lock.

          The FSF's position is this :
          That's nice and all, but we don't trust you to pick a good lock and put it on correctly this time.
          If we cannot inspect the job you did and the lock y

          • If we cannot inspect the job you did and the lock you chose, there's no way for us to know if the house is actually secure to our satisfaction.

            So do you feel the same way about say, Google? Have you inspected the locks on their infrastructure? Up until recently they weren't encrypting traffic between their datacenters at all. Actually I'd be interested to know which company's communications infrastructure you have inspected the security implementation of.

            • by Darth ( 29071 )

              If we cannot inspect the job you did and the lock you chose, there's no way for us to know if the house is actually secure to our satisfaction.

              So do you feel the same way about say, Google?

              i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?

              Have you inspected the locks on their infrastructure? Up until recently they weren't encrypting traffic between their datacenters at all. Actually I'd be interested to know which company's communications infrastructure you have inspected the security implementation of.

              well, the only one you would have heard of would be apple.

              • i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?

                My point is that if that is indeed an accurate representation of their point then they are advocating the position that you should not use anything that you don't own and control, because to do so would require you trust that the owner/maintainer has installed the locks properly and that they haven't been tampered with.

                well, the only one you would have heard of would be apple.

                And what exactly did you inspect and verify?

                • by Darth ( 29071 )

                  i never said i felt that way at all. I was providing an analogy that more correctly described the FSF's position, not mine. Do you disagree that the analogy represents the FSF position accurately?

                  My point is that if that is indeed an accurate representation of their point then they are advocating the position that you should not use anything that you don't own and control, because to do so would require you trust that the owner/maintainer has installed the locks properly and that they haven't been tampered with.

                  yeah. i'd say that is probably a fairly accurate description of the FSF position.

                  well, the only one you would have heard of would be apple.

                  And what exactly did you inspect and verify?

                  well, i worked at apple. i had servers in the data centers so i had to know and adhere to all of the security policies relating to the data centers.

    • Re:Predictable (Score:5, Insightful)

      by MikeBabcock ( 65886 ) <mtb-slashdot@mikebabcock.ca> on Friday December 06, 2013 @10:15AM (#45618123) Homepage Journal

      No, Microsoft *claims* to do something nobody could object to -- you're missing the whole point of the statement.

      If Microsoft told you they were implementing security and it turned out they were using DES with a key hashed from the word 'Scroogled', would you be pleased? What if they're using good encryption but the keys never rotate? What if the keys rotate but they're on a fixed loop of 16 keys? How would you know?

      An everyday non-programmer, a casual user, wouldn't know the difference either way. If, however, that user is on a fully open-source operating system, they at least know that -some- others using that system have had a peek under the hood and still trusted it.
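
      To make the parent's hypotheticals concrete (the 'Scroogled'-derived key and the 16-key loop are the parent's invented examples, not known Microsoft behavior), here is a sketch of why each would be worthless:

      ```python
      import hashlib

      # Hypothetical 1: a DES-sized key hashed from a publicly guessable string.
      # Anyone can re-derive it, so the keyspace effectively contains one key.
      weak_key = hashlib.sha256(b"Scroogled").digest()[:8]
      attacker_key = hashlib.sha256(b"Scroogled").digest()[:8]
      assert weak_key == attacker_key  # the "secret" is no secret at all

      # Hypothetical 2: keys that "rotate" through a fixed loop of 16 values.
      # After one full cycle, an observer has seen every key repeat.
      keys = [hashlib.sha256(b"Scroogled" + bytes([i])).digest()[:8] for i in range(16)]

      def key_for_period(period: int) -> bytes:
          """Return the key used in a given rotation period of the fixed loop."""
          return keys[period % 16]

      assert key_for_period(16) == key_for_period(0)  # period 16 reuses period 0's key
      ```

      Which is the parent's point: without the source (or an audit), a user cannot distinguish either of these schemes from genuinely random, genuinely rotating keys.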

      • In order of increasing goodness:

        1) Microsoft makes no promise about encrypting data whatsoever
        2) Microsoft encrypts data weakly, keeps code proprietary
        3) Microsoft encrypts data strongly, keeps code proprietary
        4) Microsoft encrypts data strongly, open sources relevant code so community can validate it

        So Microsoft announces they're going from 1 to 3. You're paranoid and saying maybe they're going from 1 to 2. Fine.

        But here's the thing: it's still IMPROVEMENT. Maybe it's not as much improvement as you want, b

        • Microsoft says they're doing 3 but you have no way of knowing that they are. Why do you believe them? We have good data (recent NSA leaks, etc.) that companies suck at strong encryption. Sometimes on purpose.

    • Re: (Score:3, Insightful)

      by foma84 ( 2079302 )
      Yes, I imagine that from an anti-open perspective it does sound like that.
      Good thing that you don't actually need to be particularly pro-open to see that they have a point. No closed software can be considered secure, ever; no steps to assure more security "regardless of promises or even intent" can change that.
      "Even if this looks like a good thing, this can't be a good thing because it's proprietary". How can you disagree? They bother making the statement because it's their mission, and to warn off no
    • Re:Predictable (Score:4, Insightful)

      by marcello_dl ( 667940 ) on Friday December 06, 2013 @10:21AM (#45618169) Homepage Journal

      > So, Microsoft finally does something no geek could object to...

      A PR exercise, you mean?

      Did I get it wrong, or can the NSA or some other agency force a business to reveal its customers' data AND keep silent about it?
      If so, every privacy and encryption statement should include this fact. It doesn't? Then it's a PR exercise.

      Do you NOT object to a PR exercise about something as delicate as online security? I do.

    • Re:Predictable (Score:5, Insightful)

      by Jawnn ( 445279 ) on Friday December 06, 2013 @10:25AM (#45618211)

      So, Microsoft finally does something no geek could object to...

      I see what you did there. You tried to insert a faulty premise to support your argument. Any geek worth the title understands that any encryption technology that cannot be vetted is, by definition, not trustworthy. So this latest PR stunt by Microsoft is just that, a PR stunt.

      • How a datacenter encrypts its data is never going to be something the average user can vet, ever. No user should even have access to that data, which is why it wasn't encrypted to begin with: you need some pretty solid connections to manage getting access to that stuff.

        There's also no way to vet whether the keys are being provided to a third party, whether or not the backend software is FOSS. If Red Hat made the same announcement, there would be no reason the same "objection" couldn't apply.

        • You're right, but with Red Hat, they hardly even need to make the statement as their software is transparently open and lots of people have agreed on how it should work. Also, with most FOSS software, you have the option of hosting it yourself and being in control of the keys.
          • Actually, that their software is open is irrelevant to the problem. Are they running their own servers with openssl/openvpn/??? or using third-party appliances? Did THEY create and build the hardware from the ground up, or purchase it from a third party? The balance of probabilities may say their inter-DC encryption is done on a secure, up-to-date, built-and-operated-to-best-practices RH server, but it's not a guarantee.

            And just like this scenario with Microsoft, how is anyone going to audit the deployme

    • by vux984 ( 928602 )

      So, Microsoft finally does something no geek could object to

      It's a good thing on its own, and I applaud MS for taking this step. It will stop all kinds of potential snooping on our data by malicious third parties.

      However, in the context of the NSA being the big snoop that's triggering all this, it's worthless. We can safely presume the NSA gets whatever it wants from Microsoft whether it's encrypted or not.

      Microsoft's ability to provide its users any security against "legal" searches by the NSA is nil.

      There is

  • If the NSA revelations have taught us anything, it is that journalists, governments, schools, advocacy organizations, companies, and individuals, must be using operating systems whose code can be reviewed and modified without Microsoft or any other third party's blessing. When we don't have that, back doors and privacy violations are inevitable.

    No, they have not taught us that. Most of the NSA revelations have been about snooping telecommunications networks. Using open source software would not have made it any different.

  • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Friday December 06, 2013 @10:45AM (#45618421) Homepage Journal

    must be using operating systems whose code can be reviewed and modified without Microsoft or any other third party's blessing

    Though I agree that a corporation can be forced by an authoritarian government to put a backdoor into their product, I don't believe open-source software is immune to backdoors either.

    There are scores [stackexchange.com] of people with commit access to the Linux kernel, for example. If the NSA — or its counterpart in any other rich country in the world — put their mind to it, they could use any one (or more) of them to weaken the security functionality in there.

    It does not need to be obvious — making the /dev/random's output slightly less random, for example, may reduce the time it takes to tap an ssh or ssl connection with this host from many years down to days. Same goes for PGP-keys generated on the affected host... Nor does it need to involve blatant coercion — the committer may simply receive a patch by e-mail with a fix to some other bug or an improvement, and fail to spot the weakening.
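
    The /dev/random point can be made concrete. Suppose a weakened RNG only ever produces 2^16 distinct seeds: a key derived from it falls to brute force in moments, where a true 128-bit key would not. A toy sketch (the keystream construction and seed here are hypothetical stand-ins, and the attacker is assumed to know a sample plaintext):

    ```python
    import hashlib

    def keystream(seed: int, n: int) -> bytes:
        """Toy keystream derived from a seed; stands in for a key from a weak RNG."""
        return hashlib.sha256(seed.to_bytes(4, "big")).digest()[:n]

    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    secret_seed = 31337            # the victim's "random" seed, unknown to the attacker
    plaintext = b"hello world"     # a message the attacker happens to know (a crib)
    ciphertext = xor_bytes(plaintext, keystream(secret_seed, len(plaintext)))

    # If the weakened RNG can only emit 2**16 seeds, trying every one is trivial.
    recovered = next(
        guess for guess in range(2**16)
        if xor_bytes(ciphertext, keystream(guess, len(ciphertext))) == plaintext
    )
    assert recovered == secret_seed
    ```

    The same loop against a genuinely 128-bit seed space would take longer than the age of the universe, which is exactly why subtly shrinking the entropy is such an attractive form of sabotage.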

    It could, in fact, have already been done years ago for all we know. Who knows, if this little problem [slashdot.org] was not deliberately introduced? And even if it was not — who knows, whether various security agencies exploited it from 2006 to 2013 the way Alan Turing et al exploited mistakes of the German radio-operators during WW2 [wikipedia.org]?

    Is it easier to plant a backdoor into an open-source project than a closed-source one — and keep it there for a useful period of time? I'm not at all sure what I'd bet on, to be perfectly honest. Both can be done and, by all appearances, both have been done...

    • by PPH ( 736903 )

      Is it easier to plant a backdoor into an open-source project than a closed-source one — and keep it there for a useful period of time?

      That's a good question. The methods used would necessarily be different. Keeping it there would depend on delaying its discovery and inhibiting its repair, once found. Leaving the discovery issue aside for the moment (number of eyes on the code, etc.), it is much easier to prevent the removal of a back door when the code base is owned by a private organization with identifiable representatives. Should the NSA lean on both the Microsoft and Linux communities to maintain an exploit, Microsoft can be pressured

      • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Friday December 06, 2013 @01:04PM (#45619723) Homepage Journal

        it is much easier to prevent the removal of a back door when the code base is owned by a private organization with identifiable representatives

        Linux (and BSD) committers are just as identifiable. Although the codebase is open to all, very few people go through it. If it follows the documented coding style, compiles, and "works", there is simply no reason — for most people — to keep reviewing it. The Debian hole [slashdot.org] I cited earlier remained open from 2006 to 2013 — more years than Turing spent working on Enigma.

        In the Linux community, being international, such pressure would be more difficult to apply.

        Maybe, but I would not count on it. Which country would you consider unlikely to cooperate with the US on such a matter — without itself being an even greater threat to liberty (like China or Cuba)? The entire Western world's spooks cooperate with the US. As does Russia [whitehouse.gov] — to some extent [dailymail.co.uk], at least. Who would not help their American colleagues in exchange for Americans helping them — a little? Someone like Sweden? Well, they did hit Assange with rape [wikipedia.org] charges when he made himself an overly tiresome nuisance to the Americans...

        It's interesting to note that Microsoft's antitrust settlement was negotiated and overseen by a member of the FISA court. The mandate to open APIs and source probably stopped short of revealing all the built-in back doors.

        In other words, Microsoft probably was coerced into it. A similar coercion — or conviction, or fooling — can be applied to an open-source project's participant. Whether it is easier or harder to do, I would not know.

    • People seem to miss that there are employees, in particular field-service employees, at all the major vendors who earn a nice second paycheck from three-letter agencies, and their employer is none the wiser.

      My dad spent a 30 year career in the finance and accounting area of one of the big defense contractors. His areas dealt with a lot of high security clearance stuff and there were always FBI and spooks in their office. They knew they were there, but they had no idea who the spook or agent was. They paid th

  • by t'mbert ( 301531 ) on Friday December 06, 2013 @10:51AM (#45618463)

    Let's face it: as far as we know, the door lock manufacturers also have a master key to all our houses. The schematics and design of the lock are not publicly available, and most people lack the skills to know if the schematics they are looking at are secure or not. It's the same with an OS. And while I *could* take the lock apart and figure out how it works, I still wouldn't know if my particular lock were secure or not, because I have not seen enough locks to know if this particular one is good or not.

    Anytime this condition arises, we replace our own lack of knowledge with trust in experts. We have to defer the judgement of security-worthiness to an expert we trust, in which case we are again disintermediated from knowing if the lock is actually secure or not. We all trust *someone* with very specific knowledge to help us make decisions, whether that be medical, scientific, security or otherwise, and in each of those cases, we can find examples of where the expert has let us down.

    • by whoever57 ( 658626 ) on Friday December 06, 2013 @12:53PM (#45619625) Journal

      Let's face it: as far as we know, the door lock manufacturers also have a master key to all our houses. The schematics and design of the lock are not publicly available, and most people lack the skills to know if the schematics they are looking at are secure or not.

      Flawed comparison. In fact, locks are much more like open-source software.

      Locks can be disassembled and people can review the design. Much like open source software, most people would not be able to tell if a lock design was secure, but enough independent experts can disassemble a lock and review its security.

      Yes, you are reliant on experts for the truth about lock security, but you are not reliant solely on the manufacturer's assertions, which is the case with closed-source software.

    • Why would one trust a lock manufacturer? It's because earning and maintaining our trust serves the manufacturer's commercial interest.

      In that vein, I recently dumped TrueCrypt for a similar commercial product. I don't honestly know which of the two is more trustworthy. I suppose I could audit the code for TrueCrypt myself, but I'm not qualified to do that. Or I could trust auditors with the upcoming TrueCrypt audit, whenever that happens. Or I could buy a commercial product and trust the vendor. The l

    • Pin tumbler locks are actually very simple devices. They're quite easy to disassemble. You can physically inspect the pins and see that there are only regular bottom pins (no master pins). That being said, they're extremely easy to pick. I saw a locksmith pick one (the kind you find on your front door) in seconds with a pick gun (it "bounces" the pins to the shear line). On a separate note, I'm posting this purely for technical interest (/. is news for nerds after all). I do agree with your argument
  • It really is arrogant of the FSF to imply that a user trusting one or a small group of individuals running an open-source project is somehow better off and more secure than one trusting Microsoft.

    Unless a user audits the code, compiles the code (with a known to be good compiler) and manages all elements of the server and routing, there is NO assurance of security or privacy. And never mind the fact that few users even compile from source anymore.

    Offtopic: why am I being sent to the beta site to post comments? Very annoyin

    • Arrogant, but accurate. Sounds like the FSF to me.

    • by whoever57 ( 658626 ) on Friday December 06, 2013 @12:59PM (#45619685) Journal

      Unless a user audits the code, compiles the code (with a known to be good compiler) and manages all elements of the server and routing, there is NO assurance of security or privacy. And never mind the fact that few users even compile from source anymore.

      Security isn't a binary function. Open source is more secure than closed source because many independent people can download the source and review it, many people can build binaries, etc.

      • because many independent people can download the source and review it, many people can build binaries, etc..

        Yes, and I can eat ice cream every day too, but then I would get fat. Just because code may be available does not mean it has been fully audited, or even that the auditors understand not only the code but programming in language X with system calls Y well enough to know what is a security risk. How many people working on a project even know the codebase beyond their particular chunk? How many are capable e

  • by cyberchondriac ( 456626 ) on Friday December 06, 2013 @12:53PM (#45619621) Journal

    A lock on your own house to which you do not have the master key is not a security system, it is a jail.

    I get his overall point regarding source, I do, and I agree; but it would help his case if he didn't use such broken analogies. If I have a key, and the landlord has a master key, it does not mean I'm in "jail"; he's not going to lock me into my own home because I have a key of my own, just not a master key. It's just that the landlord can get into my home too. It's more like easy-peasy burglary, but "jail" was a rather stupid way to put it.

  • by jafac ( 1449 ) on Friday December 06, 2013 @01:41PM (#45620087) Homepage

    If this NSA kerfuffle has amounted to anything, it is a validation of the idea that "security through obscurity" is as invalid as we've all been told since the 1980s.

    • by Altanar ( 56809 )
      No, if this NSA kerfuffle has amounted to anything, it's that open-source software and open standards only give users a false sense of security if no one is willing to audit the software. See: the Dual_EC_DRBG algorithm.
