
Cryptography 'Becoming Less Important,' Adi Shamir Says 250

Posted by Soulskill
from the encryption-is-useless-when-users-are-also-useless dept.
Trailrunner7 writes "In the current climate of continuous attacks and intrusions by APT (advanced persistent threat) crews, government-sponsored groups and other organizations, cryptography is becoming less and less important, one of the fathers of public-key cryptography said Tuesday. Adi Shamir, who helped design the original RSA algorithm, said that security experts should be preparing for a 'post-cryptography' world. 'I definitely believe that cryptography is becoming less important. In effect, even the most secure computer systems in the most isolated locations have been penetrated over the last couple of years by a series of APTs and other advanced attacks,' Shamir said during the Cryptographers' Panel session at the RSA Conference today. 'We should rethink how we protect ourselves. Traditionally we have thought about two lines of defense. The first was to prevent the insertion of the APT with antivirus and other defenses. The second was to detect the activity of the APT once it's there. But recent history has shown us that the APT can survive both of these defenses and operate for several years.'"

Comments Filter:
  • He put the S in RSA (Score:5, Interesting)

    by Anonymous Coward on Tuesday February 26, 2013 @09:03PM (#43020449)

    Without him, it'd just be RA, which isn't even RAD.

  • no (Score:5, Insightful)

    by masternerdguy (2468142) on Tuesday February 26, 2013 @09:04PM (#43020453)
Encryption is the best anti-tampering mechanism you have in computing. Well-placed encryption protects OS data from tampering, protects user data from theft, and keeps sensitive communications secure. It's only getting more important.
    • Re:no (Score:5, Insightful)

      by happylight (600739) on Tuesday February 26, 2013 @09:24PM (#43020597)
I think the point is that no encryption is going to protect you from users installing malware, running buggy software, or just plain handing over data unknowingly. Next to no attackers attack the cryptography itself. The weakest link is always somewhere else.
      • Re:no (Score:5, Insightful)

        by masternerdguy (2468142) on Tuesday February 26, 2013 @09:28PM (#43020621)
Crypto is part of a full solution that also includes proper segregation of permissions, proper segregation of user data / accounts, proper firewall configuration, proper software configuration, patching vulnerabilities, malware detection (lots of solutions on Windows, chkrootkit on Linux), and user education. If I forgot anything, add it to the list.
        • Re:no (Score:5, Insightful)

          by swilde23 (874551) on Tuesday February 26, 2013 @10:08PM (#43020849) Journal

          user education should be printed in all caps, bold, underlined, comic sans, etc...

At some point, unless we develop new algorithms that utterly break how current encryption algorithms behave (which, I know, is a possibility... and of course the NSA has it already), your weakest point is not going to be the computer. It's going to be the lackey at the front desk happily letting a "tech" in (physically or electronically).

          • Re:no (Score:5, Insightful)

            by demonlapin (527802) on Tuesday February 26, 2013 @11:32PM (#43021279) Homepage Journal
            This is true but unfortunately irrelevant. You can do all the user education in the world and it means nothing if the IT staff are idiots.

I have a handful of fairly secure passwords. They're reasonably long, are incredibly easy for me to memorize, and don't rely on any details of my life (pets, wife, kids, birthday, etc.). But I have to deal with websites that demand a series of ridiculous standards: some require (thank you, AmEx) a number in the username; some require passwords to have a number, a capital letter, and a symbol. I spent a lot of damned time figuring out a password that people can't guess, and I can't use it because I can't remember the rules for any random website, so I have to get a password reset email sent to me in plaintext. On top of that, I can't use a password I've used before, so every time I log into a website I rarely use, I have to reset the password to something I will forget in a few days. I'd use something like KeePass, but I need to be able to log in from non-home computers.
            • by dbIII (701233)
              Means nothing if the IT policy is set by idiots.
            • Re:no (Score:4, Interesting)

              by PReDiToR (687141) on Wednesday February 27, 2013 @03:24AM (#43022253) Homepage Journal
              You're looking for Password Hasher [wijjo.com] and if you're not on your own computer the demo [wijjo.com] page will work in (nearly) any browser.

              In case you (or someone else) don't click it: if you use your UID as the passphrase and "slashdot" as the site tag, you get "i0+v+dXNbzPpvpW177NeV9eYnK" at my default settings of 26 characters, upper, lower, numbers and symbols.
              All that for remembering just your UID. How simple is that?
              To bump it up and alter the password completely when you change it there is a button that will change "slashdot" to "slashdot:1" - a change that is remembered by your browser, or can be written in a text file as a reminder because that isn't sensitive information.

              This is not perfect security but it would go a long way to making identity theft and account hijacking harder if everyone showed their mother and their kids how to use this simple piece of code. They could go on using that one stupid password that is the only thing they can remember but be secure from rainbow tables and GPUs for a few years.
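The derivation scheme described above can be sketched in a few lines of Python. This is a hypothetical analogue of the idea, not the actual Password Hasher code: one master passphrase plus a per-site tag is hashed into a site-specific password, so bumping the tag ("slashdot" to "slashdot:1") produces a completely different result.

```python
import hashlib
import hmac
import string

# Illustrative sketch only -- NOT the actual Password Hasher algorithm.
# A 64-character alphabet divides 256 evenly, so mapping digest bytes
# into it introduces no bias.
ALPHABET = string.ascii_letters + string.digits + "+/"

def site_password(master: str, site_tag: str, length: int = 26) -> str:
    """Derive a per-site password from a master passphrase and site tag."""
    digest = hmac.new(master.encode(), site_tag.encode(),
                      hashlib.sha256).digest()
    return "".join(ALPHABET[b % len(ALPHABET)] for b in digest[:length])
```

Because the derivation is deterministic, nothing needs to be stored; the non-sensitive site tag is the only thing worth writing down.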
          • by grumbel (592662)

            user education should be printed in all caps, bold, underlined, comic sans, etc...

            If the user can break the OS, then the OS wasn't secure enough. It should be completely impossible to get the OS into a state where it's unrecoverable or unverifiable. If the OS fails at that, blame the OS, not the user; maybe then we'll get some progress in computer security.

          • Re: (Score:3, Interesting)

            by tlhIngan (30335)

            user education should be printed in all caps, bold, underlined, comic sans, etc...

            At some point, unless we develop new algorithms that utterly break how current encryption algorithms behave (which I know I know, is a possibility... and of course the NSA has it already)... your weakest point is not going to be the computer. It's going to be the lackey at the front-desk happily letting a "tech" in (physically or electronically)

            First off, any security system designed should account for Dancing Pigs [wikipedia.org] in which se

        • Re:no (Score:4, Insightful)

          by happylight (600739) on Tuesday February 26, 2013 @10:36PM (#43021009)

          What you're referring to is more often called information security. Cryptography usually just refers to the methods, algorithms, and protocols for securing data.

          But there's little point in arguing the semantics of words. I think we can all agree the human factor plays a large part in almost all intrusions these days.

      • by ceoyoyo (59147)

        The weakest link is the encryption if you don't have any.

        Encryption has just become so important, and so good, that attackers are forced to look elsewhere.

        • When you build a better safe, there will always be a better safecracker. Same applies to encryption.
          • Re:no (Score:4, Insightful)

            by tsotha (720379) on Wednesday February 27, 2013 @12:51AM (#43021679)

            Actually, a safe is a good analogy. Nobody actually "cracks" a safe any more, in the same way criminals don't gain access to your computer by cracking your crypto. Safes are blown open, battered open, cut open, or subjected to some fancy chemical attack. But modern high-quality combination locks are impervious to the guy with nimble fingers and a stethoscope (which is a Hollywood thing anyway).

            Installing a rootkit from an email is roughly analogous to having your safe opened because you put the combination in the top right drawer of the adjacent desk.

      • So if you know the information, the enemy can find it out through you.

      • Re:no (Score:5, Interesting)

        by grumbel (592662) <grumbel@gmx.de> on Wednesday February 27, 2013 @12:41AM (#43021619) Homepage

        I think the point is no encryption is going to protect you from users installing malware, buggy software, or just plain hand over data unknowingly.

        That's a problem of the extremely fragile design of current-day OSes. Stuff a user installs should simply never have the right to do any damage. Just as an HTML app is strictly sandboxed and can't access your whole HDD, so should a native executable be. You don't really have to worry about malware when it's locked up in a sandbox and can't even modify itself.

        To make quick Unix example of how things should work:

        Wrong way: sed "s/foo/bar/" file
        Right way: cat file | sed "s/foo/bar/"

        In the first one, 'sed' has all the rights the user has and can do whatever it wants behind the user's back. In the second case, 'sed' needs absolutely no rights at all aside from being able to read stdin, and could be completely sandboxed away. It's 'cat' that has the right to access the user's files and pass the data down the line to other programs. Thus instead of having dozens or hundreds of apps with file access, you have just one. Similar concepts can be adopted in the GUI easily, where the file dialog (the GUI's 'cat' equivalent) becomes part of the OS instead of the application.

      • Re:no (Score:5, Interesting)

        by hairyfeet (841228) <bassbeast1968@gma i l . com> on Wednesday February 27, 2013 @03:01AM (#43022193) Journal

        Exactly. It's like how a friend of mine was nearly fired because he wouldn't let a PHB have his "files" from his "friend" Melissa; yep, the moron was threatening to fire him if he didn't let a worm loose on the network. Lucky for Glenn, the guy above the PHB wasn't an idiot and actually kept up on current events, so he just said "Is he talking about the worm that's going around?" and then gave Glenn a free steak dinner while reading the PHB the riot act for trying to compromise security for an imaginary girl.

        At the end of the day you just can't protect against a case of the stupids, you just can't. I was quite proud of having an unbroken record, nothing but happy customers and well-running systems, until I finally had to throw a customer out of the shop and threaten to call the cops. Why? Because this was right after Limewire had been shut down. I told him flat out, "The courts shut Limewire down; it doesn't exist, and anything that says it's Limewire is either worthless or a malware-laden fake." So guess what he did? Promptly went home, downloaded "the new Limewire," and then demanded I fix the machine for free because... shock... it was nothing but a bunch of malware with the Limewire logo. When I threw him out of the shop he was saying "It says it's Limewire, now you make it work!"

        Sadly there is only so much you can do without turning the system into nothing but a locked-down, corporate-controlled thin client, and as long as the user has the right to install, you are at the whims of somebody who may be a moron. I learned you do the best you can, but at the end of the day stupid is as stupid does.

    • by ls671 (1122017)

      Encryption is the best anti-tampering mechanism you have in computing

      Let me disagree; the best anti-tampering mechanism is checksums, taken preferably via remote access to the file system from a highly protected host. md5sum and the like are your friends for finding zero-day rootkits.

      Note that this is in line with what Adi Shamir is trying to warn us about.

      • by Clarious (1177725)

        MD5 isn't that secure, and AFAIK SHA-1 usage is not recommended due to near-future threats too. The system you referred to is just the same as the one on my laptop, with the TPM chip as the 'highly protected host'.

        • Re:no (Score:4, Insightful)

          by ls671 (1122017) on Tuesday February 26, 2013 @10:45PM (#43021047) Homepage

          I don't give a damn about how secure it is; I could even use CRC-32 if the snapshot takes too long. The idea is only to be alerted about unexpected file changes, especially to system executables like top, login, w, etc., but you should look wider.

          1) Take periodic checksums
          2) Have differences reported
          3) If they don't match documented updates, you have an intruder.

          That's why it is recommended to run the checksum program from a secluded host because the rootkit hopefully won't have had a chance to get at the checksum program on the secluded host. View that host as the ultimate secured host in a good rsync backup strategy, the CA in a good PKI strategy, etc...

          It used to be common practice in the old days to take periodic checksums to detect intrusion into systems. Now, with all the fancy IDS solutions around, it seems to be less used but I do not see anything that really replaces it yet.

          • by ls671 (1122017)

            Easy analogy: In spy movies, they put a tiny piece of something between the door frame and the door when leaving. If not there when back, then you have an intruder.

            Same basic principle.

          • You are missing the point of the checksum. In a situation where random error or corruption might happen, then sure, CRC is fine because it will most likely catch the change. In an adversarial setting, if someone changes any of your files maliciously, they can EASILY craft the change so it matches the old checksum. That is the point of cryptographic hashes: it is very difficult to find two things that hash to the same value. If you realize that, then you have to use at least SHA-1 because it is relatively easy righ
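How easily CRC-32 can be defeated is worth illustrating. CRC is linear, not cryptographic; one well-known consequence is that appending a message's own CRC-32 (little-endian) drives the checksum of the extended message to a fixed residue value, so any two messages can be padded to collide. A small sketch:

```python
import struct
import zlib

def pad_to_fixed_crc(msg: bytes) -> bytes:
    """Append the message's own CRC-32; the result's CRC is a constant
    residue regardless of the message, so all padded messages collide."""
    return msg + struct.pack("<I", zlib.crc32(msg))

honest = pad_to_fixed_crc(b"pay $10 to Alice")
forged = pad_to_fixed_crc(b"pay $9999 to Mallory")
assert zlib.crc32(honest) == zlib.crc32(forged)  # identical checksums
```

Real attacks are even more flexible: CRC's linearity lets an attacker patch any four bytes of a file to hit any chosen target checksum, which is exactly what a cryptographic hash is designed to prevent.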
      • by the_B0fh (208483)

        bah. 15 years ago, there was a post on BugTraq about this internet mapping that someone was doing. The systems were running redhat, everything was locked down, tripwired, only thing running was ssh, and it required certs to get in.

        The guy felt something was wrong. Compared tripwire hashes to what was on the disk. Everything looked good. lsmod, ps -ef, netstat -an, everything looked kosher.

        A couple of days later, he decided to take the system down anyway, and run an offline tripwire. Found shit.

        Can yo

        • by jythie (914043)
          keyboard?
        • by tibman (623933)

          15 years ago there were bugs in ssh?

          • Re:no (Score:5, Informative)

            by the_B0fh (208483) on Wednesday February 27, 2013 @03:17AM (#43022235) Homepage

            No. They finally tracked it down. They watched the guy come in and take over the box again. He got in and owned the box in 8 seconds.

            The hacker found an old samba server in Australia (version 0.5 or some such), took it over. Used that to remotely mount the windows desktop used by the researchers in Japan.

            Found the private cert/key on the windows box. Used that to ssh in to the linux server. Ran a zero day gnome exploit and took it over.

            After taking over the server, he installed two kernel modules that hid themselves, trapped certain calls like the ones used by tripwire (basically returning true for all of tripwire's operations), and removed themselves from the modules list and the process list.

            damned cool hack, and that was 15 years ago!

      • by a_hanso (1891616)

        Exactly. Securing the data is not much use if the programs accessing that data are compromised. If the encryption program is conning you into thinking that your data has been securely encrypted, you're screwed. I'm not an expert in this area, but I don't know why this approach is not more widespread.

  • APT (Score:5, Insightful)

    by Anonymous Coward on Tuesday February 26, 2013 @09:07PM (#43020473)

    Would have been nice to define APT...

  • by Omnifarious (11933) <eric-slash AT omnifarious DOT org> on Tuesday February 26, 2013 @09:13PM (#43020527) Homepage Journal

    If you're trying to protect your big organization against foreign spies, yes. If you are a little guy who wants to communicate without having that communication be laid wide open for a large organization to see, then I think encryption is still pretty useful. Even if just because managing all those separate unique intrusions over a long period of time requires a lot more resources than just tapping into a trunk line.

  • by Anonymous Coward on Tuesday February 26, 2013 @09:18PM (#43020559)

    I have a PC that I use for all of my financial stuff, record keeping, and other critical data. I don't encrypt the hard drive. I don't even password protect files.

    You know how I do security for the PC that handles my most critical data?

    It's not plugged into the fucking Internet. That's how.

    • by masternerdguy (2468142) on Tuesday February 26, 2013 @09:33PM (#43020645)
      Have fun when Joe the Burglar takes your computer.
    • by godel_56 (1287256)

      I have a PC that I use for all of my financial stuff, record keeping, and other critical data. I don't encrypt the hard drive. I don't even password protect files.

      You know how I do security for the PC that handles my most critical data?

      It's not plugged into the fucking Internet. That's how.

      And what do you rely on if your computer gets stolen? How about if your computer suddenly craps out and you have to take it in for repair, and the repair shop has full access to all your files as soon as the power supply is fixed?

      • by cffrost (885375)

        How about if your computer suddenly craps out and you have to take it in for repair, and the repair shop has full access to all your files as soon as the power supply is fixed?

        Why would somebody (particularly somebody who posts on Slashdot) haul the entire machine to a repair shop to replace a dead PSU? Five minutes with a Phillips-head screwdriver and a replacement PSU — done.

    • by CRCulver (715279)
      If you move records from an internet-connected computer to this isolated computer via a removable drive, you may still be susceptible to attack. After all, Stuxnet and other viruses have spread this way. Viruses were already a problem for PC users long before network-connected devices. And even if the computer is totally isolated from both networks and USB drives, the data can still be compromised through a TEMPEST attack (assuming you were a target for a state or an especially savvy organized crime network).
    • by swilde23 (874551) on Tuesday February 26, 2013 @10:04PM (#43020833) Journal

      I think what most of the people responding to this post aren't realizing (or acknowledging) is that your security needs to be appropriate for the data it's protecting.

      If we're talking about a corporation's backbone, then yeah, saying "it's not connected to the internet" isn't acceptable.

      If instead we're talking about some John Doe's personal data, then you aren't going to be attacked in the same way. Keeping it on a drive that has no internet access is probably good enough.

      • by jythie (914043)
        Not only that, but people are not really taking the APT element into account. The security that is appropriate for a computer sitting around that no one knows about is pretty different from the security useful when you face a targeted attack by a motivated entity. Even if you are just some random individual, a persistent attacker would probably do things like break into your house...
  • Translation: (Score:4, Insightful)

    by gman003 (1693318) on Tuesday February 26, 2013 @09:51PM (#43020761)

    Encryption doesn't do shit if they're grabbing it before encryption or after decryption. It's not a magic security bullet. It has its uses, but now it's become easier for Eve to hack Alice and read the plaintext than to intercept and brute-force the ciphertext. And when Alice is talking to not just Bob, but Carol and Dave, well, that makes Alice a high-value target worth spending time on.

    • by dbIII (701233)
      As an example of the above, the disturbing trend of using proxies on encrypted web traffic is opening a lot of people up for a full man in the middle attack where encryption doesn't matter. When people start getting used to accepting certs for a proxy they can be more easily tricked into passing things through a rogue proxy that is put in place for harvesting credit card details or whatever. Combine it with an open wireless access point in a busy area, or spoofed SSIDs for a more targeted approach, and yo
  • Perhaps it's really just that encryption is a lesser part of the total solution, so in that respect, it's relatively less important than it used to be.

    Now get that meat off of my cyberlawn!
  • by qbitslayer (2567421) on Tuesday February 26, 2013 @10:12PM (#43020883)

    The use of encryption is only intended to provide a way for legitimate remote users to gain supervised access to the system without having to hack into it. The real culprit behind bad security is software reliability. Attackers look for and try to exploit the defects in the software. Why is software defective? Because (it's the bugs, stupid!) the Turing/Von Neumann model of computing is inherently insecure and unreliable. Why? Because timing is not an essential part of the model. I predict that this decade will see the end of the Turing madness [blogspot.com] and that the future of computing is non-algorithmic [blogspot.com]. There is no alternative and the sooner, the better.

    • I've read the links, and that is an awfully long bow to draw.

      The use of encryption is to try and limit information to those that are intended to see it.

      None of the ideas on your blogs address how to "end the Turing madness" in a way that will still allow you to post on Slashdot.

  • certificates (Score:4, Insightful)

    by manu0601 (2221348) on Tuesday February 26, 2013 @10:26PM (#43020965)

    From TFA

    One way to help shore up defenses would be to improve--or replace--the existing certificate authority infrastructure, the panelists said

    Indeed. IMO SSL public keys could be stored in DNSSEC-protected DNS records. That way one would only have to trust the manager of the root zone and the TLD, which would be a good improvement compared to the CA debacle.
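This idea exists in standardized form as DANE/TLSA (RFC 6698): a DNSSEC-signed TLSA record carries "certificate association data" that the server's presented certificate must match. A minimal sketch of the client-side check, assuming selector 0 (full certificate) and matching type 1 (SHA-256 of the certificate), with made-up bytes standing in for a real DER-encoded certificate:

```python
import hashlib

def tlsa_matches(cert_der: bytes, record_hex: str) -> bool:
    """Check a presented certificate against the SHA-256 hex digest
    published in a DNSSEC-signed TLSA record (matching type 1)."""
    return hashlib.sha256(cert_der).hexdigest() == record_hex.lower()
```

A client that trusts the signed DNS answer can then accept the certificate on the strength of the zone's DNSSEC chain rather than a CA's signature.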

  • Why can't you build a system to monitor and defend against attacks? Once a virus gains control of your system, it is quite easy to find and remove based on file signatures (time installed, etc.). If you know what you have and something changes, you should be able to identify it. It would be easy to identify attacks on a network when things go outside the norm. "Well, let's see, somebody opened up a bunch of ports and is transferring files to some random IP in X country that isn't on my list of recently accessed h

  • Another reason it could become less important is if the area becomes a patent [theregister.co.uk] minefield. Maybe math is not patentable, or shouldn't be (but even natural genes get patented), yet there are enough borders around it that could be used as an excuse, and that could be a tool to force the use of only "approved" encryption methods.
  • Governments are trying to follow all our steps over the internet, intercepting and parsing everything we do. Encrypting our communications, and trying to encrypt everything, is the surest way to keep the Internet free for us all.
    • by compro01 (777531)

      What he's saying is that encryption doesn't matter if there's shit in the system grabbing your stuff before the encryption or after the decryption.

  • by Animats (122034) on Tuesday February 26, 2013 @11:28PM (#43021267) Homepage

    I suspect he's just fed up with the state of software security, which is appallingly bad. We now have patch-and-release on everything. This turns out to be a failed strategy against competent attackers.

    I used to work on secure microkernels in the 1980s. I thought that by now we'd have provably secure microkernels in ROM with a mandatory security model enforced. Systems like that have been built a few times for the three-letter agencies, but never went mainstream. Instead, we have bloated operating systems with a high churn rate, and far too much trusted software per system.

    Ballmer used to call this "strategic complexity". As he once put it, when asked why Microsoft kept adding functions to Windows: "If we stopped adding functions to Windows, it would become a commodity, like a BIOS. And Microsoft is not in the BIOS business."

    Most applications should be running with far fewer privileges than they have. But if they were locked down properly, their ad tracking, update checking, and self-modification wouldn't work. The user would actually be in charge.

    Cryptography only provides a secure way to communicate between secure regions. If there are few or no secure regions, it doesn't help much.

  • Upon reflection, and not surprisingly, the expert has made a good point.

    If due to an Advanced Persistent Threat (APT), your secret data was captured after it was decoded (as it must be to be actively used, or created, or transferred, at some point) or if the private keys are compromised (either due to torture, pressure on appropriate authorities, or captured as created (see above)) the benefit(s) of encryption are greatly reduced (even if the cryptosystem itself is very secure).

    It is a bit of a chilling tho

  • I do not agree! (Score:5, Insightful)

    by endus (698588) on Wednesday February 27, 2013 @12:21AM (#43021493)

    I was just having a discussion about this at work today. Encryption should be ubiquitous now. There is no excuse. It's not "free" in terms of the resources it takes up, but it's pretty close. Everything should be encrypted in transit. Everything should be encrypted at rest. "Well, you mean the table with the PII and not..." NO! I mean EVERYTHING. The server's drives should be encrypted. The entire database should be encrypted. Every network connection should be encrypted.

    This doesn't mean encryption is a panacea for APTs or any other security threat, but it's an absolutely critical layer which is still not widely enough implemented. To prevent tampering, to prevent certain types of attacks, to prevent breaches through physical theft, etc. Saying encryption isn't as important anymore is like saying that keyboards aren't that important anymore. Sure, management shouldn't spend a lot of time worrying about them and should be focusing on other problems instead... but that doesn't mean everything will be cool if everyone's keyboard is stolen overnight.

    It needs to be there, and by there I mean everywhere. And it's not. Every day developers are looking at security guys like, "huh??" because the security guys are looking for encryption to be incorporated into the product. Or they want to "just get the system built out" without encryption, but they'll totally enable it once everything is working perfectly and all the testing is done (FYI developers, security guys aren't falling for that; we realize that you really mean, 'we'll think about enabling it until we realize how many things it will break, and then we'll ship the product without it, ignoring the enormous liability it creates'). You would think things would be different now that it's 2013... they are different, but not that much different. Security still isn't regarded as a core piece, or even an important feature, of most products.

    • Re:I do not agree! (Score:4, Insightful)

      by DigitAl56K (805623) on Wednesday February 27, 2013 @02:57AM (#43022181)

      I am a proponent of more widescale use of encryption, but I am against the braindead application of it that you seem to advocate. As has been called out time and time again, how the encryption is applied is critical to whether it fulfills its role. It's easy to get it so wrong in practice that all you provide your users is a false sense of security that encourages them to put more highly sensitive material at risk than they otherwise would have. Then there are other considerations. Once you bring encryption into the fold on every single aspect of your product, how easy is it to test and debug? Is your time to market now twice as long because you have to develop special QA tools rather than use something off the shelf? Is the data actually sensitive? What are these "tampering" and "certain types of attacks" that this encryption is going to protect against? Do you and your team actually understand what they are? If you don't, how do you know the encryption scheme you're using protects against them? What about export restrictions? Where does your product need to be distributed? Does your encryption help at all if your servers are rooted, since they can presumably decrypt all the data anyway? Is the encryption giving you a false sense of security around your customers' data? If everything your product does is encrypted, are your customers going to be happy about their ability to implement compatible products? How can customers trust and validate your product if they can't see how it works?

      "Encrypt everything" isn't a very well thought-out plan.

      • by endus (698588)

        "Getting it wrong" in the implementation stage is a function of developers not viewing security as part of their job. I'm not saying that we can eliminate mistakes and develop perfect code every time, but you have to try. The more experience developers get with implementing it and the more universal it becomes, the fewer mistakes they're going to make when implementing it. Right now, it seems to be regarded as a novelty by most developers.

        I don't buy the "false sense of security" argument at all, sorry.

    • Re:I do not agree! (Score:4, Insightful)

      by FireFury03 (653718) <slashdot@nexus[ ]org ['uk.' in gap]> on Wednesday February 27, 2013 @05:32AM (#43022559) Homepage

      The servers drive should be encrypted. The entire database should be encrypted.

      It's not so simple. A server requires the drive to be mounted (and therefore decryptable) in order to function. So from the time the server is powered up until the time it is powered down, the file system is vulnerable; for a server that is powered on all the time, this means the window of opportunity for an attacker is huge. It stops a burglar from getting at the data after they unplug the machine and walk off with it, but if your server is in a secure datacentre then this isn't such a big worry; the concern is the server being compromised *while it's running* (and, as mentioned, servers tend to be running all the time). This could happen either by a remote attack or by someone physically accessing the machine. So really, for an always-on server there often isn't a lot of point in encrypting it. Add to this that, unless you want to store the encryption key on the server itself (which rather defeats the point), you need to manually load it every time the server boots, which isn't great for failure resilience.

      And especially for I/O heavy servers (frequently the case with big DB servers), encryption is *not* free - it can have a significant performance hit.

      There are places where filesystem encryption on servers makes sense: where the encrypted filesystems are only mounted for brief periods of time. For example, a server that is performing remote backups of another server can retrieve the key from the machine it's backing up, mount the FS, do the backup, unmount the FS and wipe the local copy of the key. The window of opportunity for an attacker is relatively short there (the duration of the backup), although obviously if an attacker can load persistent malware on the machine they can have it capture the key when the backup next starts.

      • by endus (698588)

        I get the point about the I/O heavy servers.

        I don't agree with the always on server argument, though. Yes, it's not going to protect against many types of attacks, but it will protect against some and that is what's important. It's another layer. More importantly, it's a layer that is being increasingly asked for by customers whether or not any of us think it makes sense for a particular application. Building encryption in after the fact is an absolute nightmare and usually the costs and impact to produ

    • Re: (Score:3, Informative)

      by indymike (1604847)
      Security isn't a "core piece" because it is a pain in the ass for everyone but security people, and it is easy to defeat most of the time. If you get root, collecting keys and the salt for secure hashes becomes a lot easier. A good example is DRM: almost every crack comes from extracting keys. Most of the time, when you reach for encryption, your time would be better invested in, say, keeping software up to date, auditing user permissions, and doing other basic things that actually do have a big impact on real security.
  • by Eraesr (1629799) on Wednesday February 27, 2013 @08:37AM (#43023167) Homepage
    If there's some elite group of hackers who like to target high-profile websites and services and who can get past the most complex forms of encryption, does that automatically mean we shouldn't use encryption anymore? For all I know, at the very least, encryption will keep out the 13-year-old bedroom hackers who write VBScripts and call them a virus.

    Similar to me having MAC filtering enabled on my wireless router. I know MAC filtering won't keep out the determined hacker, but it will be enough of a blockade for some wannabe punk that thinks it's cool to spend a weekend trying to access insecure wifi routers. To keep out more advanced and experienced intruders, more is needed, but that's no reason for me to just open the gate to every laptop owner with half a braincell who bookmarked a "hacking 101" tutorial.
  • by opscure (1099393) on Wednesday February 27, 2013 @09:41AM (#43023521) Homepage
    The larger problem here is the motivation of software developers, white hats, and black hats. The developers, whether open source or proprietary, tend to code toward a particular functionality, and usually with deadlines. The white hats are performing a job function to the best of their ability, usually no more than 40-50 hours a week, in teams. The black hat, meanwhile, is playing a game or solving a puzzle for personal enjoyment. Now, I'm not saying that there is any weakness in any of the aforementioned groups, but when people do things for enjoyment, it tends to yield a higher chance of success, especially when the black hat needs only to find a single point of attack in a system that largely extends beyond the digital realm or the job functions of the software developer or the infosec ops.
