
Fujitsu Cracks Next-Gen Cryptography Standard

judgecorp writes "Fujitsu and partners have cracked a cryptogram which used 278-digit (923-bit) pairing-based cryptography. The technology was proposed as a next-generation standard, but Fujitsu cracked it at this key length in just over 148 days using 21 personal computers." Reader Thorfinn.au adds a snippet from Fujitsu's announcement of the break: "This was an extremely challenging problem, as it required several hundred times the computational power of the previous world record of 204 digits (676 bits). We were able to overcome this problem by making good use of various new technologies: a technique for optimizing parameter settings that uses computer algebra, a two-dimensional search algorithm extended from the linear search, efficient programming techniques to calculate the solution of an equation from a huge amount of data, and parallel programming technology that maximizes computer power."

Comments Filter:
  • Pretty Fast (Score:5, Insightful)

    by MyLongNickName ( 822545 ) on Tuesday June 19, 2012 @07:53AM (#40368543) Journal

    21 PCs * 148 days is around ten years of PC time. Not much in the grand scheme of things.

    • Re:Pretty Fast (Score:5, Informative)

      by SJHillman ( 1966756 ) on Tuesday June 19, 2012 @07:59AM (#40368627)

      Given a modest botnet of around 3000 hosts, this could be cracked in about a day.

      However, note that between the 21 PCs, there were 252 cores - an average of 12 cores per PC, so these desktop PCs were at least reasonably high-end even if still consumer technology.
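The arithmetic in this subthread is easy to check with a quick sketch (21 PCs and 148 days are from the summary; the 3000-host botnet is the parent's hypothetical):

```python
# Back-of-envelope check of the figures above.
pcs, days = 21, 148
pc_days = pcs * days                 # total machine-days of work: 3108
print(pc_days / 365)                 # ~8.5 machine-years ("around ten")

botnet_hosts = 3000                  # hypothetical botnet size from the parent
print(pc_days / botnet_hosts)        # ~1.04 days
```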

    • by Bigby ( 659157 )

      What does it take to crack 256-bit AES encryption?

      And answer in measurements of PCs or LOCs

      • Re:Pretty Fast (Score:5, Interesting)

        by Bengie ( 1121981 ) on Tuesday June 19, 2012 @08:31AM (#40369013)
        It is estimated that breaking AES-256 would take about 2^200 operations using the currently known public attacks.

        Hypothetical:
        1,000,000,000 computers (1 billion)
        1,000,000,000,000,000 ops/sec per computer (1 petaop/s)
        1,000,000,000,000,000,000,000,000 ops/sec total (1e24)

        2^200 ≈ 1.607e60 ops to break AES-256

        1.607e60 / (1e24 ops/sec * 60 sec * 60 min * 24 hr * 365 days)
        ≈ 5.1e28 years

        You would have to be quite dedicated and live a long time to break AES with current math/computers.
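The estimate above is straightforward to reproduce; a small sketch under the same assumptions (2^200 operations, a billion machines at a petaop per second each):

```python
# Reproduce the back-of-envelope AES-256 brute-force estimate above.
ops_needed = 2 ** 200                       # assumed attack cost from the parent
computers = 10 ** 9                         # one billion machines
ops_per_sec_each = 10 ** 15                 # one petaop per second each
total_ops_per_sec = computers * ops_per_sec_each   # 1e24 ops/sec
seconds_per_year = 60 * 60 * 24 * 365

years = ops_needed / (total_ops_per_sec * seconds_per_year)
print(f"{years:.3e} years")                 # on the order of 5.1e28 years
```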

        My cousin went through an advanced crypto class, and his teacher ran the math; it comes down to this: if you had an ideal computer (100% efficient) that consumed the absolute minimum amount of energy it takes to represent data under our current laws of physics, you would have to consume all of the heat energy in the entire Milky Way galaxy. Short of a major flaw in AES, no galaxy-bound computer can break AES.
        • Unless it uses brute-forcing and is correct on the first guess...
          Then the energy required is 'considerably less' than all the energy in the galaxy.
          On the other hand... if it is such strong encryption, why isn't everybody using it on everything?
          Why would we continue to use Blowfish / Twofish / SHA-1 (with a salt, if it isn't LinkedIn) / etc.?
          • Re:Pretty Fast (Score:5, Interesting)

            by Bengie ( 1121981 ) on Tuesday June 19, 2012 @08:54AM (#40369335)
            Twofish is decently faster than AES and still quite strong (Twofish was one of the five AES finalists and almost won), so it is a good alternative. SHA-1 is a hash, not a symmetric cipher.

            Unless it uses brute-forcing and is correct on the first guess...

            AES keys are typically randomly generated or based on a hash. AES is strong, so breaking the public key or password to get the AES key is always the best way to "break" AES, but it's really just a side-channel attack. That's not AES's fault.

            • by Anonymous Coward

              All current x86, many ARM, and other processors include AES hardware for encryption/decryption.

              • by Bengie ( 1121981 )
                I forgot about that part. All things being equal, Twofish is faster, but like you pointed out, AES is in the hardware, which makes it faster and lower-power.
          • Does everyone use randomized quick sort?
          • Why are we using AES then? Serpent is even better. It's just slower.

            Oh, that's why everyone isn't using AES, either. Speed vs strength. Sometimes speed wins out.

        • by RoboRay ( 735839 )

          That must be why "they" want us to change to a different technique.

          • Re:Pretty Fast (Score:5, Interesting)

            by Bengie ( 1121981 ) on Tuesday June 19, 2012 @09:03AM (#40369451)
            Most of the next gen cryptography is about public keys or hashes. AES is still effective, so the weakest link in the chain is going to be passwords or breakable public keys, which would allow an attacker to acquire the AES key during the hand-shake.

            One needs a safe way to transmit the AES key over a public network like the internet. Public-key encryption is very slow, but reasonably strong; AES is quite fast and extremely strong. Making asymmetric encryption strong is hard because the public key gives away information about the private key.
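The handshake described here can be sketched with textbook numbers: a toy RSA keypair carries a random symmetric session key across the wire. The primes (61, 53) and the one-byte "AES key" are illustration-sized only; nothing here is secure, and real systems use 2048-bit-plus moduli and a real cipher.

```python
import os

# Toy RSA keypair (textbook primes p=61, q=53; insecure, for illustration only).
p, q = 61, 53
n = p * q                      # 3233, the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (2753), via modular inverse

# "Handshake": pick a random session key small enough for the toy modulus,
# send it encrypted under the public key, recover it with the private key.
session_key = int.from_bytes(os.urandom(1), "big")  # stand-in for an AES key
ciphertext = pow(session_key, e, n)                 # what crosses the network
recovered = pow(ciphertext, d, n)
assert recovered == session_key                     # both sides share the key
```

An eavesdropper sees only `ciphertext`; recovering `session_key` from it without `d` is the hard problem the asymmetric layer rests on.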
        • by AmiMoJo ( 196126 )

          That is assuming you are going to brute force the entire key space, but usually you only need to brute force all possible passwords that could be entered up to a certain length. You might still be thwarted by a very long password, but something like 20 characters is very breakable with today's technology.

          Somewhere in a secret datacentre there is probably a big machine using GPUs to brute force passwords on AES encrypted data.

      • by Thiez ( 1281866 )

        Using brute force? On a conventional computer it would require more energy than is available in our galaxy. You can't even count up to 2^256.

    • by adisakp ( 705706 )

      21 PCs * 148 days is around ten years of PC time. Not much in the grand scheme of things.

      10 years of today's computer time. If Moore's law holds, 10 years from now, it will be 1 month (~36 days) of computer time. And in 10 more years, it will be 8.5 hours.

      The technology was proposed as a next-generation standard, which means it would take 3-5 years to be adopted as the standard. It'd be nice to have a standard that held up for more than another 3-5 years.
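The scaling in the parent works out as claimed, assuming the work halves every 18 months:

```python
# Moore's-law projection of the ~10 PC-years figure from the parent.
work_days = 10 * 365                  # ~10 years of single-PC work today
speedup_per_decade = 2 ** (10 / 1.5)  # doubling every 18 months: ~101x

after_10_years = work_days / speedup_per_decade            # ~36 days
after_20_years = after_10_years / speedup_per_decade * 24  # ~8.5 hours
print(after_10_years, after_20_years)
```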

  • by wkcole ( 644783 ) on Tuesday June 19, 2012 @07:55AM (#40368565)
    The real story is going to be how something with (apparently) severe weaknesses became anyone's pet new crypto standard.
    • Bad management decision?
    • Apparently it was an implementation weakness. The math may still be sound.
      • by wkcole ( 644783 ) on Tuesday June 19, 2012 @09:14AM (#40369565)
        Got a reference for that? The Fujitsu PR seems to say otherwise but it suffers from being a weak translation. (which raises the question: WTF is wrong with Fujitsu that they don't have anyone capable of well-written English???)
        • It's probably machine-translated from Japanese. Sadly, while the Japanese are doing some incredible research in very diverse areas, most of the translations are very rough and ready. If you want to criticise, though, how is your Japanese? ("nihongo wa dou desu ka?")

          • by wkcole ( 644783 )
            My Japanese is worthless, but I'm not a large multi-national industrial conglomerate with operations in Japan worth millions of dollars per year that would justify my time and/or money to actually learn Japanese or hire someone who can write a press release in Japanese fluently. If I had a need to issue press releases in Japanese, I'd at least have a native speaker read them to make sure my machine translator hadn't messed up.
            • by AmiMoJo ( 196126 )

              They probably never intended this to be a major international story. What it boils down to is that short keys are insecure, hardly a revelation. Something for the Japanese press only, with an English translation as an afterthought.

        • I work for Fujitsu. I'm a native English speaker, I've never been to Japan, and I can't speak/read/write a word of Japanese. However, there are 175K employees, so TFA is news to me too. The translation looks like a literal translation of a Japanese press release, as opposed to one written by someone with an understanding of the subject. As literal translations go, it's a pretty good one (you should see what the auto-translator dumps in my inbox).
    • by hlavac ( 914630 )
      "I'm afraid I can't do that, Dave."

      Conflict of interest. That's what you get when you want an algorithm used by everyone, so it must appear to be absolutely secure, but at the same time want it to be insecure for everyone else so you can read everything. You add a subtle vulnerability of a class you believe only you know about and can exploit, then push it as a standard. Then it turns out there are other smart guys out there. Oops. Or not, and you hit the jackpot.
      • by wkcole ( 644783 )

        Frankly, that's paranoid. I stopped trying to understand the deep math of leading-edge crypto some years ago as my brain calcified, but I understand enough of it to know that there's no need for intentional sabotage to explain vulnerabilities to innovative attack.

        My question is how *THIS* mechanism has survived as long as it has. I haven't looked at the math in depth, but the broad descriptions I've found make me expect that there must be far-better-than-brute-force attacks on it. This crack isn't the first.

    • by David Jao ( 2759 ) <djao@dominia.org> on Tuesday June 19, 2012 @04:39PM (#40376883) Homepage

      The real story is going to be how something with (apparently) severe weaknesses became anyone's pet new crypto standard.

      Oh my god, uninformed summary is uninformed. Please don't make it any worse with your (even more) uninformed comments.

      I'm a cryptography researcher specializing in pairing-based cryptography. I know this subject well. Here's the real deal. Pairing-based cryptography is just as (in)secure as RSA. Nobody goes around thinking that 923-bit RSA keys are secure. RSA is very widely used. (The current world record for an RSA break is 768 bits, but 1024-bit keys have been recommended against for a LONG time, and there are teams working on breaking 1024-bit RSA right now that expect to succeed within a few years.) Nobody really expected 923-bit pairing keys to be secure. Those keys are too short. It's nice that these researchers did this, and it's nice that we know exactly how hard it is to break a 923-bit key, but the only take-away lesson here is that short keys are insecure. It does not mean that pairing-based cryptography has "severe weaknesses" or that the whole concept of pairing-based cryptography is somehow insecure.

      I repeat: the key broken in this study was short. The study's conclusions are not very surprising or indicative of any weakness in the underlying protocol.

      Another gross misrepresentation in your comment is the insinuation that pairing-based cryptography is somehow anyone's "pet new crypto standard." The number of international standards documents dealing with pairings is exactly zero.

  • by Urd.Yggdrasil ( 1127899 ) on Tuesday June 19, 2012 @07:56AM (#40368579)
    This article makes very little sense to me. They don't mention what the crypto algorithm was or who was pushing it as the "next gen standard". I don't know of any proposed cryptographic standard with 923 bit anything.
    • by neonsignal ( 890658 ) on Tuesday June 19, 2012 @07:59AM (#40368619)

      The Fujitsu press release [fujitsu.com] is light on detail too.

    • Another dumb statement in the article:
      "Such a cryptanalysis would allow an attacker to counterfeit the authority of a system administrator, according to Fujitsu."

      If I had known you could counterfeit authority, I would be sitting behind a really big desk with the flag of my own country behind it...

      • by GiMP ( 10923 )

        Being somewhat (but not intimately) familiar with this cryptography methodology: what they're claiming to have done is break the equivalent of a signing-authority key. This is worse than with a CA in PKI, however, because this key can be used for encryption and decryption; it isn't only a signature used for validation/verification.

        Essentially, identity-based systems use a single "master key" which is used to create all the other keys, and which can be used to decrypt all of the messages encrypted with those keys.
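This is not real IBE (which requires pairings), but the trust model described here can be illustrated with an ordinary key-derivation sketch: whoever holds the master secret can re-derive, and therefore decrypt for, any identity. The email addresses are hypothetical.

```python
import hashlib
import hmac
import os

master_secret = os.urandom(32)   # the authority's "master key"

def user_key(identity: str) -> bytes:
    """Derive a per-user key from the master secret and the user's identity."""
    return hmac.new(master_secret, identity.encode(), hashlib.sha256).digest()

alice = user_key("alice@example.com")        # hypothetical identity
# The authority can always re-derive Alice's key, hence read her traffic.
assert user_key("alice@example.com") == alice
assert user_key("bob@example.com") != alice  # keys differ per identity
```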

      • by Nutria ( 679911 )

        So is P1363.3 an actual standard, or just a proposed standard?

        • The article summary had it correct with "was proposed as a next-generation standard"

          IEEE and potentially NIST (amongst others) were proposing it and/or looking at what applications it might be suitable for.

      • by wkcole ( 644783 )
        It is a bad sign that the IEEE paper linked from the Wikipedia page (http://grouper.ieee.org/groups/1363/IBC/material/P1363.3-D1-200805.pdf) didn't even get enough care from its authors to catch a serious typo in the title.
    • by vlm ( 69642 )

      They also carefully avoided describing whatever their new invention was.
      "it wasn't until this new way of approaching the problem was applied"
      What's the new way to approach the problem? Well, there isn't one in the article.

      As a press release it's almost the perfect opposite of science. Virtually unknown subject, so there's little common ground for discussion; no method or experimental details beyond the most flimsy; no conclusion; no verifiable prediction; no suggestion of future work. Other than that, it's a great anecdote.

      • Re: (Score:3, Funny)

        by SJHillman ( 1966756 )

        As a press release it's almost the perfect opposite of science.

        So, what you're saying is that Fujitsu used..... magic!

        • by jdgeorge ( 18767 )

          As a press release it's almost the perfect opposite of science.

          So, what you're saying is that Fujitsu used..... magic!

          More specifically, religion. A press release is marketing, after all.

    • by vlm ( 69642 ) on Tuesday June 19, 2012 @08:32AM (#40369031)

      "I don't know of any proposed cryptographic standard with 923 bit anything."

      Ha, I found it, purely by luck. First of all, assume the press release went through a journalism and PR filter, so it's almost entirely incorrect, other than that some numbers might not be.

      I remember reading a paper on implementing IDEA (a two-decade-old, semi-patent-unencumbered algorithm, because it's so old) on a Spartan FPGA, which I remember because I fool around with a Spartan dev board at home, and this is the kind of thing you find when you google for FPGA and various cryptosystem names. Anyway, that specific FPGA implementation of IDEA has a latency of... 923 cycles. So it's not 923 bits of anything; they're talking about a streaming cryptosystem that takes 923 cycles from when the first bit squirts in until that encrypted first bit squirts out, and the journalist filter rewrote it. That's low enough latency for high-bandwidth stuff like video, but not so good for voice or keyboard ssh unless you play some games (which is a whole other topic).

      Anyway, cracking a "mere" 128-bit sample in 148 days or whatever is still kind of interesting, even if it's not cracking an entire 923-bit system. The Landauer limit alone would imply they had to have cracked the algorithm, not just brute-forced it.

      http://www.cs.washington.edu/education/courses/cse590g/01sp/fccm00_idea1.pdf [washington.edu]

      http://en.wikipedia.org/wiki/International_Data_Encryption_Algorithm [wikipedia.org]

      • Re: (Score:3, Interesting)

        by cryptizard ( 2629853 )
        This is completely wrong. They are using a pairing-based cryptosystem, which you can think of as public key plus extra useful properties. The security of these schemes is based on the bilinear Diffie-Hellman assumption, which is very recent and has not been thoroughly tested. It is very likely that it is still secure, but at larger key sizes than previously thought.
        • by vlm ( 69642 )

          Could be; I'm just working from numerology. I can't find anything when googling for 923 bits, although I found the IDEA FPGA implementation easily enough, and I remember reading it in the past too. It's not "real crypto" if no one else knows about it. Or, another way to put it: everyone in the field knows that if no one knows the algorithm, then it's probably pretty insecure, so the surprise is...

          Maybe the journalist filter tried to pass 1024 bits through a metric-to-English translation system, resulting in 923 bits.

          • The key can be any size, just like in RSA. They likely chose 923 as a number they thought they could reasonably break. This is similar to the RSA contests where they try to find the largest semi-prime that they can factor (it is never a round number).
    • by Anonymous Coward

      They mention that this is a pairing based system. Unfortunately, the type of pairing isn't specified, and there are hundreds of pairing based schemes out there.

      For those not familiar with pairings and IBE, this quote from the fujitsu website may help:

      4 Pairing-based cryptography:

      A next-generation cryptography (proposed in 2001) based on a map called a pairing, which offers many useful functionalities that could not be achieved by previous public-key cryptography. The security of pairing-based cryptography is based on the intractability of the discrete logarithm problem (DLP): the problem of computing d such that a = g^d for given g and a.

      5 Identity-based encryption:

      A type of public-key encryption in which the public key of a user is some unique information about the identity of the user (e.g. a user's email address). Unlike former public-key cryptosystems, it does not require authentication of public keys.

      The year of 2001 here presumably refers to the Boneh-Franklin [wikipedia.org] scheme (paper on springer behind paywall [springerlink.com]), which also immediately showed the usefulness of pairing crypto by creating the first efficient identity-based encryption scheme (wiki). Basically, IBE is an
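The DLP from the glossary quote can be made concrete with a toy exhaustive search in a tiny prime field. The numbers (p=101, g=2) are toy values chosen for illustration; real pairing-based systems use groups of hundreds of bits, which is exactly why the 923-bit break took 21 machines 148 days.

```python
# Toy discrete-log search: find d with g**d % p == a by exhaustive search.
# Feasible only because the group is tiny; cost grows with the group order.
def dlog_bruteforce(g, a, p):
    x = 1
    for d in range(p - 1):
        if x == a:
            return d
        x = (x * g) % p
    return None

p, g = 101, 2          # small prime and a generator mod 101 (toy values)
d = 57
a = pow(g, d, p)       # the public value; d is the secret exponent
assert dlog_bruteforce(g, a, p) == d
```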

    • I surmise that "pairing-based cryptography" is just some weird new name someone dreamed up for public-key cryptography, as that uses algorithms that work on a pair of keys. Marketing and customers are so anxious to have the next big thing that marketing people routinely dress up old ideas in new terms, and customers eagerly latch on. Both often realize what they're doing, but they hope others, including journalists and government bureaucrats responsible for choosing and approving standards, are fooled.

      • by jmsp ( 1987118 )
        It's explained in the article notes, actually:

        Glossary and Notes
        (...)
        4 Pairing-based cryptography:
        A next-generation cryptography (proposed in 2001) based on a map called a pairing, which offers many useful functionalities that could not be achieved by previous public-key cryptography. The security of pairing-based cryptography is based on the intractability of the discrete logarithm problem (DLP): the problem of computing d such that a = g^d for given g and a.
    • I don't know of any proposed cryptographic standard with 923 bit anything.

      Suite B

  • Given how long it takes for something to go from 'new' to 'common' and from 'common' to 'deprecated' and from 'deprecated' to 'finally dead, thank god'(and, for the spooks out there, the fact that storage is cheap and certain decade or decades-old messages may still be interesting...), the idea that anything only a few powers of ten away from trivial crackability was even being considered seems like a Very Bad Thing.

    252 cores is pretty tiny by the standards of a reasonably motivated attacker. Aside from
    • Or, just rent a cloud server for a little while.
    • by kasperd ( 592156 )

      Given how long it takes for something to go from 'new' to 'common' and from 'common' to 'deprecated'

      Common and deprecated are not mutually exclusive. Something can stay common long after it has become deprecated. Seeing technologies remaining common for a decade after they became deprecated is one of the main reasons for the 'thank' in the 'finally dead, and thank god for that' state that you mentioned.

    • by jd ( 1658 )

      Trivial rule of thumb: Any encryption method, to not be considered excessively weak now, must not be considered excessively weak (expected lifespan of method + time information to remain sealed afterwards) years into the future, even after Moore's Law is taken into consideration.

      Thus, if you've a cypher that you expect to use for the next 20 years to protect data that will be under a 100-year rule before disclosure, it has to be resilient to attack for 120 years. Chip performance roughly doubles every 18 months.
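Under the parent's assumptions (roughly one security bit of margin lost per performance doubling, doublings every 18 months), the required margin for that example works out as:

```python
# Rough key-margin estimate implied by the parent's rule of thumb.
def extra_bits(lifetime_years, sealed_years, doubling_months=18):
    window_years = lifetime_years + sealed_years   # total survival window
    return window_years * 12 / doubling_months     # ~1 bit per doubling

# 20-year deployment protecting data sealed for 100 years: ~80 extra bits.
print(extra_bits(20, 100))  # 80.0
```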

  • by mister2au ( 1707664 ) on Tuesday June 19, 2012 @08:19AM (#40368843)

    NICT has an arguably better press release on the same work from the partnership; it goes into a little detail (which is better than almost none from Fujitsu).

    http://www.nict.go.jp/en/press/2012/06/18en-1.html [nict.go.jp]

  • Pairing based cryptography is a relatively new kind of crypto that can be thought of as public-key plus some extra useful properties (makes Identity Based Encryption [wikipedia.org] possible for instance). It does not say in the article which particular scheme they are using, but one of the big ones is Boneh-Franklin [wikipedia.org]. Just as the security of RSA is based on the hardness of factoring, most pairing schemes are based on the hardness of something called the Bilinear Diffie-Hellman problem.

    It may be tempting to deride th
