Encryption Security Supercomputing

Quantum Computing Not an Imminent Threat To Public Encryption

Bruce Schneier's latest blog entry points out an interesting analysis of how quantum computing will affect public encryption. The author takes a look at some of the mathematics involved with using a quantum computer to run a factoring algorithm, and makes some reasonable assumptions about the technological constraints faced by the developers of the technology. He concludes that while quantum computing could be a threat to modern encryption, it is not the dire emergency some researchers suggest.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Sunday March 23, 2008 @09:42AM (#22836360)
    Comment removed based on user account deletion
    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Sunday March 23, 2008 @09:45AM (#22836376)
      Comment removed based on user account deletion
      • I love that book. It's an absolutely great read, and just reading this one book, you get to join the elite club of people who aren't completely ignorant about cryptography (but you still don't get to actually be considered knowledgeable about it).

        Quantum computers being able to factor big numbers is the best proof I've seen that factoring is not NP complete. If it were, we could just use these futuristic quantum computers (I'm talking far-future, many thousands or even millions of qubits) to solve just abo
        • by gomiam ( 587421 ) on Sunday March 23, 2008 @12:21PM (#22837264)
          Perhaps you would like to read again what NP-complete [wikipedia.org] means: being able to quickly check (read: in polynomial time) whether a solution is right or not by using a deterministic algorithm. Quantum computers are non-deterministic, and that's why they can be used to factor large integers. "Check all periods r such that a^r ≡ 1 (mod N) at the same time" certainly isn't deterministic.

          The darned things would be like oracles, just ask them any super hard question, like how to prove Fermat's Last Theorem, and they'd just spit out the answer. The things would be like talking directly to God. Is that even remotely possible? I don't think so. Factoring numbers is just not as hard as any NP complete problem.

          You might as well conclude that grass is purple, for all the sense that paragraph makes.
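          As a purely classical toy illustration of the period-finding reduction quoted above (a sketch only; N=15 and a=7 are illustrative choices, and the period search is brute force rather than quantum):

            from math import gcd

            # Factoring N reduces to finding the period r of f(x) = a^x mod N;
            # Shor's algorithm finds r quickly, here we simply brute-force it.
            N, a = 15, 7

            r = 1
            while pow(a, r, N) != 1:   # smallest r > 0 with a^r = 1 (mod N)
                r += 1

            # For even r with a^(r/2) != -1 (mod N), gcd yields nontrivial factors.
            half = pow(a, r // 2, N)
            print(r, gcd(half - 1, N), gcd(half + 1, N))   # -> 4 3 5, i.e. 15 = 3 * 5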

          • by cnettel ( 836611 )
            If you can check a single solution in polynomial time (i.e. the problem is NP-complete), and have a computing device performing the equivalent of testing all (the exponential number of) combinations at the same time, then you can solve an NP-complete problem. It is "just" a matter of encoding the process of checking an individual solution in a manner that is compatible with the quantum realization used.
            • Re: (Score:3, Informative)

              Actually, quantum computers don't work quite like that. The result of a quantum computer computation is the superposition of all the possible results. At the end, you have to collapse the wave function, and you wind up with a specific result, chosen randomly from all the possible results. To factor large integers, you have to rotate the qubits 90 degrees after doing the computation, and somehow that's similar to doing an FFT. If you run the computation many times, statistically you get results bunched into clu
              • Re: (Score:3, Interesting)

                I think the other posters don't seem to understand the difference between NP and NP-complete. NP-complete means that, in addition to the answer being verifiable in polynomial time, a polynomial-time solution to the problem would provide a polynomial-time solution to all other problems in NP. Factoring is in NP, but nobody has yet proven that it is NP-complete. Thus, traveling salesmen and knapsack packers won't be putting quantum computers on their wish lists...

                While solving NP-complete i
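                A tiny illustration of what "verifiable in polynomial time" buys you for factoring specifically (a sketch with made-up values; the witness is just the claimed factor pair):

                  def verify_factoring(N, p, q):
                      """Cheap polynomial-time check of a claimed factorization."""
                      return 1 < p < N and 1 < q < N and p * q == N

                  print(verify_factoring(3233, 53, 61))   # True: 3233 = 53 * 61
                  print(verify_factoring(3233, 51, 63))   # False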

          • by Fëanáro ( 130986 ) on Sunday March 23, 2008 @01:48PM (#22837784)
            I think you are mistaken. It has been a while, but I remember NP like this:

            What you described is the property "NP-hard".
            For a problem to qualify as NP-complete, it is also necessary that an algorithm that can solve this problem can also be used to solve every other NP-hard problem, with only an additional transformation of its input and output in polynomial time.

            Prime factoring is not NP-complete. There is, as far as I know, no transformation for the input and output of a prime-factoring algorithm that would allow it to solve other NP-hard problems as well.

            If prime factoring were NP-complete, then since a quantum algorithm is known for it, it would be certain that a quantum computer could also solve all other NP-hard problems.

            As far as I know, no polynomial-time quantum algorithm has been found for any NP-complete problem. So we do not know whether a quantum computer could do this.
            • Parent post is absolutely correct; grandparent is absolutely wrong.

              Read Scott Aaronson's blog [scottaaronson.com] to get a clue about quantum computing.

              Also read about Shor's algorithm, which is the known algorithm to factor large numbers in log(n) time *if your quantum computer has enough entangled qubits to represent the number*. Again, though, remember that FACTORING IS NOT NP COMPLETE, only NP hard. Other NP hard problems are harder than factoring (for example, any NP complete problem ;-).

              Also read about Grover's algori
              • Re: (Score:3, Informative)

                by sydneyfong ( 410107 )

                Again, though, remember that FACTORING IS NOT NP COMPLETE, only NP hard.

                Wrong (emphasis mine). Factoring is in NP but not (known to be) NP complete. Which means that factoring is not (known to be) NP-Hard. (The only NP-hard problems in NP are NP-Complete problems) A more colloquial way to explain it is that NP-hard problems are at least as hard as NP-complete problems, yet factoring is "easier" than NP-complete problems.

                Also read about Grover's algorithm, which is a general algorithm to solve NP complete problems, and which HAS BEEN PROVEN TO BE THE FASTEST way to solve the NP complete problem of lookup in an unordered dictionary. Grover's algorithm finds the answer in n^1/2. Obviously if the fastest algorithm to solve a specific NP complete problem is n^1/2, you cannot have a way to solve all NP complete problems in log(n).

                Do you even know what you are talking about? Grover's algorithm is not used to solve NP complete problems!! You've already said it -- it's a way to look up an
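                For scale, the square-root speedup being argued about here looks like this (rough query counts only; constants ignored):

                  # Unstructured search over 2^n items: classical vs Grover query counts
                  for n in (40, 80, 128):
                      print(f"n={n}: classical ~2^{n} lookups, Grover ~2^{n // 2} queries")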

                • and you'll see that the GP is speaking beyond his knowledge.


                  Can't we all just agree that it's reeeaaally complicated?
                • by wurp ( 51446 )

                  Wrong (emphasis mine). Factoring is in NP but not (known to be) NP complete. Which means that factoring is not (known to be) NP-Hard. (The only NP-hard problems in NP are NP-Complete problems) A more colloquial way to explain it is that NP-hard problems are at least as hard as NP-complete problems, yet factoring is "easier" than NP-complete problems.

                  You are absolutely right. I made two mistakes - first I said NP hard when I just meant "in NP". Second, I was just totally wrong - I thought factoring was kno

                  • I thought factoring was known not to be NP complete.

                    To be fair I sometimes make this mistake too :)

                    You may be right that unordered dictionary lookup is not an NP complete problem. I can't find any references for it either way, and certainly don't know a proof myself.

                    I think that "unordered dictionary lookup" simply refers to a typical linear search, like how you normally loop through an array to find a specific element. The run time complexity is O(n) which is linear (this is obvious), and hence the problem is in P. It is therefore NP-complete iff P = NP (which is speculated to be highly unlikely). Not a "proof that it isn't NPC" per se, but that's as close as you could get to a proof.

                    The way it can be "used" to solve NP

                    • by wurp ( 51446 )

                      The run time complexity is O(n) which is linear (this is obvious), and hence the problem is in P.

                      I believe algorithmic complexity is rated based on the size of the input, not the size of the search space (obviously, since search space is a concept specific to this problem).

                      If you add one bit to the input of a dictionary lookup, it implies that you would on average go through twice as many entries to find the key, so the complexity of unordered dictionary lookup is exponential on the size of the input.

                      I reco

                    • by wurp ( 51446 )
                      Thinking more about it, in the case of unordered dictionary lookup, it makes more sense to consider the dictionary to be an input, which brings you back to the O(n) time you listed. If you consider using Grover's algorithm to invert a function, it is ~O(2^n).
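                      The two framings contrasted here, sketched classically (illustrative code, not Grover itself):

                        # (a) Dictionary given explicitly as input of n entries: an O(n)
                        #     linear scan over the input, so the problem sits in P.
                        def lookup(entries, target):
                            for key, value in entries:
                                if key == target:
                                    return value
                            return None

                        # (b) Inverting a black-box function on an n-bit domain: the input
                        #     is just f and n, so brute force costs O(2^n) evaluations
                        #     (Grover would cut that to ~2^(n/2) queries).
                        def invert(f, n_bits, target):
                            for x in range(2 ** n_bits):
                                if f(x) == target:
                                    return x
                            return None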
                    • by wurp ( 51446 )

                      and n is 1000, Grover's algorithm reduces the complexity from 2^80

                      er, of course I meant "and n is 80"

                      I was revising the post to find numbers that demonstrated my point, and failed to update a reference...
          • I understand what NP-complete means... In this case it means we can use traditional computers to convert any NP-complete problem into a factoring problem in polynomial time, assuming we can prove that factoring is NP-complete (which it is almost certainly not). Then, since quantum computers can solve the factoring problem, we'd get the answer to any NP-complete problem we pose. There are problems harder than NP-complete, but many common ones, including large classes of automated theorem proving, are NP-co
            • by gomiam ( 587421 )
              I consider this a contradiction... the existence of such a computer violates my sense of reality. It is far easier to believe that factoring is indeed not quite as hard as NP-complete problems.

              Got a couple of spares? Your sense of reality seems to be a bit faulty. We have already seen those cut-offs before: they happened when someone decided a computer was a good enough tool to design chips, instead of using pencil and paper. Anyway, this is not a matter of something being "easier to believe": either factor

          • Perhaps you would? You've posted a link to Wikipedia as an appeal to authority, but you've misquoted the page and butchered your definition of NP-c. The only reason you were modded insightful is that the average slashdotter with modpoints doesn't understand the difference.

            It's not hard to follow; if I give you a candidate solution to a problem and you can check quickly if it is true or not (i.e. in poly-time) then the problem lies in NP. So all problems in P lie within NP. Some problems in NP are harder t
          • I am just curious - why can't we use the new quantum computing technology to make even better encryption algorithms that are in turn hard to break with quantum computers?
          • by colmore ( 56499 )
            The factorization of large positive integers is currently faster than exponential but slower than polynomial. Quantum computers can *non-probabilistically* factor large integers in polynomial time (something roughly n^3, where n = number of binary digits, if memory serves me). It is generally accepted, though not proven, that integer factorization is in neither P nor NP-complete. Which isn't weird; nobody claims that every problem out there must be in P or NP-complete. There are algorithms out there for most n
    • Re: (Score:1, Informative)

      by Anonymous Coward
      I was kind of disappointed when I read the article, specifically the part where he claims that we need 5 trillion gates to implement Shor's algorithm. I'm quite surprised that Mordaxus cannot differentiate between a program and the hardware that runs a program.

      The name "quantum gate" is quite misleading, and that may be the reason why Mordaxus misunderstands. It actually means quantum operation. That Shor's algorithm requires 72k^3 quantum gates means it requires 72k^3 quantum operations. It does not mean we need to build a q
    • by letsief ( 1053922 ) on Sunday March 23, 2008 @10:32AM (#22836614)
      Bruce didn't actually write that article. He only linked to it on his blog, which isn't particularly relevant. And, although Bruce is a brilliant cryptographer, he doesn't know squat about quantum computers, nor does the person that wrote that article. One of the most glaring errors is corrected in a comment posted on the article page. Besides that, his argument isn't completely sound. The biggest problem with quantum computers isn't managing to build one with tons of quantum gates, it's getting the error rate down on the components. If you do that, you ought to be able to build as many gates as you want with enough effort and money. The author's argument seems akin to saying we couldn't possibly build a 100-billion-transistor processor today. We could; it's just going to be very expensive and you're not going to mass-produce it.

      Right now a lot of people working in the field say quantum computers are about 40 years off. The scary thing though is how it's likely to play out. For a few decades quantum computers will likely remain "40 years off" (in the fusion sense), but then someone is going to figure out how to get the error rates below threshold, and then quantum computers will be only 10 years away. That doesn't give us much time to stop using our favorite public key algorithms. That's too bad for nTru (they have a public key system that is likely resistant to quantum computers); their patents will be long expired.

      • Re: (Score:3, Informative)

        Please mod parent up. He makes two excellent points: 1st that Schneier did not write the essay and basically has nothing to do with it. And 2nd, that the essay is completely wrong, as pointed out by the 1st comment replying on the essay page. Shor's algorithm will not take 72*k^3 qubits or gates, it takes about k qubits and then goes through O(k^3) steps to get the answer. Everything about the posting is wrong.
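        Taking those figures at face value (about k qubits and ~72*k^3 operations; the constant is the one quoted in the thread and purely illustrative):

          # Back-of-envelope resource estimates for common RSA modulus sizes
          for k in (1024, 2048, 4096):
              qubits = k                  # order of magnitude, ignoring error correction
              ops = 72 * k ** 3
              print(f"{k}-bit key: ~{qubits} logical qubits, ~{ops:.1e} operations")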

      • [of which I know nothing]

        It all depends on how long you need the stuff secret.

        If all public key crypto except for nTru's is bust in possibly 25 years, and your stuff needs to be secret for 50, then you better be using quantum-resistant (did I just invent that term?) crypto Right Now(tm).

    • by wwphx ( 225607 )
      It should be noted that this is not Schneier's writing; it is just someone else's blog post that he linked from his own blog. He may find it interesting and noteworthy, but it is not his material.
    • Out of curiosity, what do you think is wrong with Schneier's work in security consulting?
    • by 0ptix ( 649734 )
      Actually that's not completely true. In the security world he might be highly regarded, but in the crypto world things are... different. Look at a list of his publications [uni-trier.de]. Almost all are at security conferences. For that matter, since 2005 he has published one paper. He has only ever published 2 papers at Crypto, 1 at Eurocrypt and none at TCC. (These being the most selective conferences in the crypto community.) Nor does he have any papers at PKC, CHES, RSA, SCC, or Asiacrypt, and only one paper at Financial C
    • by dsmall ( 933970 )
      Events happen at strange speeds.

      That's the difficulty with predicting them.

      I appreciate Bruce's attempt to predict when the X-Box's encryption (4096-bit) will be cracked. But a looksee at history shows that events happen in a funny, nearly fractal manner. Perhaps the series "Connections" would be a better description of how tech has spun up.

      Here's a "predict the future" example:

      The late 1930s brought research into the incredibly odd element uranium ... and the even odder result it gave when bombarded by sl
  • by pedantic bore ( 740196 ) on Sunday March 23, 2008 @09:42AM (#22836362)

    ... more like guaranteed employment for security experts everywhere!

    The day PKIs that use factoring or discrete logs become easy to crack is the day when there's going to be a tremendous amount of money spent on stop-gap security measures until someone figures out something new...

    • Comment removed based on user account deletion
      • by owlstead ( 636356 ) on Sunday March 23, 2008 @10:16AM (#22836530)

        The day PKIs that use factoring or discrete logs become easy to crack is the day when there's going to be a tremendous amount of money spent on stop-gap security measures until someone figures out something new...

        I imagine one-time pads will come back in style.

        One-time pads are replacements for symmetric encryption (both sides use the same key), not asymmetric encryption. You cannot authenticate a server to multiple clients using one-time pads, for instance: everybody would have the one-time pad, so everybody could pose as the server. Anyway, there *are* asymmetric algorithms that should be safe against cryptanalysis using quantum computing. There is no need to go distributing Blu-Ray disks filled with random-valued bits (one disk per application and user) just yet.
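        A minimal sketch of why a one-time pad is inherently symmetric (illustrative only; both parties must hold the identical pad, so neither side is distinguishable as "the server"):

          import secrets

          message = b"attack at dawn"
          pad = secrets.token_bytes(len(message))            # the shared secret
          ciphertext = bytes(m ^ p for m, p in zip(message, pad))
          recovered = bytes(c ^ p for c, p in zip(ciphertext, pad))
          assert recovered == message                        # the same pad decrypts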

        • Re: (Score:3, Informative)

          by letsief ( 1053922 )
          There's certainly no reason to go back to one-time pads. Basically all of the symmetric encryption algorithms are (mostly) quantum resistant. But, you do get a square root speed-up for attacking symmetric systems by using Grover's algorithm on a quantum computer. So, if you want to make sure you're still safe, you have to double your key length. That's not so bad, and certainly much better than using one-time pads. And, as you said, there are asymmetric algorithms that should be resistant to quantum co
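          The key-length arithmetic behind that advice, as a rough sketch (Grover halves the effective exponent; constants omitted):

            for bits in (128, 256):
                print(f"{bits}-bit key: classical ~2^{bits}, Grover ~2^{bits // 2} work")
            # Doubling to 256 bits keeps ~2^128 quantum work, matching the
            # classical security margin of a 128-bit key today.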
        • I am crypto-tech naive. What's the problem with using one-time pads and having every "relationship" share a pad? (ie: When I open an account with my bank they give me a key-fob with my one time pad on it. When I log into the bank website it uses the pad to authenticate me.) Of course, setting up a relationship will be a bit more expensive...
          • The problem is all the legacy data drifting around out there on old machines that use crypto systems that can be trivially broken. If you can find a dump of old hard drives that weren't very carefully wiped because "all the data was encrypted," you might have a gold mine.
          • Re: (Score:3, Insightful)

            by owlstead ( 636356 )
            "I am crypto-tech naive. What's the problem with using one-time pads and having every "relationship" share a pad?"

            Well, just think about all the SSL-enabled sites out there, and remember you will now have an N * N (client * server) number of relations that need to set up a symmetric key (which is what a one-time pad is, basically). Also note that you don't have a certificate infrastructure, so you cannot just go to VeriSign or any other trusted third party and buy a certificate from there. You cannot download
            • Re: (Score:3, Informative)

              by blueg3 ( 192743 )
              No, this is why you don't use a secret-key system like 3DES or AES. Essentially, the number of keys you need to distribute scales with the number of pairs of communicating parties. If you have N parties all wanting to communicate with each other, that's O(N^2) keys. This is fairly unlikely, though. If you have, say, S servers (banks, etc.) and C clients, it's more like O(S*C) -- though a client-server pair may use separate keys for different tasks.

              One-time pads cannot be reused. That's why they're called one-time
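              The scaling in question, as a quick illustrative calculation (the population sizes are made up):

                def pairwise_keys(n):          # everyone-to-everyone: O(N^2)
                    return n * (n - 1) // 2

                def client_server_keys(s, c):  # S servers x C clients: O(S*C)
                    return s * c

                print(pairwise_keys(1_000_000))               # ~5.0e11 keys
                print(client_server_keys(10_000, 1_000_000))  # 1.0e10 keys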
          • Re: (Score:2, Informative)

            by TheRaven64 ( 641858 )
            One time pads are basically useless in this scenario. In order for a OTP to be useful for data transfer, you need to exchange a pad of the same size as the message via some secure mechanism. If you can exchange the key securely, then you may as well exchange the data using that mechanism instead. The only time it is useful is for things like military or diplomatic use where the sender and receiver are in the same physical location for a while (e.g. an ambassador before he goes to the embassy) and can sec
            • by pyite ( 140350 )
              I have no idea why the parent to this post is moderated as troll. It's completely on topic and merely points out the harsh reality of trying to use a one time pad in real life.

          • How are you going to get the pads to the other person? Encrypt them?

            If you have a way of transmitting the pads securely then you could just use the same system to transmit the messages - no encryption needed!

            • If you have a way of transmitting the pads securely then you could just use the same system to transmit the messages - no encryption needed!

              Well, you could, but then you might have to wait!

              You can securely exchange OTP when you're physically in the same room with someone, and then use it when you're not in the same room. This seems like a very reasonable thing to do with people that you know and regularly meet in Real Life (e.g. your wife).

              That case obviously doesn't apply with most of the people you

          • Re: (Score:3, Informative)

            by mrmeval ( 662166 )
            Every bit of traffic needs a bit from your one-time pad. That limits how long the pad lasts. Yes, there are storage solutions that hold gigabytes, but if you're using the pad for a lot of data it's used up fast. An OTP has to be one-time use or you might as well use billboards.

            Both parties need the same pad. You need to be able to ship that pad to them or hand it to them and be sure no one snooped or snoops the OTP. If the pad is compromised how do you inform the other party it has been tainted? Unless you go tal
        • Do you have an example of asymmetric encryption that is secure against quantum algorithms? PK is broken by Shor's algorithm (if you have a big enough quantum computer), and per Scott Aaronson [scottaaronson.com] (an algorithmic complexity professor at MIT), elliptic-curve encryption has also been demonstrated to be vulnerable. He appeared to imply in that same discussion that all asymmetric algorithms were vulnerable...
          • by Zaxor ( 603485 )
            Asymmetric cryptosystems have been developed based on lattice problems, a class of problems for which there is as of yet no known quantum algorithm. I'm not up on the relative merits, but there ARE still public key systems that are resistant to quantum computing, for now.
      • Re: (Score:3, Informative)

        Public key crypto solves the main key distribution problem of symmetric crypto. One time pads have the worst key distribution issues of all crypto! So, no, one time pads won't be making any kind of come back due to this.
    • Re: (Score:1, Informative)

      by Anonymous Coward
      If it gets easy to break because of quantum computing, then no problem! There already exists a quantum version of public key encryption. What this means is that in ~30 years, every computer will need, at the very least, a quantum co-processor. No need to panic.
      http://www.iacr.org/archive/crypto2000/18800147/18800147.pdf [iacr.org]
  • Quantum computers fail at NP-hard problems. But no one has made a cryptosystem for which breaking is NP-hard. So the eventual transition is going to be a bit tricky.
    • Re: (Score:3, Interesting)

      by snarkh ( 118018 )

      As far as I know, it is not known whether quantum computers can solve NP-hard problems in polynomial time. To say that they fail at NP-hard problems may be premature.
      • by russotto ( 537200 ) on Sunday March 23, 2008 @11:02AM (#22836786) Journal

        As far as I know, it is not known whether quantum computers can solve NP-hard problems in polynomial time. To say that they fail at NP-hard problems may be premature.
        Seeing as it hasn't even been proven that P != NP for ordinary computers, it's very premature.
        • by snarkh ( 118018 )

          Well, it is conceivable that NP is solvable using quantum computers in poly time, while pretty much everyone believes that P != NP for ordinary computers.
        • Wrong. The definition of NP is the class of problems that a nondeterministic Turing machine can solve in polynomial time. P = NP if and only if a deterministic Turing machine can also solve those problems in polynomial time. A quantum computer is NOT a deterministic Turing machine, so P does not have to equal NP for quantum computers to solve NP problems fast.
          • by 26199 ( 577806 ) *

            If quantum computers can't solve problems in NP quickly, then presumably it follows that normal computers can't either.

            This would prove P != NP, which hasn't been done. Ergo, it can't have been proven that quantum computers can't solve problems in NP quickly.

      • Re: (Score:3, Interesting)

        by Watson Ladd ( 955755 )
        I misspoke. No current algorithms are known for solving NP problems fast with a quantum machine, and it is suspected none exist. But no proof of the converse exists. However, using a nonlinear operator permits NP-complete problems to be solved in polynomial time with small error. So it's more pessimistic than I thought.
        • by snarkh ( 118018 )

          I am not sure what you mean by a non-linear operator. Certainly there are numerous ways of approximating various NP problems with different degrees of accuracy.
    • nTru has signing and encryption algorithms based on the shortest vector problem, which I believe is NP-hard. I don't know if they have a reductionist proof, or if it's just based on SVP like RSA is based on factoring. But they're probably the way to go if someone were to develop a working quantum computer tomorrow.
  • the fear (Score:2, Insightful)

    by Deanalator ( 806515 )
    I think the fear is that by the time we get around to having decent PKI for stuff like credit cards etc., quantum computing will bust everything wide open. PKI is the only practical method of identity management these days, and while algorithms in the PKI are being tweaked, they are all pretty much based on the same principles, which quantum computing is a real threat to.
  • "polynomial time" (Score:4, Informative)

    by l2718 ( 514756 ) on Sunday March 23, 2008 @10:22AM (#22836566)
    This calculation illustrates a good point about the difference between asymptotic analysis of algorithms and real-world implementation of the same algorithms. Computer science defines "efficient" as "bounded polynomially in terms of the input size". In practice, even if the polynomial has a small degree (like a cubic), the resource requirements can already be very large. Theory and practice are only the same in theory.
    • Re: (Score:1, Informative)

      by Anonymous Coward
      So, you have two algorithms to perform a task. One is n^3, the other is 2^n; the first takes twice the space of the second. If the input size is n > 11, which do you probably want?
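      Working that comparison out (illustrative; constants and the space factor ignored):

        for n in (8, 10, 12, 20, 40):
            print(f"n={n:2d}: n^3 = {n**3:,}   2^n = {2**n:,}")
        # Below n ~ 10 the exponential algorithm is actually cheaper; past
        # that it loses badly, e.g. 40^3 = 64,000 vs 2^40 ~ 1.1e12.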
  • by SIGFPE ( 97527 ) on Sunday March 23, 2008 @10:26AM (#22836586) Homepage
    ...the rate of increase of power of quantum computers isn't faster than Moore's law. I've written more on this here [blogspot.com].
    • Re: (Score:2, Interesting)

      by letsief ( 1053922 )
      It's way too early to make predictions based on trends. Quantum computing is in its infancy. We haven't even built anything that could/should really be called a working quantum computer (yes, I know we factored 15). We're going to see revolutionary changes to the field, not just evolutionary. So, every once in a while we're going to see great leaps forward, followed by a period where people just improve upon that idea. It's going to take a lot of revolutionary ideas to get a practical quantum computer,
      • Re: (Score:3, Informative)

        by ScrewMaster ( 602015 )
        I tend to agree. When promising new technologies take too long to develop, by the time they become practical they are often supplanted by something else. We will have both, since our civilization's need for both more energy and more processing power is going to continue unabated for a long time.

        We may never have either nuclear fusion or quantum computing, as currently envisioned. As you say, it's impossible to predict. All we can say with some assurance is that we'll probably figure out something that will
      • by SIGFPE ( 97527 )
        Oh no, I wasn't trying to extrapolate based on a trend. I was trying to predict the trend before it even happens using basic knowledge of quantum computing.
  • Well, even if it were true that quantum computing algorithms can break the existing encryption algorithms, I think the fear is overblown for two reasons: (1) Quantum computing is not yet mainstream, and I doubt that mischief mongers (aka thieves who wish to break into financial systems) have the wherewithal today to work with quantum computing algorithms, and (2) by the time QC moves anywhere near mainstream, I have little doubt that encryption methods will have handsomely moved along too.
    • (1) Quantum computing is not yet mainstream, and I doubt that mischief mongers (aka thieves who wish to break into financial systems) have the wherewithal today to work with quantum computing algorithms
      Unfortunately, as Aesop put it, "We hang the petty thieves and appoint the great ones to public office." And the highest people on that totem pole have access to the resources of organizations like the NSA, who are on the bleeding edge if not already slightly ahead of it.
  • Can somebody with physics background help me? For quantum computers to work you need entanglement and Heisenberg principle.

    1. Entanglement. Is this a fact or a theory? Looking on the web I found only a few experiments, with some possible loopholes. I found the principle hard to grok.

    2. Heisenberg principle. It mainly states that by observing an object you change the state of the object. The Heisenberg example from Wikipedia uses a photon to measure the position of an electron, and the photon is changi
    • by blueg3 ( 192743 )
      The Heisenberg uncertainty principle isn't really what you mean. Heisenberg's principle states that if two variables (such as position and momentum) have a particular relationship, then you cannot simultaneously measure both of them to arbitrary accuracy. (It places specific limits on the accuracy to which you can measure them.)

      To make a long story short, though, both quantum entanglement and the collapse of wavefunctions due to measurement are experimentally-confirmed fact, and small quantum computers have been built.
    • by slew ( 2918 ) on Sunday March 23, 2008 @01:28PM (#22837628)
      I'm afraid you'll have to look those physics books back up.

      Although QM computers do use basic entanglement for creating superpositions, understanding Shor's algorithm (the one everyone is concerned about, since it factors in polynomial time) is mostly just understanding QM superposition. Entanglement gives generic QM computers great parallel processing power by governing how QM probability waves combine under superposition, but Heisenberg limits the computing power of a QM computer in a non-trivial way as well, because after you collapse the wave functions by measurement you give up the parallel processing enabled by entanglement (e.g., if you peek inside the oven, it stops working; if some of the heat leaks out of the oven even with the door closed, it doesn't work as efficiently -- the oven being the QM computer).

      FWIW, Shor's algorithm essentially converts factoring into a period-finding exercise on a sequence. You might imagine that that's something easy to do if you had a machine which, given a bunch of superimposed waves with a certain modulo structure, could tell you the period (hint: the components that don't match a specific period self-interfere and measure as near zero, while the ones with that period self-reinforce). With a QM computer you do this all in parallel with superimposed probability waves, and when you measure it, the highest-probability result is the one that doesn't self-interfere (the ones that self-interfere have probability near zero). Basically this measurement is wave function collapse, which doesn't actually depend on entanglement or Heisenberg to understand (although it does require you to believe in QM wave functions and measurement operators).
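      A classical caricature of that period-finding step, using an ordinary FFT in place of the quantum Fourier transform (a sketch only; N=15, a=7 and the window size M are illustrative):

        import numpy as np

        N, a, M = 15, 7, 64
        f = [pow(a, x, N) for x in range(M)]      # 1, 7, 4, 13, 1, 7, 4, 13, ...
        indicator = np.array([v == f[0] for v in f], dtype=float)
        spectrum = np.abs(np.fft.fft(indicator))
        peaks = np.nonzero(spectrum > spectrum.max() / 2)[0]
        print(peaks)      # [ 0 16 32 48]: spacing M/r = 16, so the period r = 4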

      Entanglement is really a strange artifact of QM that explains probability correlations that you see in QM experiments that can't be explained classically. It's really more an artifact of the existence of probability amplitude waves (the QM wave function) than an effect that directly enables the QM computer. Of course, if you didn't have QM wave functions you wouldn't have a QM computer, so I guess that's a chicken-and-egg scenario. Entanglement is like the "carburetor" function of the QM computer. The QM computer uses superposition of QM wave functions to work, and when you have more than one QM wave function, they get entangled when you start superimposing wave functions, and the way the waves entangle helps you compute in parallel, so it's important to understand how these waves entangle.

      Heisenberg's principle is a consequence of wave function collapse (measurement), which also limits the QM computer (this limiting effect is often called QM de-coherence). Heisenberg isn't required by a QM computer while it's computing, but you need to see the result somehow, and when you measure the result, one of the side effects is the Heisenberg principle (although that's also a chicken-and-egg problem, since HP is a consequence of QM wave function collapse and w/o QM there's no superposition computing). The closest analogy I can think of is that Heisenberg's principle is the "heat" caused by friction in the QM computer. You need friction to stop the computer and read out the result, but at the same time you can't avoid a little friction while it's running either (causing de-coherence). The side effect of this friction is heat.

      You may have a personal opinion that superposition is a "nice way of doing statistics using discrete values for covering the not so discrete results of experiments", but there is experimental evidence that your personal opinion is at odds with physical reality. As QM computers that do QM computing (including IBM's NMR experiment, which implemented Shor's algorithm) have already been built, it's hard to refute that something non-classical is going on.

      It may be that in the end, QM is total malarkey and there's some other weird unexpected thing going on, but there are mountains of evidence that whatever is going on, it isn't as simple as "hidden variables"
      • Thanks a lot for your response and all the other responses also. I will try to look more into books and entanglement experiments.
        My main problem remains the "continuous vs discrete" issue, as you mention. I still believe that the discrete values are just "good values" in a wave equation like the Schrödinger equation, and not really discrete. The main problem as I see it is that the Schrödinger equation is practically unsolvable using current mathematics, and finding a continuous equation is like findi
    • by Phroon ( 820247 ) on Sunday March 23, 2008 @01:37PM (#22837690) Homepage

      1. Entanglement. Is this a fact or a theory? Looking on the web I found only a few experiments, with some possible loopholes. I found the principle hard to grok.

      2. Heisenberg principle. It mainly states that by observing an object you change the state of the object. The Heisenberg example from Wikipedia uses a photon to measure the position of an electron, and the photon is changing the position of the electron. What happens if you use a smaller particle that does not impact the electron so much? Are you going to change the constant? It looks mostly like a limitation from a time when the atom was considered to be indivisible.
      The Wikipedia entries are written from the perspective of a physicist, so they aren't going to be much use to laypeople.

      1. Entanglement: It is fact. If you send a photon through a certain type of non-linear crystal, two entangled photons will emerge. Truly understanding this requires some knowledge of quantum mechanics; a basic introduction to QM and entanglement can be found here [ipod.org.uk] and here [davidjarvis.ca] if you care to learn more.

      2. Heisenberg principle: You inadvertently stumbled onto the problem yourself, kinda. When trying to measure the position of the electron, you use a high-energy photon. When this high-energy photon interacts with the electron it alters the electron's velocity, so you know less about the velocity. When trying to measure the velocity of the electron, you use a low-energy photon. This low-energy photon measures the velocity well, but it moves the electron a little bit, so you don't know its position. This issue is the essence of the Heisenberg uncertainty principle.
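      Quantitatively, the trade-off is the standard textbook inequality (not specific to any particular probe particle):

        Δx · Δp ≥ ħ/2 ≈ 5.3 × 10⁻³⁵ J·s

      so squeezing the position uncertainty Δx forces the momentum uncertainty Δp up, no matter how gently you probe.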
      • Re: (Score:3, Informative)

        by da cog ( 531643 )

        2. Heisenberg principle: You inadvertently stumbled onto the problem yourself, kinda. When trying to measure the position of the electron, you use a high-energy photon. When this high-energy photon interacts with the electron it alters the electron's velocity, so you know less about the velocity. When trying to measure the velocity of the electron, you use a low-energy photon. This low-energy photon measures the velocity well, but it moves the electron a little bit, so y

    • 1. From what I've heard, fact. They have very rudimentary quantum computers working. We're talking 4 qubits or so, nothing fancy.

      2. The problem with talking about quantum physics is that you deal with principles quite unlike real life. Every particle is a wave (see de Broglie). It can be represented by a wavefunction, the square of which is the probability of detecting it at any given point. Different energies represent different waves, but not every wavefunction is possible. Hence, only certain energies ca
    • 1. Entanglement is a fact. 2. Heisenberg's uncertainty principle is intrinsic and has nothing to do with the kinds of particles you use. Both of these things are very non-intuitive as presented by standard QM, to the point where you can't really "understand" them but just have to accept them as facts about the world that experiments confirm.

      You may be interested in Bohm's interpretation of quantum mechanics, which is much more intuitive because it deals with realistic particles travelling in a "quantum

  • Shor's Algorithm (Score:4, Interesting)

    by debrain ( 29228 ) on Sunday March 23, 2008 @11:35AM (#22836972) Journal
    Presumably the article is alluding to Shor's Algorithm [wikipedia.org], a method of factoring integers that uses quantum computation to yield a worst-case complexity significantly better than that of any existing deterministic method.

    If that's the case, it's probably worthwhile to discuss Pollard's Rho algorithm [wikipedia.org], which has a poorly understood worst-case complexity (as a Monte Carlo method), but has a potential average-case complexity that is comparable to the quantum algorithm's.
    • Re: (Score:1, Informative)

      by Anonymous Coward
      Since when is O(N^(1/4) polylog(N)) comparable to O((log N)^3)?

      Besides, if you believe that some method is faster in practice than what the theory says, there are some RSA challenges out there with big money prizes for you to win.
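      Putting rough numbers on that gap for a modern modulus (a sketch; constants and polylog factors dropped):

        bits = 2048                     # log2(N) for a typical RSA modulus
        rho_steps = bits / 4            # Pollard's rho: ~N^(1/4) = 2^(bits/4)
        shor_ops = bits ** 3            # Shor: ~(log N)^3 operations
        print(f"rho ~2^{int(rho_steps)} steps, Shor ~{shor_ops:.1e} (~2^33) operations")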
      • Re: (Score:2, Troll)

        by profplump ( 309017 )
        A) To the best of my knowledge there is no theory that places even moderately strict bounds on the Rho method. That was the point of the original post. B) RSA has discontinued their factoring challenges.
  • Wow, this is just like my comment in the last quantum computing FUD article. You've still got bus speeds, cache speeds, memory speeds and capacity, and even hard drive speeds. You're not gonna crack some encryption that would take a normal computer 100 billion years, but you might be able to render and encode Finding Nemo a lot faster.
  • What I get out of the article is that he seems to be saying that quantum computing technology that could effectively factor numbers of the size we use for effective public key cryptography is far enough off that it isn't going to be a problem anytime soon. But isn't that reasoning just postponing the problem, rather than actually dealing with it?
  • It appears from the first comment on the post that this post isn't really accurate. Shor's factoring algorithm is O(k) in number of qubits and O(k^3) in number of operations. This doesn't mean that the number of gates in the quantum computer is O(k^3); it means that the time it takes to execute the algorithm is O(k^3). It appears this discrepancy may be a result of not agreeing on terminology. I haven't checked this out thoroughly, but glancing at my copy of Mermin's "Quantum Computer Scienc
  • Botnets hashing MD5 over the 8-12 character range for rainbow tables are more of an imminent threat... between the estimated 4-10 million machines you could probably crack any unsalted hashes within a month.
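    Rough feasibility arithmetic for that scenario (every figure here is an assumption: 4 million bots at ~10^8 MD5 hashes/sec each):

      machines, rate = 4_000_000, 10 ** 8
      for length, charset in ((8, 95), (10, 62), (12, 36)):
          days = charset ** length / (machines * rate) / 86_400
          print(f"{length} chars over a {charset}-symbol set: ~{days:.3f} days")
      # All well under a month; a full 95-symbol set at 12 characters,
      # though, would blow out to decades.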
  • If we can use quantum computers to decrypt..
    Why can't we use them to come up with an equally mind-blowing way of encrypting?

    I don't mean single photon secure fibre channel stuff. That seems fairly impractical to deploy to the whole internet.

    I mean, why can't some mathematical genius come up with a new encryption algorithm that you can only implement on a quantum computer, and which produces a ciphertext so random that it can't be decrypted even by another quantum computer unless it knows the secret.

    Does anyo
  • by Muhammar ( 659468 ) on Sunday March 23, 2008 @12:29PM (#22837308)
    Since the Sci Am article is subscription-only, you may want to check the complete draft version of Scott Aaronson's article for Scientific American (before editorial changes) here:

    http://www.scottaaronson.com/writings/limitsqc-draft.pdf [scottaaronson.com]

    Scott posted it on his blog on 2/18, see http://www.scottaaronson.com/blog/ [scottaaronson.com]

    (The blog is often quite technical, as you can expect, but funny and worth following just for its non-technical bits. Circumcision and Australian models are also discussed on a frequent basis.)
  • Heavier-than-air flight is impossible; a train will never go faster than a person can run or the passengers will asphyxiate; there is no reason why anyone would want a computer in the home; etc. [wilk4.com]

    It's a bad idea to discount future technological advances wholesale.
  • In what ways is this a problem? I'm not knowledgeable about cryptography. My only knowledge and understanding (given the complex math involved) comes from Simon Singh's Code Book.

    I would consider that if quantum computers existed, they would pose a serious threat to security and military applications, as your enemy would always be listening. I don't know if e-commerce would grind to a halt, since governments would initially be the only ones able to afford them. I would think that instead of hitting e-commerce the better thing to
  • by damburger ( 981828 ) on Sunday March 23, 2008 @01:21PM (#22837584)

    He makes an extremely cogent argument, but it is hampered by the lack of information we have about the state of the art in quantum computers.

    Domestic spying is massively popular with western governments right now, and if you think that the NSA and GCHQ aren't doing secret research into quantum computers you are out of your mind. Furthermore, it is a commandment of signals intelligence that you do not let the enemy know you have broken his code - and in this case the enemy is us. We have no idea how far along they are. We have no idea what the generational length is for the quantum computers that are certainly being developed in secret.

    Basically, this essay could be published and make just as much sense either before or after a critical breakthrough had been made by one of the aforementioned agencies and they hadn't told anyone. Thus, we have no way of knowing if we are already past that point or not.

    Given that it has already been shown that quantum computers are not infallible, would it not make sense now to start working on encryption methods designed to flummox them?

    • The way I see it, encryption was never really effective in the first place. It was always easy to attack the source at which decryption occurs. You could combine multiple encryption algorithms MANY MANY times and force the cracker to guess the pattern of decryption used. A complex pattern of encryption algorithms would form a hidden key, and its obscurity would make it increasingly difficult to decrypt. Again, this wouldn't protect against attacking the source of decryption.
    • FWIW, I've heard the rule of thumb that the government (the US... I assume European governments are roughly in the same ballpark where economy allows) is about 5 years ahead of the private sector when it comes to any given technology.

      That is, so far as research goes. I think implementation lags the private sector by several years due to bureaucracy.
      • Yeah, but who do you think made up the rule of thumb ;)

        Seriously though, you can't really make that kind of prediction with such a new field of technology. It would be like trying to guess how far along the Manhattan project was as a civilian in 1942.

        Whilst interesting, this guy's piece is the crypto equivalent of the Drake equation: sound maths, but it doesn't tell us anything because we have no way of knowing any of the variables.
