Encryption Security Supercomputing

Quantum Computing Not an Imminent Threat To Public Encryption

Posted by Soulskill
from the in-search-of-schrodinger's-catastrophe dept.
Bruce Schneier's latest blog entry points out an interesting analysis of how quantum computing will affect public encryption. The author takes a look at some of the mathematics involved with using a quantum computer to run a factoring algorithm, and makes some reasonable assumptions about the technological constraints faced by the developers of the technology. He concludes that while quantum computing could be a threat to modern encryption, it is not the dire emergency some researchers suggest.
This discussion has been archived. No new comments can be posted.


  • by CRCulver (715279) <crculver@christopherculver.com> on Sunday March 23, 2008 @10:42AM (#22836360) Homepage
    While I'm critical of Schneier's general security consulting work, he is still a god in the cryptography world. His book Practical Cryptography [amazon.com] is a friendly guide to encryption that doesn't assume too much knowledge of the heady math involved. Plus, the man invented Blowfish, one of the fastest and most popular algorithms around.
  • by CRCulver (715279) <crculver@christopherculver.com> on Sunday March 23, 2008 @10:45AM (#22836376) Homepage
    Uff, I meant Applied Cryptography [amazon.com]. Practical Cryptography is a more basic overview, written with a co-author.
  • "polynomial time" (Score:4, Informative)

    by l2718 (514756) on Sunday March 23, 2008 @11:22AM (#22836566)
    This calculation illustrates a good point about the difference between asymptotic analysis of algorithms and real-world implementation of the same algorithms. Computer science defines "efficient" as "bounded polynomially in terms of the input size". In practice, even a polynomial of small degree (a cubic, say) can mean the resource requirements are very large. Theory and practice are only the same in theory.
  • by SIGFPE (97527) on Sunday March 23, 2008 @11:26AM (#22836586) Homepage
    ...the rate of increase of power of quantum computers isn't faster than Moore's law. I've written more on this here [blogspot.com].
  • by Anonymous Coward on Sunday March 23, 2008 @11:32AM (#22836610)
    I was kind of disappointed when I read the article, specifically the part where he claims we need 5 trillion gates to implement Shor's algorithm. I'm quite surprised that Mordaxus cannot differentiate between a program and the hardware that runs a program.

    The name "quantum gate" is quite misleading, and that may be why Mordaxus misunderstands. It actually means a quantum operation. Saying Shor's algorithm requires 72k^3 quantum gates means it requires 72k^3 quantum operations; it does not mean we need to build a quantum computer with 72k^3 gates. That would be like building one CPU, one piece of hardware, for each program you want to run.

    The real technical challenge is building a quantum computer big enough to hold k qubits, together with a technique for applying "quantum gates" (that is, executing quantum operations). So the quantum CPU does not need 5 trillion qubits; it just has to be able to apply quantum operations to k qubits. The important point is that Shor's algorithm is O(k^3), a polynomial-time algorithm, which is fast!
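
    To make the program-versus-hardware point concrete, here is a minimal classical sketch (in Python, for illustration only) of the reduction Shor's algorithm performs: factoring an odd composite N becomes finding the period r of f(x) = a^x mod N. The brute-force find_period below stands in for the quantum subroutine; classically it takes exponential time, which is exactly the part the O(k^3) quantum circuit speeds up.

        from math import gcd
        from random import randrange

        def find_period(a, n):
            # Smallest r > 0 with a^r = 1 (mod n). Classical brute force;
            # on a quantum computer this is the O(k^3)-operation part,
            # run on O(k) qubits.
            r, x = 1, a % n
            while x != 1:
                x = (x * a) % n
                r += 1
            return r

        def shor_classical_sketch(n):
            # Classical scaffolding of Shor's algorithm. Assumes n is an
            # odd composite that is not a prime power.
            while True:
                a = randrange(2, n)
                d = gcd(a, n)
                if d > 1:          # lucky guess: a already shares a factor
                    return d, n // d
                r = find_period(a, n)
                if r % 2:          # need an even period; try another a
                    continue
                y = pow(a, r // 2, n)
                if y == n - 1:     # a^(r/2) = -1 (mod n) is a dud; retry
                    continue
                return gcd(y - 1, n), gcd(y + 1, n)

        # Example: for n = 15 and a = 7 the period is 4, and
        # gcd(7^2 - 1, 15) = 3, gcd(7^2 + 1, 15) = 5.
        print(shor_classical_sketch(15))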

  • by Anonymous Coward on Sunday March 23, 2008 @11:45AM (#22836682)
    If public-key encryption gets easy to break because of quantum computers, then no problem! There already exists a quantum version of public-key encryption. What this means is that in ~30 years, every computer will need, at the very least, a quantum co-processor. No need to panic.
    http://www.iacr.org/archive/crypto2000/18800147/18800147.pdf [iacr.org]
  • by mattpalmer1086 (707360) on Sunday March 23, 2008 @11:48AM (#22836696)
    Public-key crypto solves the main key distribution problem of symmetric crypto. One-time pads have the worst key distribution issues of all crypto! So, no, one-time pads won't be making any kind of comeback because of this.
  • by letsief (1053922) on Sunday March 23, 2008 @11:55AM (#22836740)
    There's certainly no reason to go back to one-time pads. Basically all of the symmetric encryption algorithms are (mostly) quantum-resistant. You do get a square-root speed-up when attacking symmetric systems by running Grover's algorithm on a quantum computer, so if you want to make sure you're still safe, you have to double your key length. That's not so bad, and certainly much better than using one-time pads. And, as you said, there are asymmetric algorithms that should be resistant to quantum computers. McEliece is an early public-key encryption algorithm (with sort of ridiculous key lengths) which is probably safe, although you can't do signatures with it in a reasonable way. Then there's NTRU's work, which is probably what we'd use if someone figured out how to build a quantum computer tomorrow. They have encryption and signing algorithms that are reasonably fast.
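
    To put the square-root speed-up in numbers, a quick sketch of the arithmetic (not of Grover's algorithm itself):

        # Grover's algorithm searches a space of 2^n keys in ~2^(n/2)
        # iterations, so an n-bit symmetric key offers only ~n/2 bits of
        # security against a quantum attacker.
        for n in (128, 192, 256):
            print(f"{n}-bit key: ~2^{n // 2} Grover iterations to break")
        # Keeping a 128-bit margin against a quantum attacker therefore
        # means moving from 128-bit to 256-bit keys.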
  • by russotto (537200) on Sunday March 23, 2008 @12:02PM (#22836786) Journal

    As far as I know, it is not known whether quantum computers can solve NP-hard problems in polynomial time. To say that they fail at NP problems may be premature.
    Seeing as it hasn't even been proven that P != NP for ordinary computers, it's very premature.
  • Re:"polynomial time" (Score:1, Informative)

    by Anonymous Coward on Sunday March 23, 2008 @12:30PM (#22836932)
    So, you have two algorithms to perform a task. One is n^3, the other is 2^n; the first takes twice the space of the second. If the input size is n > 11, which do you probably want?
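
    To see where that crossover sits, a tiny sketch (Python, illustrative):

        # Compare a cubic and an exponential cost near the crossover.
        for n in range(8, 16):
            cheaper = "n^3" if n**3 < 2**n else "2^n"
            print(f"n={n:2d}  n^3={n**3:5d}  2^n={2**n:5d}  cheaper: {cheaper}")
        # The cubic wins from n = 10 onward (1000 < 1024), and the gap
        # explodes: at n = 64, n^3 is ~2.6e5 while 2^n is ~1.8e19.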
  • by blueg3 (192743) on Sunday March 23, 2008 @01:02PM (#22837130)
    It appears from the first comment on the post that the post isn't really accurate. Shor's factoring algorithm is O(k) in the number of qubits and O(k^3) in the number of operations. This doesn't mean that the number of gates in the quantum computer is O(k^3); it means that the time it takes to execute the algorithm is O(k^3). The discrepancy may be a result of not agreeing on terminology. I haven't checked this out thoroughly, but glancing at my copy of Mermin's "Quantum Computer Science" confirms that it's k^3 in time, and only k in space.

    While it's clear a quantum computer won't be breaking your RSA keys any time soon, there's a big difference between remarking that a 4096-bit key will require "billions" of qubits and the more correct claim that it will require thousands of qubits (at least 20k qubits).
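
    For what it's worth, the disputed figure reproduces exactly under the 72k^3 estimate quoted upthread; what's contested is only whether it counts hardware gates or sequential operations. A quick check:

        k = 4096            # modulus size in bits (a 4096-bit RSA key)
        ops = 72 * k**3     # gate *applications* in Shor's algorithm,
                            # per the 72k^3 estimate quoted upthread
        print(f"{ops:,}")   # 4,947,802,324,992 -- the "~5 trillion" figure;
                            # these are operations executed over time, while
                            # the qubit count scales only as O(k)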
  • by ScrewMaster (602015) on Sunday March 23, 2008 @01:15PM (#22837222)
    I tend to agree. When promising new technologies take too long to develop, by the time they become practical they are often supplanted by something else. We will have both, since our civilization's need for both more energy and more processing power is going to continue unabated for a long time.

    We may never have either nuclear fusion or quantum computing, as currently envisioned. As you say, it's impossible to predict. All we can say with some assurance is that we'll probably figure out something that will achieve the same ends.
  • by Muhammar (659468) on Sunday March 23, 2008 @01:29PM (#22837308)
    Since the Sci Am article is subscription-only, you may want to check the complete draft version of Scott Aaronson's article for Scientific American (before the editors' changes) here:

    http://www.scottaaronson.com/writings/limitsqc-draft.pdf [scottaaronson.com]

    Scott posted it on his blog on 2/18, see http://www.scottaaronson.com/blog/ [scottaaronson.com]

    (The blog is often quite technical, as you'd expect, but it's funny and worth following just for its non-technical bits. Circumcision and Australian models are also discussed on a frequent basis.)
  • Re:Shor's Algorithm (Score:1, Informative)

    by Anonymous Coward on Sunday March 23, 2008 @01:45PM (#22837406)
    Since when is O(N^(1/4) polylog(N)) comparable to O((log N)^3)?

    Besides, if you believe that some method is faster in practice than what the theory says, there are some RSA challenges out there with big money prizes for you to win.
  • by TheRaven64 (641858) on Sunday March 23, 2008 @02:03PM (#22837502) Journal
    One time pads are basically useless in this scenario. In order for a OTP to be useful for data transfer, you need to exchange a pad of the same size as the message via some secure mechanism. If you can exchange the key securely, then you may as well exchange the data using that mechanism instead. The only time it is useful is for things like military or diplomatic use where the sender and receiver are in the same physical location for a while (e.g. an ambassador before he goes to the embassy) and can securely exchange keys. OTPs do not give you security, they just let you move it around.
  • by blueg3 (192743) on Sunday March 23, 2008 @02:14PM (#22837546)
    No, this is why you don't use a secret-key system like 3DES or AES. Essentially, the number of keys you need to distribute scales with the number of pairs of communicating parties. If you have N parties all wanting to communicate with each other, that's O(N^2) keys, as the sketch below shows. This is fairly unlikely, though. If you have, say, S servers (banks, etc.) and C clients, it's more like O(S*C) -- though a client-server pair may use separate keys for different tasks.
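
    In numbers (a small sketch; the party counts are just examples):

        # Symmetric crypto: every communicating pair needs its own secret
        # key, i.e. N choose 2 keys. Public-key crypto: one key pair each.
        def pairwise_keys(parties):
            return parties * (parties - 1) // 2

        for n in (10, 1_000, 22_000):
            print(f"{n:6d} parties: {pairwise_keys(n):,} pairwise keys "
                  f"vs {n:,} public-key pairs")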

    One-time pads cannot be reused. That's why they're called one-time. This means that the quantity of one-time pad data that needs to be securely shared between two communicating parties is equal to the quantity of data they want to exchange. Say a bank transaction involves communicating 1k of data. The bank needs to give you -- securely -- 1k of OTP data per transaction you'll want to make. Generally, this is inconvenient. Since real one-time pads are unbreakable, this just means the vector for attack is moved somewhere else -- most likely, how you communicate the OTP data. (If someone can impersonate you to a bank employee and get a few transactions' worth of pad, the security of a one-time pad is irrelevant. If you use an existing cryptographic algorithm to exchange one-time pad data, you're wasting time -- simply transmit your real data using this algorithm.) One-time pads have been used, but only in a few specific situations. In general, they're not useful.

    Applying a one-time pad to data is generally done by simply XORing the pad (which consists of random bits) with the data. Reusing a one-time pad is incredibly easy to break.
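
    A minimal sketch of that XOR construction, and of why reuse is fatal (Python; the messages are made up):

        import secrets

        def xor_bytes(data: bytes, pad: bytes) -> bytes:
            # Encryption and decryption are the same operation, because
            # XOR is its own inverse: (p ^ k) ^ k == p.
            assert len(pad) >= len(data), "pad must cover the whole message"
            return bytes(b ^ k for b, k in zip(data, pad))

        pad = secrets.token_bytes(16)   # truly random, shared securely, used once
        ct1 = xor_bytes(b"PAY ALICE $100..", pad)
        assert xor_bytes(ct1, pad) == b"PAY ALICE $100.."

        # Reuse is what breaks it: XORing two ciphertexts made with the
        # same pad cancels the pad, leaving plaintext1 XOR plaintext2,
        # which classical frequency analysis can often unravel.
        ct2 = xor_bytes(b"PAY BOB $9000...", pad)
        leak = xor_bytes(ct1, ct2)      # no key material left in here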
  • by slew (2918) on Sunday March 23, 2008 @02:28PM (#22837628)
    I'm afraid you'll have to look those physics books back up.

    Although quantum computers do use basic entanglement for creating superpositions, understanding Shor's algorithm (the one everyone is concerned about, since it factors in polynomial time) is mostly just understanding quantum superposition. Entanglement gives generic quantum computers their great parallel processing power by explaining how quantum probability waves combine under superposition, but Heisenberg limits the computing power of a quantum computer in a non-trivial way as well, because after you collapse the wave functions by measurement you give up the parallel processing enabled by entanglement (e.g., if you peek inside the oven, it stops working; if some of the heat leaks out of the oven even with the door closed, it doesn't work as efficiently; the oven being the quantum computer).

    FWIW, Shor's algorithm essentially converts factoring into a period-finding exercise. You might imagine that's easy to do if you had a machine which, given a bunch of superimposed waves with a certain modulo structure, could tell you the period (hint: the ones that don't match a specific period self-interfere and measure as zero, while the ones with that period self-reinforce). With a quantum computer you do this all in parallel with superimposed probability waves, and when you measure, the highest-probability result is the one that doesn't self-interfere (the ones that self-interfere have probability near zero). Basically this measurement is wave function collapse, which doesn't actually depend on entanglement or Heisenberg to understand (although it does require you to believe in QM wave functions and measurement operators).

    Entanglement is really a strange artifact of QM that explains probability correlations seen in QM experiments that can't be explained classically. It's really more an artifact of the existence of probability amplitude waves (the QM wave function) than an effect that directly enables the quantum computer. Of course, if you didn't have QM wave functions you wouldn't have a quantum computer, so I guess that's a chicken-and-egg scenario. Entanglement is like the "carburetor" of the quantum computer: the computer works by superimposing QM wave functions, and when you have more than one wave function they become entangled as you superimpose them, and the way the waves entangle helps you compute in parallel, so it's important to understand how they entangle.

    Heisenberg's principle is a consequence of wave function collapse (measurement), which also limits the quantum computer (this limiting effect is often called decoherence). Heisenberg isn't required while a quantum computer is computing, but you need to see the result somehow, and when you measure it, one of the side effects is the Heisenberg principle (although that's also a chicken-and-egg problem, since the principle is a consequence of wave function collapse, and without QM there's no superposition computing). The closest analogy I can think of is that Heisenberg's principle is the "heat" caused by friction in the quantum computer: you need friction to stop the computer and read out the result, but you can't get rid of a little friction while it's running either (causing decoherence), and the side effect of that friction is heat.

    You may have a personal opinion that superposition is a "nice way of doing statistics using discrete values for covering the not so discrete results of experiments", but there is experimental evidence that this opinion is at odds with physical reality. Since quantum computers that do quantum computing have already been implemented (including IBM's NMR experiment, which ran Shor's algorithm), it's hard to refute that something non-classical is going on.

    It may be that in the end QM is total malarkey and there's some other weird, unexpected thing going on, but there are mountains of evidence that whatever is going on, it isn't as simple as "hidden variables".
  • by Phroon (820247) on Sunday March 23, 2008 @02:37PM (#22837690) Homepage

    1. Entanglement. Is this a fact or a theory? Looking on the web I found only a few experiments with some possible loopholes. I find the principle hard to grok.

    2. Heisenberg principle. It mainly states that by observing an object you change the state of the object. The Heisenberg example from Wikipedia uses a photon to measure the position of an electron, and the photon changes the position of the electron. What happens if you use a smaller particle that doesn't impact the electron so much? Are you going to change the constant? Looks mostly like a limitation from a time when the atom was considered to be indivisible.
    The Wikipedia entries are written from the perspective of a physicist, so they aren't going to be much use to laypeople.

    1. Entanglement: It is a fact. If you send a photon through a certain type of non-linear crystal, two photons will emerge that are entangled quantum mechanically. Truly understanding this requires some knowledge of quantum mechanics; a basic introduction to QM and entanglement can be found here [ipod.org.uk] and here [davidjarvis.ca] if you care to learn more.

    2. Heisenberg principle: You inadvertently stumbled onto the problem yourself, kinda. When trying to measure the position of the electron, you use a high-energy photon, because its short wavelength resolves the position well. But when this high-energy photon interacts with the electron, it alters the velocity of the electron, so you know less about the velocity. When trying to measure the velocity of the electron, you use a low-energy photon. This low-energy photon measures the velocity well, but its long wavelength can't pin the electron down, so you don't know its position. This trade-off is the essence of the Heisenberg uncertainty principle.
  • by smilindog2000 (907665) <bill@billrocks.org> on Sunday March 23, 2008 @02:42PM (#22837724) Homepage
    Actually, quantum computers don't work quite like that. The result of a quantum computation is the superposition of all the possible results. At the end, you have to collapse the wave function, and you wind up with one specific result, drawn randomly from all the possible results. To factor large integers, you rotate the qubits 90 degrees after doing the computation, and somehow that's similar to doing an FFT. If you run the computation many times, statistically you get results bunched into clumps, and the distance between clumps tells you what the factor is.

    There's no evidence to date that a quantum computer can be used to solve NP complete problems, and most people seem to think factoring is in a class in between P and NP-complete [scottaaronson.com].
  • by Fëanáro (130986) on Sunday March 23, 2008 @02:48PM (#22837784)
    I think you are mistaken. It has been a while, but I remember NP like this:

    What you described is the property "NP-hard".
    For a problem to qualify as NP-complete, it is also necessary that an algorithm that solves it can be used to solve every problem in NP, with only an additional transformation of its input and output in polynomial time.

    Prime factoring is not known to be NP-complete. There is, as far as I know, no transformation for the input and output of a prime-factoring algorithm that would allow it to solve NP-complete problems as well.

    If prime factoring were NP-complete, then, since a quantum algorithm is known for it, it would be certain that a quantum computer could also solve every problem in NP.

    As far as I know, no polynomial-time quantum algorithm has been found for any NP-complete problem. So we do not know whether a quantum computer could do this.
  • by SiliconEntity (448450) on Sunday March 23, 2008 @04:08PM (#22838342)
    Please mod the parent up. He makes two excellent points: first, that Schneier did not write the essay and basically has nothing to do with it; and second, that the essay is completely wrong, as pointed out by the first comment replying on the essay page. Shor's algorithm will not take 72*k^3 qubits or gates; it takes about k qubits and then goes through O(k^3) steps to get the answer. Everything about the posting is wrong.

  • by wurp (51446) on Sunday March 23, 2008 @06:10PM (#22839202) Homepage
    Parent post is absolutely correct; grandparent is absolutely wrong.

    Read Scott Aaronson's blog [scottaaronson.com] to get a clue about quantum computing.

    Also read about Shor's algorithm, which is the known algorithm to factor large numbers in time polynomial in log(n) *if your quantum computer has enough entangled qubits to represent the number*. Again, though, remember that FACTORING IS NOT NP COMPLETE, only NP hard. Other NP hard problems are harder than factoring (for example, any NP complete problem ;-).

    Also read about Grover's algorithm, which is a general algorithm to solve NP complete problems, and which HAS BEEN PROVEN TO BE THE FASTEST way to solve the NP complete problem of lookup in an unordered dictionary. Grover's algorithm finds the answer in n^1/2. Obviously if the fastest algorithm to solve a specific NP complete problem is n^1/2, you cannot have a way to solve all NP complete problems in log(n).

    Look up Grover's algorithm and Shor's algorithm on Wikipedia, and you'll see that the GP is speaking beyond his knowledge.

    I should say that I also made the mistake of thinking factoring was NP complete, and made a fool of myself on Scott's blog before the many, many people more knowledgeable than I about QC on that forum corrected me.
  • by mrmeval (662166) <mrmeval&gmail,com> on Sunday March 23, 2008 @10:27PM (#22841348) Journal
    Every bit of traffic needs a bit from your one-time pad, and that limits how long the pad lasts. Yes, there are storage solutions that hold gigabytes, but if you're using the pad for a lot of data, it's used up fast. An OTP has to be one-time use, or you might as well use billboards.

    Both parties need the same pad. You need to be able to ship that pad to them, or hand it to them, and be sure no one snooped or snoops the OTP. If the pad is compromised, how do you inform the other party it has been tainted? Unless you go talk to them in person, you could phone, mail, or email, but those can be faked. You could have a standby code word, but eh... complexity.

    Now you have to talk to 22,000 people, each with a different pad. Then there are websites.

    Governments would have thousands of people to handle all those details, and all they were managing were 128-bit keycards (paper things, way before your time). Classified couriers, locked vaults, etc., etc. It was, and still is, insanely expensive.

  • by da cog (531643) on Sunday March 23, 2008 @11:39PM (#22841840)

    2. Heisenberg principle: You inadvertently stumbled onto the problem yourself, kinda. When trying to measure the position of the electron, you use a high-energy photon, because its short wavelength resolves the position well. But when this high-energy photon interacts with the electron, it alters the velocity of the electron, so you know less about the velocity. When trying to measure the velocity of the electron, you use a low-energy photon. This low-energy photon measures the velocity well, but its long wavelength can't pin the electron down, so you don't know its position. This trade-off is the essence of the Heisenberg uncertainty principle.

    Actually, the problem is more fundamental. What is really going on is that momentum is wavelength -- more precisely, p = h/lambda, so momentum is inversely proportional to wavelength -- and what we perceive as momentum on a large scale is actually an average approximation of the existence of wavelength on a small scale. This is not intuitive at all; the only reason we believe it is because of countless experiments. But once you are willing to believe this, you see that it is logically impossible to know the position and momentum of a particle at the same time, even if you had the godlike power to measure the electron's properties without touching it. This is because for it to have both an exact momentum and an exact position, it would have to simultaneously be a perfectly non-localized wave and a perfectly localized point, which is nonsense.
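
    In symbols (standard results, stated here for reference):

        % de Broglie relation: momentum is inverse wavelength
        % (h is Planck's constant)
        p = \frac{h}{\lambda}

        % A single exact wavelength means a wave spread over all space;
        % localizing the wave requires mixing wavelengths. Quantitatively:
        \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}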

  • by sydneyfong (410107) on Monday March 24, 2008 @07:31AM (#22843550) Homepage Journal

    Again, though, remember that FACTORING IS NOT NP COMPLETE, only NP hard.
    Wrong (emphasis mine). Factoring is in NP, but not (known to be) NP-complete. Which means that factoring is not (known to be) NP-hard. (The only NP-hard problems in NP are the NP-complete problems.) A more colloquial way to explain it is that NP-hard problems are at least as hard as NP-complete problems, yet factoring is believed to be "easier" than the NP-complete problems; see the definitions below.
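
    For reference, the standard definitions the thread keeps tripping over:

        % L is NP-hard if every problem in NP reduces to it in
        % polynomial time:
        L \text{ is NP-hard} \iff \forall L' \in \mathrm{NP}:\ L' \le_p L

        % L is NP-complete if it is NP-hard *and* itself lies in NP.
        % Factoring is in NP but is not known to be NP-hard, hence not
        % known to be NP-complete.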

    Also read about Grover's algorithm, which is a general algorithm to solve NP complete problems, and which HAS BEEN PROVEN TO BE THE FASTEST way to solve the NP complete problem of lookup in an unordered dictionary. Grover's algorithm finds the answer in n^1/2. Obviously if the fastest algorithm to solve a specific NP complete problem is n^1/2, you cannot have a way to solve all NP complete problems in log(n).
    Do you even know what you are talking about? Grover's algorithm is not used to solve NP-complete problems!! You've already said it yourself: it's a way to look something up in an unordered dictionary, which is not an NP-complete problem AT ALL. In addition, it's nowhere proven to be the fastest way to solve NP-complete problems. If you could prove an algorithm is the fastest way to solve NP-complete problems, you'd win a million bucks [claymath.org]. Do you even know what NP-complete problems look like?

    Look up Grover's algorithm and Shor's algorithm on Wikipedia, and you'll see that the GP is speaking beyond his knowledge.
    And you are too. (And I think I am too ;-p)

    ===

    I never thought I knew anything about theoretical computer science until topics like this come up and people who have no idea what they are talking about get modded up...

"The Amiga is the only personal computer where you can run a multitasking operating system and get realtime performance, out of the box." -- Peter da Silva

Working...