Dan Simon on Quantum Crypto

I'm not an expert on quantum computing, but luckily EG reader Dan Simon is. The other day in the comments section he explained why he doesn't think it's very relevant to cryptography. It's worth a read:

My impression is that watermarking is to signal processing what quantum computing is to theoretical physics, or cryptography to number theory: a way for a massive oversupply of researchers in a once-proud field to make a claim to relevance.

...

Basically, there is one thing that quantum computers have been found to be capable of doing much better than classical computers. That one thing has been characterized variously as "finding hidden subgroups", "solving the abelian stabilizer problem", or "finding periodicities in abelian groups". Because this one thing happens to lead to polynomial-time algorithms for integer factoring and discrete log, quantum computers have been bandied about as an incredible new computing technology, but the truth is that this one thing is really very limited in scope, and in a decade and a half, nobody's found another significant application for it.

Moreover, there are lots of (admittedly informal) reasons for believing that quantum computers can't really do anything interesting beyond this one thing. So we're left with a technology that, even if perfected*, is unlikely to be able to accomplish anything of interest beyond solving a certain narrow class of number theory problems.**
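To make that "one thing" concrete: Shor's algorithm gives a fast quantum subroutine for order finding (period finding), and the rest of the factoring algorithm is classical. Here's a minimal Python sketch of that classical reduction, with the quantum step replaced by brute force; the names and structure are mine, purely for illustration:

```python
from math import gcd
import random

def order(a, n):
    # Brute-force order finding: the one step a quantum computer
    # speeds up (Shor's period-finding subroutine). Returns the
    # smallest r > 0 with a^r = 1 (mod n). Exponential classically.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n):
    # Classical reduction from factoring to order finding.
    # Assumes n is odd, composite, and not a prime power.
    while True:
        a = random.randrange(2, n)
        g = gcd(a, n)
        if g > 1:
            return g              # lucky: a already shares a factor with n
        r = order(a, n)
        if r % 2:
            continue              # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue              # a^(r/2) = -1 (mod n) tells us nothing
        return gcd(y - 1, n)      # (y-1)(y+1) = 0 (mod n), so this is a factor

print(factor(3 * 5 * 7 * 11))     # prints a nontrivial factor of 1155
```

On a quantum machine only order() changes: the quantum Fourier transform finds the period in polynomial time, and everything else stays classical. That, plus the analogous statement for discrete log, is essentially the scope Dan is describing.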

Dan goes on to observe that there are other public key algorithms not in wide use that don't appear to be vulnerable to quantum computing.

This brings us to another class of people besides quantum computing researchers with an interest in hyping the technology: people working on alternatives to factoring and discrete-log based cryptosystems. The deployment cycle of new public key algorithms is incredibly slow: to a first order, everyone outside the government is still using RSA. This means that new public key algorithms with similar "interfaces" to existing algorithms (e.g., they're interchangeable but faster or more secure, etc.) don't have much of a real-world value proposition outside of specialized niches, especially as there are a whole slew of existing algorithms with better properties based on elliptic curves, pairings, etc. But if QC actually worked, then those systems would all be broken and we'd need to reinvent them based on different problems: instant job security for cryptographers.

7 Comments

Technically speaking, using ECC *is* DH, just in a different group; we all know what you meant.

ECC started out with obvious security parameters which have remained basically unchanged - there has hardly been any new cryptanalysis of it since it was first proposed. DH modulo a prime, and even more so RSA, have by contrast repeatedly had new insights into how to attack them, resulting in a need for longer key sizes. The conclusion from the crypto community has been that we understand RSA better, because we've had the most 'progress' on it, hence we have more confidence in it than in the other systems. In any other engineering discipline people would draw the exact opposite conclusion.

The alternatives to RSA are pretty good. DH in ECC blows everything else out of the water in terms of speed and key and signature sizes, and it's possible to build public key signature (but not encryption) schemes using plain old symmetric primitives, which results in longish signatures, possibly as big as 20k, but that's not an issue for many, if not most, applications these days. Honestly, given the objectively awful history of trying to make a secure encoding of a protocol using RSA, I have a hard time believing that the preference for it is due to anything but branding and hype. Just about any yutz can make a secure protocol using DH, with a very small list of gotchas which can be added after the fact, while the best in the world can't seem to make an RSA-based protocol which remains unbroken for a few years with any reliability.
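For reference, the simplest example of the symmetric-primitive signature schemes Bram mentions is Lamport's one-time scheme (real systems use Merkle-tree descendants of it). A toy sketch with SHA-256; note the roughly 8 KB signatures, in the size range he describes, and that each key can safely sign only one message:

```python
import hashlib, secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # Two random 32-byte preimages per message bit: one for 0, one for 1.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(s0), H(s1)] for s0, s1 in sk]
    return sk, pk

def bits(msg):
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal the preimage matching each bit of H(msg): 256 * 32 bytes = 8 KB.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pk[i][b] for (i, b), s in zip(enumerate(bits(msg)), sig))

sk, pk = keygen()
sig = sign(sk, b"quantum computers can't break this")
assert verify(pk, b"quantum computers can't break this", sig)
# One-time only: signing a second message with the same sk leaks preimages.
```

Security rests entirely on the hash function being one-way, which is why schemes in this family are the usual answer to "what survives a quantum computer."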

Bram, I'm not sure I agree we haven't seen new ideas in the elliptic-curve discrete logarithm problem. What about the Menezes-Okamoto-Vanstone reduction? Or Weil descent? Or the abelian variety index calculus ideas of Gaudry and Diem?

I actually don't think the preference for RSA is a matter of hype or branding. Rather, it's basically an issue of installed base: it's trivial to get an RSA certificate and an implementation which will accept it. Try getting an EC certificate. Now try finding a client that will accept it. Obviously, this could change, but since crypto isn't really the weak point in the system, what's people's incentive to change? That's why we're still using PKCS#1 v1.5 and not OAEP or PSS.

Moreover, I'm not sure I agree with your assessment of the difficulty of building protocols with RSA. It's true that there have been attacks on the use of RSA in SSL/TLS (I'm thinking of Bleichenbacher and the Klima timing analysis that followed it), but there have been attacks on the other primitives as well, including timing analysis and Bleichenbacher's result on extracting DSA private keys remotely if k is biased... And of course we've seen problems with even much simpler primitives: look how hard it turns out to be to use even a secure block cipher in CBC mode.
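For anyone who hasn't run into it, counter mode, which comes up later in this thread, shows how little machinery a mode of operation really needs: encrypt a counter, XOR the result into the data. A minimal sketch, assuming pycryptodome for the raw AES primitive; a real design would also add a MAC and must never reuse a nonce under the same key:

```python
# Assumes pycryptodome (pip install pycryptodome) for the raw AES primitive.
from Crypto.Cipher import AES
import secrets

def ctr_xcrypt(key, nonce, data):
    # Counter mode: encrypt nonce||counter, XOR the keystream into the
    # data. The same call encrypts and decrypts, there's no padding to
    # get wrong, and blocks can be processed in parallel.
    ecb = AES.new(key, AES.MODE_ECB)
    out = bytearray()
    for i in range(0, len(data), 16):
        keystream = ecb.encrypt(nonce + i.to_bytes(8, "big"))
        out += bytes(a ^ b for a, b in zip(data[i:i + 16], keystream))
    return bytes(out)

key, nonce = secrets.token_bytes(16), secrets.token_bytes(8)
ct = ctr_xcrypt(key, nonce, b"attack at dawn")        # 14 bytes in, 14 out
assert ctr_xcrypt(key, nonce, ct) == b"attack at dawn"
```

CBC, by contrast, needs padding and an unpredictable IV, and both requirements have produced real attacks (Vaudenay's padding oracles being the classic example).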

Hovav, there's been new math, but the security parameters have only been moved by CPU increases.

EKR, fair enough about the installed base, but I at one point went around asking people if they could give me a list of all the gotchas in encoding RSA, and not one of the people I spoke to was willing to claim to have such a list even with research. By contrast, when using DH the guidelines are basically 'use one of the moduli in the RFC, and check for zero and one'. Not that I think anyone who isn't an expert should play with such things, just that even if someone is an expert it's a lot more prudent to use DH.

I'm really not a big fan of CBC mode, because it has no actual benefits over counter mode, but a lot of false apparent benefits to the inexperienced.
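Bram's DH checklist is short enough to show in full. A toy sketch, with a deliberately tiny group; real code would use one of the standardized moduli he refers to (e.g., the RFC 3526 MODP groups):

```python
import secrets

# Toy parameters for illustration only; never use a 23/5 group in practice.
p, g = 23, 5

def dh_keypair():
    while True:
        x = secrets.randbelow(p - 3) + 2      # secret exponent in [2, p-2]
        y = pow(g, x, p)
        if 1 < y < p - 1:                     # with a real-size p this never loops
            return x, y

def dh_shared(x, peer_public):
    # The "check for zero and one" (and, conventionally, p-1): degenerate
    # public values force the shared secret into a tiny known set.
    if not 1 < peer_public < p - 1:
        raise ValueError("degenerate DH public value")
    return pow(peer_public, x, p)

xa, ya = dh_keypair()
xb, yb = dh_keypair()
assert dh_shared(xa, yb) == dh_shared(xb, ya)
```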

Bram, I think recommended parameter settings have changed together with the new math. For example, are we still using 160-bit supersingular curves?

While we're at it, why do you say that the DH-in-ECC setting yields the shortest signatures?

Hovav, ECC signature sizes completely blow away RSA signature sizes, by an order of magnitude. Unless you believe that prudent security parameters of ECC are vastly larger than is currently accepted, it's a complete blowout.

The question wasn't whether DH-in-ECC signatures blow away RSA signatures; it was whether the DH-in-ECC setting yields the shortest signatures of any common setting for crypto. It does not.

As for parameter choices, here's a concrete question: For the curve E: y^2 + y = x^3 over GF(2^m), what size would you pick for the field in order to obtain 80-bit security? I myself would recommend 600 bits or so, say m=601.
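For readers wondering where a number like m = 601 might come from, here's one way to reconstruct it (my reasoning, not necessarily Hovav's): this curve is supersingular with trace zero, so for odd m it has 2^m + 1 points, and the MOV reduction maps its discrete log into GF(2^(2m)), where index calculus applies. A quick sanity check in Python:

```python
m = 601
n = 2**m + 1     # number of points on E over GF(2^m) for odd m (trace zero)

# Embedding degree: the smallest k such that n divides 2^(m*k) - 1.
k = next(k for k in range(1, 10) if (2**(m * k) - 1) % n == 0)

print(k)         # 2: the pairing maps the ECDLP into GF(2^(2m))
print(m * k)     # 1202: the attacker gets a 1202-bit binary field
```

A 1202-bit binary field is roughly the 80-bit ballpark against index calculus, while a 160-bit version of this curve would hand the attacker a mere 320-bit field, which is exactly the point about parameters moving with the math.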
