On long cryptographic keys

The appropriate cryptographic key length for any given application is a popular topic in the communications security/cryptography community. There's a fair amount of consensus on the resistance of any particular algorithm/key-length pair to the best known attacks.
  • To a first order, the strength of symmetric encryption algorithms against the best known attacks is the length of the key itself.
  • Currently the strength of our standard digest algorithms (SHA-x) against preimage attacks is roughly that of the output length, and against collision attacks it's something less than half the output length (this is a topic of active research on SHA-1, which currently stands at 63 bits as opposed to the theoretically optimal 80 bits).
  • The situation is somewhat less clear for asymmetric algorithms: again, you can see keylength.com for details, but a 1024-bit RSA key is very approximately equivalent to an 80-bit symmetric key, a 2048-bit RSA key to 100 bits or so. Elliptic curve keys of size 2n are approximately equivalent to symmetric keys of size n.
Again, this all assumes the best known current attacks.
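As a rough sketch, the equivalences above can be captured in a small lookup table. The figures are the approximate ones quoted in this post (with 3072 as the usual precise version of "~3000"); see keylength.com for real tables.

```python
# Approximate strength equivalences under the best known classical attacks.
# Keys: nominal symmetric bits; values: (approx. RSA modulus bits,
# approx. EC key bits). Figures are the rough ones quoted in the post.
EQUIV = {
    80:  (1024, 160),
    100: (2048, 200),
    128: (3072, 256),  # "~3000" in the post; 3072 is the usual NIST figure
}

def symmetric_equivalent_of_ec(ec_bits):
    """An elliptic curve key of size 2n is roughly as strong as an n-bit symmetric key."""
    return ec_bits // 2

def collision_strength(hash_output_bits):
    """Collision resistance of an ideal hash is about half its output length."""
    return hash_output_bits // 2

print(symmetric_equivalent_of_ec(256))  # 128
print(collision_strength(160))          # 80 (SHA-1's theoretical optimum)
```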

There is also some consensus on the near-term lifespan of any set of parameters: keylength.com has a good summary here. What's a lot less clear is the long term. For instance, ECRYPT's recommendation is that 128-bit symmetric keys are fine for "Long term protection: Generic application-independent recommendation, protection from 2009 to 2038." NIST's recommendations aren't that different, though they recommend 192 bits for >> 2030 and 256 bits for >>> 2030, whatever that means. In general, guidelines for keys > 128 bits are somewhat vague. The basic problem is that it's very hard to imagine any circumstances in which a 128-bit symmetric key will become susceptible to brute-force attack with conventional computers. Hopefully Dan Simon will chime in here, but as I understand the situation, if we got quantum computers working they would potentially offer a significant improvement against the main public key algorithms and a potential square-root improvement against symmetric algorithms. Of course, we're nowhere near having a working quantum computer anything like this good.
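To put numbers on the brute-force claim, here's a back-of-the-envelope sketch. The attacker scale (a million machines at a trillion keys per second each) is an illustrative assumption, not a figure from the post, and the Grover calculation just encodes the square-root speedup mentioned above.

```python
# Back-of-the-envelope: why 2^128 is out of reach for conventional search,
# and what a quantum square-root speedup would do to effective strength.
SECONDS_PER_YEAR = 365 * 24 * 3600

def years_to_brute_force(key_bits, keys_per_second):
    """Wall-clock time to exhaust a key space at a given search rate."""
    return 2**key_bits / keys_per_second / SECONDS_PER_YEAR

def grover_effective_bits(key_bits):
    """A square-root speedup on unstructured search halves the effective key length."""
    return key_bits // 2

# Hypothetical attacker: 10^6 machines * 10^12 keys/second each.
rate = 10**6 * 10**12
print(f"{years_to_brute_force(128, rate):.1e} years")  # on the order of 10^13 years
print(grover_effective_bits(128))  # 64 effective bits against quantum search
```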

Absent working quantum computers, the only plausible avenue of attack on any system with an effective key strength of 128 bits is an analytic attack of some kind. Each asymmetric algorithm is basically the same at any key length (the keys are just bigger), so you would expect any analytic attack at one key size (e.g., 1024 bits) to also be useful at another (e.g., 2048 bits). The relationship between AES-128, AES-192, and AES-256 is more complicated, but the underlying structure is the same, so you would expect an attack on AES-128 to have an impact on AES-256 as well. Beyond that, it's very hard to predict what form such attacks would take; otherwise those doing the predicting would be hard at work on their CRYPTO submissions. In particular, it's hard to predict what the relative strength of the algorithms would be after such an attack.

This isn't to say that AES-256 isn't likely to be stronger than AES-128—most likely it is, even if there are better analytic attacks, if for no other reason than that it uses more rounds—but it's not really 2^{128} times stronger in any meaningful way. Either there are no attacks better than those we have now, in which case there's no practical attack and AES-256 is just super-duper secure (but infinity * 2^{128} = infinity), or there are better attacks, in which case all bets are off. Much the same is true for the asymmetric algorithms.

A related issue is "matching" the strength of the symmetric algorithms to the asymmetric algorithms. In protocols such as TLS, one usually uses an asymmetric algorithm (e.g., RSA) for key establishment and a symmetric algorithm (e.g., AES) for encryption. Some communities (the NSA especially) are very hot on choosing a set of algorithms with roughly equivalent nominal strengths (within the assumptions listed above). This can be quite problematic, however, because of the quantization of the symmetric algorithms. For instance, it's very natural to use AES-128, but if you really only care about having 80 bits of equivalent security, it's not at all clear why the fact that you're using AES-128 should drive you to use 3000-bit RSA keys; it's just that the minimum standard symmetric algorithm is AES-128. Similarly, if you're worried about cryptanalysis of AES so you've decided to use AES-256, it's not clear that this implies you ought to be using an equivalent RSA key size (15000 bits!). Flipping things around, if you're worried about quantum computers, the improvement for symmetric algorithms isn't the same as for asymmetric algorithms, so it's not clear why it would make sense to just dial up the strengths in parallel under either theory.
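The quantization point can be made concrete with a small sketch. The equivalence figures below are the standard NIST-style ones (3072 and 15360 being the precise versions of the ~3000 and ~15000 numbers above); the helper names are mine.

```python
# Sketch of the quantization mismatch: standard symmetric key sizes come in
# coarse steps, while nominal "matched" RSA sizes grow very rapidly.
# NIST-style equivalences: symmetric bits -> approx. RSA modulus bits.
RSA_EQUIVALENT = {80: 1024, 112: 2048, 128: 3072, 192: 7680, 256: 15360}
AES_SIZES = (128, 192, 256)

def minimum_aes(target_bits):
    """Smallest standard AES key size meeting a target security level."""
    return next(k for k in AES_SIZES if k >= target_bits)

# Wanting only 80 bits of security still forces AES-128, and strength-matching
# that AES-128 choice would then demand a 3072-bit RSA key:
aes = minimum_aes(80)
print(aes, RSA_EQUIVALENT[aes])  # 128 3072
```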

6 Comments

but as I understand the situation if we got quantum computers working they would potentially offer a significant improvement against the main public key algorithms and a potential square root improvement against symmetric algorithms.

Correct on public key. I don't know of any group-theory-based public key algorithm (RSA, DSA, Diffie-Hellman, and the EC cryptos) that survives quantum computing.

I'm not sure about the symmetric key side. I think it's "sqrt(N) rather than N for unstructured search", but I THINK it's "non-parallelizable", so there's no time-space tradeoff.

Minor point, but isn't half of 80 bits = 79 bits, not ~63 bits?

Sorry, the strength of a perfect hash of output length 2n against collision attacks is 2^n, i.e., n bits. However, the strength of SHA-1 is more like 2^{63} rather than 2^{80}. Partly there is some confusion about measuring in bits versus measuring in size, which is 2^{bits}.
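That birthday bound is easy to check empirically with a toy hash: truncating SHA-256 to a small output (an illustrative stand-in, not a real attack) makes collisions show up after roughly the square root of the space, not the full space.

```python
import hashlib

def toy_hash(data, bits=24):
    """Truncate SHA-256 to `bits` bits so collisions become findable."""
    h = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return h >> (256 - bits)

def find_collision(bits=24):
    """Birthday search: a collision is expected after roughly 2^(bits/2) tries."""
    seen = {}
    i = 0
    while True:
        d = toy_hash(i.to_bytes(8, "big"), bits)
        if d in seen:
            return seen[d], i  # two distinct inputs with the same truncated digest
        seen[d] = i
        i += 1

a, b = find_collision(24)
print(b)  # number of tries: typically a few thousand (near 2^12), far below 2^24
```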

For what it's worth, I agree completely with your entire analysis here. (And it sounds like you share my annoyance at "effective key length equivalence" obsessives.)

The only thing I can think of to add is that if you're the sort of person who worries about advances in quantum computing, applying Grover's algorithm for quantum brute-force search to, say, AES-128 would only require keeping a couple of hundred qubits coherent, whereas applying Shor's algorithm to RSA-1024 would require a couple of thousand qubits. On the other hand, the qubits would likely have to be kept coherent a lot longer in the AES case than in the RSA case, since the factoring algorithm is polynomial-time once implemented, whereas Grover's algorithm, as you point out, only reduces the brute-force time by an exponent of one-half. So depending on what quantum advance you consider more likely--superposition scale or superposition longevity--you may want to pad your asymmetric or symmetric key lengths with very different safety margins.

I have to say that the last sentence of Dan's comment should be up for security blog comment of the year. It's got quantum physics, it's got cryptography, it's got decision under uncertainty. All in one nice package.

Eric, I made a few posts on this topic last year; here is a quote (http://lukenotricks.blogspot.com/2008/12/spin-on-passwords-and-aes.html):

"Nonetheless AES-256 is being widely deployed since it conveniently lies at the intersection of good marketing and pragmatic security. In upgrading from AES-128 to AES-256 vendors can legitimately claim that their products use maximum strength cryptography, and key lengths can be doubled (thus squaring the effort for brute force attacks) for a modest 40% performance hit."

I agree that the only risk that requires moving to 256-bit keys is quantum computing. I think we introduce a reputational risk problem by using 256-bit keys since we are claiming that the cipher can resist any attack that uses materially less than 2^{255} resources. So an attacker who can find an attack using say 2^{200} resources can claim to have "broken" AES-256. Trying to control the publicity of such an "attack" would be quite hard.

rgs Luke
