
> Adam Caudill, the security researcher who blogged about the mass misissuance last weekend, pointed out that it’s easy to think that a difference of 1 single bit would be largely inconsequential when considering numbers this big. In fact, he said, the difference between 2^63 and 2^64 is more than 9 quintillion.

Okay, but, that's because 2^63 itself is more than 9 quintillion. Where the search space was previously 18 quintillion, it's now 9 quintillion. Both of those are "big". The attack is 50% easier than "theoretically impossible before certificate expiration," which should still mean that it's impossible.
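
A quick Python sanity check of the sizes involved (purely illustrative):

    >>> 2**64
    18446744073709551616
    >>> 2**63
    9223372036854775808
    >>> 2**64 - 2**63 == 2**63   # the "9 quintillion" lost is exactly the space that remains
    True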




Or, if the safety margin here is really only one bit, we should probably increase the minimum. If 63 is unsafe today, 64 will be unsafe tomorrow.

If you discovered your AES key generator only created 127 bit keys, would you correct the mistake moving forward? Or go back and immediately burn everything with the old key? The difference between 2^127 and 2^128 is much, much more than 9 quintillion.
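
For comparison, the same back-of-the-envelope math for the 127- vs 128-bit key case (a rough Python sketch):

    >>> 2**128 - 2**127 == 2**127
    True
    >>> (2**128 - 2**127) // 2**63   # how many 9-quintillions fit in that gap
    18446744073709551616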


Moreover, a biclique attack against AES already exists: by saving some meet-in-the-middle computations, it reduces full 10-round, 128-bit AES to "just" 126 bits of security (25% of the original 2^128 work factor). Is it a clever attack? Yes. Does it mean the security of AES has been reduced to 25% of the original security level? No. Does it practically matter? No. This is exactly why 128-bit security is seen as a minimum standard in cryptography - it provides a more-than-adequate security margin that renders all minor speedups in cryptanalysis irrelevant.
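
To make the "25%" concrete (a quick Python check; the work drops to a quarter, but the remaining work is still astronomically large):

    >>> 2**126 / 2**128
    0.25
    >>> 2**126
    85070591730234615865843651857942052864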

If a 64-bit random serial number already provides an adequate security margin, it should follow that no action is needed for the existing 63-bit certificates. But it seems the choice of 64 bits here was arbitrary, without good justification...


> Does it mean the security of AES has been reduced to 25% of the original security level? No.

I'm curious why that's the case. A plain reading of reducing the security level from 128 to 126 bits would seem to imply the answer is yes?


Because going from "unbreakable in 12 billion years" to "unbreakable in 3 billion years" isn't a practical reduction in security


But that’s still 25% of the original security...

I get that it’s meaningless - 4x effectively 0 is still effectively 0 - but denying the math doesn’t really help anything.


I agree.

The problem here is my choice of an ambiguous word, "security". Formally speaking, the "security level" or "security claim" of a cipher is defined by the computational complexity (time/memory) of breaking it, often expressed as a number of bits, so the biclique attack did indeed reduce the "security" of AES to 25% of its original claim. "Security" in a broader sense can be roughly understood as "how well a system is practically protected, under a specific threat model", and at that level an underlying detail such as this minor reduction in a cipher's security claim hardly matters.

I should have edited my comment to use a better word, but at this point it's permanent.


The “security” of an algorithm is not defined as the duration of time required by a computer to brute force it. Much more important is how safe it is against other known or anticipated attacks.


Brute force attacks are now 4x as effective as they were once thought to be, but they are not the limiting factor for AES' security, even at 126 bits. The most likely way for AES to be broken would be a new algorithmic innovation that works against any key length, or a new kind of computer, or an implementation flaw, or..., and those things are not 4x as likely as they were.


Instead of measuring "How many years does it take for me to crack this?", measure "How many actors would be able to crack this?" It turns out that if you can crack 126, you can crack 128, so the pool of perpetrators to fear remains the same.


It seems unlikely that there was a process that determined that 2^63 was an insufficient number of outputs but 2^64 was just right. The choice was somewhat arbitrary in the first place.


It's 50/50.

Either you crack it or you don't.


Except that's not how probability works.


By that logic lottery tickets would be 50/50 as well; either you win or you don't. That's not how it works


"50% easier than theoretically impossible" means it's now 50% possible, doesn't it?


Nope, the chance of success for each attempt went from 1 in 18,000,000,000,000,000,000 to 2 in 18,000,000,000,000,000,000.


that's from theoretically possible to theoretically possible.


Which is grammatically consistent with the assertion that fundamentally nothing has changed: certificates with a 63-bit serial number are unlikely to be compromised and certificates with a valid 64-bit serial number are also unlikely to be compromised.


Not 1 in 18 quintillion to 1 in 9 quintillion? I think you've got your binary math wrong.


2/18000000000000000000 == 1/9000000000000000000


2/18 and 1/9 are equal.


No more than "half of infinity" is half finite.


Infinity and "practically infinity" aren't the same thing though. Half of "practically infinity" may end up being practical.


Yes, the previous value was not infinity. It was impractical to solve in a human lifetime, but if they keep trimming off a few bits it very quickly becomes practical. If it were actually "infinity", then dividing it by any finite number would still result in infinity, which is not the case here.


Right, which is why the specific claim here is that 63 is not a problem, not that smaller numbers in general are not a problem.

A better way to put this: instead of saying "it reduces the search space by 9 quintillion," say "it reduces the search space by 50%." Sure, that's a lot, but not nearly as much as trimming 8 bits and saying "it reduces the search space by 99.6%."
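
A rough Python illustration of how much of the space each trimmed bit removes:

    >>> 1 - 2**63 / 2**64    # trim 1 bit
    0.5
    >>> 1 - 2**56 / 2**64    # trim 8 bits
    0.99609375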


> trimming off a few bits

i.e., reducing it by close to practically infinity?


There should be a very large gap between "theoretically impossible" and "practical". If cutting the search space in half gets you from one to the other, there's probably been an error in definition.


Who knows what the future will bring?

A $32 million (1985 dollars) Cray-2 supercomputer could do 1.9 GFLOPS.

You can now get over 50x that performance for less than a grand in a device that fits in your pocket. I bet those engineers didn't expect that in half a lifetime.


That's rather beside the point. If 63 bits is insecure, then 64 bits is also insecure. If I can brute force 63 bits in a week, I can brute force 64 bits in 2 weeks. If we are worried that 63 bits is a security issue, then the solution isn't increasing to 64 bits, it's increasing to 96 bits, or 128 bits.
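
The scaling argument, sketched in Python (hypothetical numbers, taking the "63 bits in a week" above as the baseline):

    >>> 2**(64 - 63)    # 64 bits: 2x the work, i.e. two weeks
    2
    >>> 2**(96 - 63)    # 96 bits: 2^33 times the work
    8589934592
    >>> 2**(128 - 63)   # 128 bits: 2^65 times the work
    36893488147419103232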


Moore's law was described in 1965, and the experimental evidence held up for well over the next two decades. If you handwave it as "everything gets 2x better every 1.5 years," we'd expect a factor of 2^(30 / 1.5) ≈ 1 million by 2015, so having a factor of 100,000x in cost and having it fit in your pocket wasn't actually unexpected.
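
The factor in question, as a one-line Python check (doubling every 1.5 years over the 30 years from 1985 to 2015):

    >>> 2 ** (30 / 1.5)
    1048576.0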

Certainly any cryptosystems designed in 1985 that wanted to encrypt data until today should have taken the most aggressive form of Moore's Law into account.


I’ll worry about that next time I issue a 30 year certificate.


Nassim Taleb would probably disagree. ;)



