For advances in computing, "double the performance every 18 months" would seem like a good upper bound.
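As a back-of-the-envelope sketch (assuming the 18-month doubling as the upper bound, and that brute-force cost tracks raw performance), attacker capability grows by about one bit of work per doubling period, so a key's effective margin erodes at roughly 12/18 ≈ 0.67 bits per year:

  # Rough sketch: remaining brute-force margin if attacker compute
  # doubles every 18 months (the upper bound assumed above).
  def effective_bits(initial_bits, years, doubling_months=18.0):
      bits_lost = years * 12.0 / doubling_months  # one bit per doubling
      return initial_bits - bits_lost

  for key_bits in (80, 112, 128):
      print(f"{key_bits}-bit security after 30 years: "
            f"~{effective_bits(key_bits, 30):.0f} bits")

Under that assumption a 128-bit key still has roughly 108 bits of margin after 30 years, while an 80-bit key is down to about 60.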

Algorithm advances are harder to predict, but choosing something dependent on a provably hard mathematical problem would be one approach. Another would be a construction that stays secure as long as any one of a set of base encryption/hashing primitives remains unbroken (so an attacker has to break MD5, BLAKE, Whirlpool, and SHA-256 all at the same time to break your hash). One could probably draw some kind of 'survival' curve to predict how long a given algorithm will stand up before failure, and from that estimate how long a set of, say, 10 or 20 algorithms would survive.
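A minimal sketch of that kind of combiner, using Python's hashlib: the output is just the concatenation of several independent digests, so producing a collision on the combined hash requires a simultaneous collision in every underlying primitive. (Whirlpool is left out here because hashlib only exposes it when the local OpenSSL build provides it; the primitives below are always available.)

  import hashlib

  # Digests to concatenate; breaking the combiner means breaking them all.
  PRIMITIVES = ("md5", "sha256", "blake2b")

  def combined_digest(data: bytes) -> bytes:
      """Concatenation combiner: collision-resistant while any member is."""
      return b"".join(hashlib.new(name, data).digest() for name in PRIMITIVES)

  print(combined_digest(b"hello").hex())

The obvious trade-off is output size and speed: the digest grows linearly with the number of primitives, which is why such combiners are rarely used outside long-term archival settings.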




>For advances in computing, "double the performance every 18 months" would seem like a good upper bound.

But not a realistic one past the mid-'00s. Due to the end of Dennard scaling[1], that sort of exponential performance increase is no longer available. At this point a fundamental breakthrough (quantum computing, for example) would be required to get a large and cheap increase in computing capability. That puts the hardware side in more or less the same situation as the algorithmic side: a fundamental breakthrough is required. Which does not help with the uncertainty here; this stuff is more or less impossible to predict.

[1] https://www.extremetech.com/computing/116561-the-death-of-cp...



