> how to scale the idea of "expensive to break" into even the near future
This part isn't actually that hard. Suppose your computation costs $1M now, and Moore's Law continues doubling computation per watt every 18 months for the foreseeable future.
Then in 15 years, the same computation will cost 1/1024 as much as it does today. So you're paying about $1000 per message for 15-year-old data.
In 30 years, you'll be paying about $1 per message for 30-year-old data.
Is that good enough? It totally depends. For most messages, it's almost certainly fine, because the value of the message decays over time. For other messages that hold their value, you need to set the initial cost higher. The formula above shows how you could do that to achieve a certain cost a certain number of years in the future.
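Here's a minimal sketch of that arithmetic (function names are mine; the only assumption is the 18-month doubling from above):

```python
def cost_after(initial_cost_usd, years, doubling_period_years=1.5):
    """Cost of repeating the same computation after `years` of Moore's Law."""
    return initial_cost_usd / 2 ** (years / doubling_period_years)

def initial_cost_for(target_cost_usd, years, doubling_period_years=1.5):
    """Initial cost to set so breaking still costs `target_cost_usd` after `years`."""
    return target_cost_usd * 2 ** (years / doubling_period_years)

print(cost_after(1_000_000, 15))    # ~977     -> roughly $1000 per message
print(cost_after(1_000_000, 30))    # ~0.95    -> roughly $1 per message
print(initial_cost_for(1_000, 30))  # ~1.05e9  -> ~$1B up front to stay at $1000 thirty years out
```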
As to the rest of your comment - this is not how we do science (or engineering). If we cried and gave up every time we encountered a hard problem, we'd still be living in caves, hitting each other with rocks.
You can't predict the future of computation. Look at the cost curve of Bitcoin mining. You may say you ("you", not "we") will learn from that, and you're suggesting it's OK to trust your assessment of encryption strength. Show me a multi-decade track record of your predictions holding up, when even cryptographers designing secure systems don't have such a track record. Your idea amounts to Russian roulette where the gun gets more bullets over time.
Yes and no. Can we predict it exactly? Of course not. But we can estimate the rough order of magnitude with high confidence. It's not hard to find charts showing transistor density over time, and Moore's Law has held up pretty well since the 1970s.
If anything, the two big changes on the horizon are (1) quantum computers and (2) the end of Moore's Law. (1) is really hard to predict, and it will make all our current techniques vulnerable anyway. When (2) finally happens, it will only make future predictions easier, not harder.
> Look at the cost curve of Bitcoin mining.
Sure. Mining was expensive, so people put in a lot of work to make it more efficient. Now it's more efficient, but the easy optimizations are gone. ASIC miners are something like 20000x more efficient than recent CPUs. Do you really think there's another 10000x hiding in there? 1000x? 100x?
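For a sense of scale (taking the 20000x figure above at face value; the rest is just arithmetic), that gain already represents about 14 doublings of efficiency, so another 100x or 10000x means finding many more doublings in a mostly mined-out design space:

```python
import math

print(math.log2(20_000))   # ~14.3 doublings already captured going from CPU to ASIC
print(math.log2(100))      # ~6.6 more doublings for another 100x
print(math.log2(10_000))   # ~13.3 more doublings for another 10000x
```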
For comparison, AsicBoost was (is?) regarded as this huge deal, and it only yields about a 37% improvement in mining efficiency.
> Show me a multi-decades track record ... when cryptographers designing secure systems don't have such a track record.
I know this is not quite what you asked for, but you don't have to look very far to find an encryption primitive that has "stood the test of time" cryptanalytically. It's called DES, and it's older than most people posting here. The only reason we don't use it now is because it uses tiny little keys that can be efficiently brute-forced.
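To make the key-size point concrete, here's a back-of-the-envelope sketch comparing DES's 56-bit keyspace with a modern 128-bit one (the 10^12 keys/second rate is a purely hypothetical assumption, not a claim about any particular hardware):

```python
keys_per_second = 1e12           # hypothetical brute-force rate, for illustration only
seconds_per_year = 3600 * 24 * 365

des_years = 2 ** 56 / keys_per_second / seconds_per_year
aes128_years = 2 ** 128 / keys_per_second / seconds_per_year

print(f"56-bit keyspace:  ~{des_years * 365 * 24:.0f} hours to exhaust")   # ~20 hours
print(f"128-bit keyspace: ~{aes128_years:.1e} years to exhaust")           # ~1.1e19 years
```

The cipher's design isn't the failure point there; the keyspace is.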
As you said yourself, that is not what I asked. It also doesn't address the inherent weakness of operating right at the edge of what encryption can withstand, when even the people who write secure key-length guidelines have to keep rethinking what counts as secure.