You only need to generate your key pair once, and I just generated a 16kbit key in 10 minutes. Why is this impractical? The cost of encrypt/decrypt is modular exponentiation, which isn't that bad. My point, though, is that as long as there is some number of bits at which RSA is both secure and practical to use, RSA isn't dead. A lot of the discussion seems to imply RSA will be dead soon as a result of recent work. Maybe it will, maybe it won't, but the claims seem a bit overblown. I agree with the idea that we should, within reason, design for the possibility of RSA falling (or ECC falling), but RSA could have fallen 5 years ago and it could fall tomorrow (and so could ECC).
Well, now you're changing the goalposts. You originally said 64kbit, not 16kbit. Key generation is roughly O(n^4) in the bit size. If you quadruple the key length, the time goes up by a factor of 4^4, or 256. Key generation time is not irrelevant -- waiting ten hours before your SSH server can come online is no one's idea of a good user experience.
At the 64kbit level (again, BIG BIG difference from 16kbit), even the cost of encryption/decryption alone is easily going to cripple a high-traffic server.
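To put rough numbers on both cost claims, here is a back-of-envelope sketch. The exponents are assumptions (key generation scaling roughly O(n^4) as stated above, and private-key operations roughly O(n^3) with schoolbook multiplication), not measured figures:

```python
# Back-of-envelope cost scaling. Exponents are assumptions: key generation
# ~O(n^4) in the modulus bit length, private-key ops ~O(n^3) (schoolbook
# multiplication); real implementations (Karatsuba, etc.) do somewhat better.
def slowdown(old_bits, new_bits, exponent):
    return (new_bits / old_bits) ** exponent

print(slowdown(16_384, 65_536, exponent=4))  # keygen: 4x the bits -> 4**4 = 256x
print(slowdown(2_048, 65_536, exponent=3))   # decrypt vs a 2kbit key: 32**3 = 32768x
```

So even granting the 10-minute figure for a 16kbit key, the same scaling puts 64kbit key generation in the tens-of-hours range, and makes every private-key operation tens of thousands of times slower than with today's 2kbit keys.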
I apologize for changing the goalposts (slightly). I was using ssh-keygen, and 16kbit was the maximum RSA key length it would accept. Running the numbers, if we assume an L(1/4) attack, then 4kbit keys would be only slightly stronger than 1kbit keys are today, and 8kbit keys only very slightly stronger than 2kbit keys are today. Please verify this; it's possible I made a mistake. Under L(1/4), an 8kbit key would be ~5*10^10 times harder to factor than a 1024-bit key is today. I think I now have the goalpost where it should be for a meaningful discussion? Another interesting question is why Google chose to move to 2kbit rather than to ECC.
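For anyone who wants to check the arithmetic, here is a sketch of the kind of calculation I mean. The constant c = 1 is a pure placeholder (no known attack fixes it for L(1/4)), which is exactly why these numbers should be taken loosely:

```python
import math

def L(n_bits, alpha, c=1.0):
    """L-notation: L_n[alpha, c] = exp(c * (ln n)**alpha * (ln ln n)**(1 - alpha)).
    c is a placeholder; no concrete constant is known for a hypothetical
    L(1/4) factoring attack, so treat the outputs as rough intuition only."""
    ln_n = n_bits * math.log(2)  # ln(2**n_bits)
    return math.exp(c * ln_n**alpha * math.log(ln_n)**(1 - alpha))

# How much harder is 8kbit than 1kbit under a hypothetical L(1/4) attack?
# The answer moves by orders of magnitude as the unknown constant c moves.
for c in (0.5, 1.0, 1.5):
    print(c, L(8192, 0.25, c) / L(1024, 0.25, c))
```

With c = 1 the ratio comes out in the billions, the same ballpark as the ~5*10^10 figure; but varying c moves it by orders of magnitude, so the choice of constant dominates the answer.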
Attaching specific numbers to an expression like L(1/4) is meaningless, because L(1/4) is an asymptotic expression. Asymptotic expressions contain implied constants, whose values are not explicit. There's also no particular reason to expect the L(1/4) attack to apply to RSA. For one, I explained in another reply that we currently do not know any direct relationship between RSA and discrete log. In addition, Joux's latest result is not even L(1/4), it's L(epsilon) for any fixed epsilon > 0 (and in this context, the implied constant cannot be ignored, otherwise you get nonsense values).
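To spell out the "implied constants" point, the L-notation is usually defined as

```latex
L_n[\alpha, c] = \exp\!\bigl( (c + o(1)) \, (\ln n)^{\alpha} \, (\ln \ln n)^{1-\alpha} \bigr)
```

so "L(1/4)" pins down only the exponent alpha = 1/4. The constant c and the o(1) term are left unspecified, and at concrete key sizes the o(1) term need not be small, so plugging a key length into L(1/4) does not yield a meaningful number.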
You're right, and I understand that. I was assuming some hypothetical attack equivalent to GNFS. That assumption has no theoretical basis; it's just a mechanism to question the ideas and build some intuition. I'm using the same line of argument as the original presentation, where they show a progression and draw conclusions about what it means for RSA keys. I think that line of reasoning is odd, but even granting it, the results don't make sense to me.