> And that's ignoring light-speed communication delays between parts of the computer, which would dominate.
Light speed delays are not relevant to a highly concurrent problem. They would be an issue for a general purpose computer that size running a sequential program.
True if it's not a quantum computer. In a quantum computer, entanglement effects propagate no faster than the speed of light, so you can't just treat its parts as independent trials. Of course, with lots of separate quantum computers you'd get some advantage, but you can't run Grover's across an ensemble (you can run it on each individual QC over its own partition), so I'm not sure how the overall complexity would work out.
Either way it's a bit beyond what's economically possible for any human organization right now. And I implicitly assumed the computation is fully reversible and therefore took negligible energy.
> “We extend the analysis to the case of a society of k quantum searches acting in parallel”.
Disclaimer: I know absolutely nothing about the topic, but the first link I googled seems to justify my intuition that this decryption could be partitioned so that many quantum computers could run in parallel (thus avoiding the limit on the speed of information transfer you are hypothesising).
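For intuition (a rough sketch, not from the linked paper): Grover's algorithm needs about (π/4)·√N oracle queries to find one marked item among N. If you split the search space evenly across k independent quantum computers and run Grover on each partition of size N/k, each machine needs only ~(π/4)·√(N/k) queries, i.e. a √k wall-clock speedup rather than the full k-fold speedup classical partitioning gives. The function names below are just for illustration:

```python
import math

def grover_iterations(n_items: int) -> int:
    """Approximate optimal Grover iteration count for a single
    marked item in a space of n_items: floor((pi/4) * sqrt(N))."""
    return math.floor((math.pi / 4) * math.sqrt(n_items))

def parallel_grover_iterations(n_items: int, k: int) -> int:
    """Iterations per machine when the space is partitioned evenly
    across k quantum computers, each running Grover independently."""
    return grover_iterations(math.ceil(n_items / k))

# Splitting a 2^40 space over k machines: time falls as sqrt(k).
N = 2 ** 40
for k in (1, 4, 16, 64):
    print(k, parallel_grover_iterations(N, k))
```

So 64 machines cut the time by only 8x, which is why parallelism helps here but doesn't change the asymptotic picture the way it would for a classically partitionable problem.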