
Can someone chime in with how quantum computers effectively at least double the output of traditional "classical" computers? I legitimately don't understand why it's not easier, and in some bottom-line sense cheaper, to just double up your (classical) computing power. What is it about QC that's so damn special when you could theoretically achieve the same maximal idealized output with a simple increase in classical inputs?

It seems with QC there's this collective magical thinking that it offers some magical free lunch that is inaccessible via more traditional means. I don't get what tradeoffs or optimizations are being made to achieve this; they seem non-existent...




>Can someone chime in with how quantum computers effectively at least double the output of traditional "classical" computers?

In general: quantum computers halve the search space, not the speed; for example, a quantum computer needs 2^128 operations to brute-force 2^256 keys [1]

In the case of non-quantum-resistant algorithms (for example, RSA and most other popular algorithms today): there are quantum algorithms that offer an exponential speedup, where even a slow (in terms of operations per second) quantum computer will easily outperform a classical computer

[1]: Grover's algorithm. I'm oversimplifying a bit.
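
To make that arithmetic concrete, here's a minimal Python sketch of the query counts (just exponent arithmetic, not a model of real hardware):

    # Expected brute-force attempts for an n-bit key:
    # classical ~2^(n-1) on average, Grover ~sqrt(2^n) = 2^(n/2) oracle calls.
    for n in (128, 256):
        classical = 2 ** (n - 1)
        grover = 2 ** (n // 2)
        print(f"{n}-bit key: classical ~{classical:.1e} vs Grover ~{grover:.1e}")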


Quibble: Half of 2^256 is 2^255. Grover’s algorithm “square-roots” the search space.


I'm speaking in very abstract terms as a non-expert, but the critical distinction in this case is that prime factorisation (the basic underpinning of RSA and similar encryption) has no known efficient classical algorithm (the best known classical algorithms are sub-exponential) but is polynomial-time for a quantum computer.

To achieve the same computing power, a classical computer would need to be exponentially more powerful; this is because (abstracting heavily) the quantum computer can explore all possible inputs to a function at once in superposition, where a classical computer must test them one at a time.
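
To give a feel for the gap, here's a rough sketch comparing the standard heuristic complexity of the best known classical factoring algorithm (GNFS) against a generic ~n^3 stand-in for Shor's cost; the numbers are illustrative orders of magnitude, not benchmarks:

    import math

    def gnfs_cost(bits):
        # Heuristic GNFS complexity L_N[1/3, (64/9)^(1/3)] with N ~ 2^bits.
        ln_n = bits * math.log(2)
        return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                        * math.log(ln_n) ** (2 / 3))

    def shor_cost(bits):
        # Shor's algorithm scales polynomially, roughly like n^3 operations.
        return bits ** 3

    for bits in (512, 1024, 2048):
        print(f"{bits}-bit modulus: "
              f"GNFS ~{gnfs_cost(bits):.1e}, Shor ~{shor_cost(bits):.1e}")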


Weren't some of these widespread conventions in some sense strategically designed or implemented in such a way as to ensure backdoors and/or contrived vulnerabilities? Something like purposefully smaller key sizes, or special "weaker" parameters than was practicable, or other trickery that always ostensibly has an economic or otherwise justifiable underpinning but predictably introduces unacceptable security compromises later.


Historically there have been restrictions on key size (and bad algorithms); to my knowledge, neither is currently the case: there is no known way to break a 2048-bit RSA key (although if there were, we probably wouldn't know about it)


The general-purpose quantum magic is Grover's algorithm: https://en.m.wikipedia.org/wiki/Grover%27s_algorithm

Grover's algorithm is a general solution to the search problem: given an arbitrary (computable) function f: X -> Y and a desired value y in the codomain, find a value x such that f(x) = y. For the sake of simplicity, assume that x is unique (although Grover's algorithm can be extended to drop this constraint).

Let N be the size of the domain X. On a classical computer, without any additional information, the optimal solution is to simply iterate through X trying inputs until you find the correct one. On average you will need to evaluate f on half the possible inputs, which is N/2. In the worst case, you need to try all N.

Using Grover's algorithm, you can solve the problem with only about sqrt(N) invocations of f, which is a quadratic speedup compared with classical computers.

Applying this to encryption keys: for a key length of n, we have 2^n possible keys. Classically speaking, you would need to try decrypting (2^n)/2 = 2^(n-1) times on average. But with Grover's algorithm, you can find the correct key after just sqrt(2^n) = 2^(n/2) decryptions, which is the classical worst case for a key of half the length. Note that this is still exponential in the size of the key, so it is not a fundamental game changer.
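
Here's a toy statevector simulation of the above (just the linear algebra, no quantum hardware; the 10-bit search space and marked index are arbitrary choices):

    import math
    import numpy as np

    n = 10                  # 10 "qubits" -> N = 1024 possible inputs
    N = 2 ** n
    target = 123            # the unique x with f(x) = y

    state = np.full(N, 1 / math.sqrt(N))            # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(N))  # ~(pi/4) * sqrt(N)

    for _ in range(iterations):
        state[target] *= -1               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state  # diffusion: reflect about the mean

    print(f"{iterations} iterations, P(target) = {state[target] ** 2:.3f}")

This finds the target with probability ~0.999 after 25 oracle calls, versus ~512 classical tries on average.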

This has nothing to do with prime factoring or the discrete log problem. Those have even better quantum algorithms (Shor's) that can solve them in polynomial time, which would be a major game changer if they ever become practical.
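
For flavor, the classical half of Shor's algorithm fits in a few lines; the period-finding loop below is a brute-force stand-in for the quantum subroutine, and N = 15, a = 7 are toy values:

    from math import gcd

    N, a = 15, 7
    assert gcd(a, N) == 1

    r = 1
    while pow(a, r, N) != 1:  # find the period of a^x mod N (the quantum step)
        r += 1

    # If r is even and a^(r/2) != -1 mod N, the factors fall out via gcd.
    # (For unlucky choices of a this fails, and you retry with another a.)
    x = pow(a, r // 2, N)
    print(f"period r = {r}, factors: {gcd(x - 1, N)} x {gcd(x + 1, N)} = {N}")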


Algorithms have been developed for quantum computers that could be potentially effective at breaking public key cryptosystems whose security relies on the difficulty of solving certain math problems on classical computers. In particular, the discrete logarithm problem and integer factorization of very large numbers: https://en.wikipedia.org/wiki/Shor%27s_algorithm.

There are no quantum computers large enough to even come close to attempting this, and there's a question about whether it will ever be physically possible to build a quantum computer large enough to attack something like a TLS key exchange in real time. Classical computers are as fast or faster at other tasks.


Is it a fair conjecture that, in some sense, there's an issue with cryptographers trying to push the field forward in terms of the efficacy and robustness of encryption, while governments continually work all manner of trickery to hamper these efforts in subtle and not-so-subtle but gag-order-enforced ways?

I feel like there's this constant ridiculous pushback on any digital product, protocol, or service being air-tight cryptographically and implementationally speaking, when they can basically already build air-tight cases via parallel construction with the help of the infinite resources available upon (often not even) receipt of a warrant?

It's very strange. The obsession is always on completely neutering/compromising the technology and never on actually doing the damn police work they are empowered to do, approaching cases laterally like they did before typewriters and telephones/wire-taps, rather than bending the provisions of the constitutions they swore to protect and enforce until they're a simulacrum of the original concept.

Like, it always comes across as they feel that their entire case is lost if they can only prove something 5 different ways instead of 6. It wouldn't be so problematic if humans weren't so human and law enforcement wasn't emphatically staffed by humans who are liable to abuse things to maximize their money, power, and prestige and have the absolute or qualified immunity to get away with it at least once regardless of how it damages the targets of their misconduct.


>Like, it always comes across as they feel that their entire case is lost if they can only prove something 5 different ways instead of 6.

There are a lot of crimes these days without a complainant. The sale and consumption of illegal drugs for instance. Surveillance can be very important in discovering the crime in the first place.

It's like asking someone to help you with your diet and then becoming upset when they invade your privacy by looking in your kitchen cupboards and fridge. We have in a sense brought this on ourselves by passing laws intended to protect us from ourselves.


Did anybody ask governments to “help with diet” though? It feels more like governments in their ever expanding power struggle decided diets needed to be fixed even though nobody complained about them and then decided that the ends justify any means.



