Probably the most important application: quantum simulation. Solving any but the simplest quantum mechanics problems takes more computation than we can muster, but a quantum computer could do it easily. (If I recall correctly, Feynman wrote about this idea.)
"According to IBM, 250 qubits would be able to store “more bits of information than there are atoms in the universe.” This in itself is truly awesome — but then when you factor in that a quantum computer could perform logic on all of that data, in parallel, instantaneously"
If this is correct, does this really imply that there are more dimensions than the 4 we are aware of?
Shouldn't the calculations then be visible in those extra dimensions as well? (Thinking that this could be used to detect extra-dimensional life and perhaps communicate with them.)
Or does it simply mean that matter has more states than there are particles in the universe? :-/
A quantum computer takes advantage of quantum indeterminism. In a classical computer, a bit is 0 or 1. In a quantum computer, a bit can be indeterminate.
If you have multiple bits, a classical computer is in exactly one combination at a time. A quantum computer can be indeterminate between a set of states chosen from the set of all possible patterns. Thus 250 qubits can be indeterminate between some subset of 2^250 possible states.
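As a rough sketch of the bookkeeping involved (just counting, no actual quantum mechanics): describing a general n-qubit state classically takes 2^n complex amplitudes, one per bit pattern, which is why the numbers blow up so fast.

```python
# Rough sketch: the classical description of a general n-qubit state
# is a vector of 2**n complex amplitudes, one per basis bit pattern.

def amplitudes_needed(n_qubits: int) -> int:
    """Number of complex amplitudes in a general n-qubit state."""
    return 2 ** n_qubits

print(amplitudes_needed(3))    # 8 patterns: 000, 001, ..., 111
print(amplitudes_needed(250))  # 2**250, roughly 1.8 * 10**75
```

The exponent is the whole story: each added qubit doubles the number of amplitudes a classical description would need to track.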
The downside is that when we set the computer's input, we start in one state, and when we measure, we see one state. And the computation has to obey the rules of quantum mechanics. For instance, all logic has to be reversible: there is no "if A or B then C," because if you wind up at C you can't get back to your original state.
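The reversibility constraint can be illustrated with a toy comparison (plain Python standing in for gate logic): classical OR destroys information about its inputs, while a reversible gate like CNOT is a bijection you can always undo.

```python
from itertools import product

# Irreversible: OR collapses (0,1), (1,0) and (1,1) onto the same output 1,
# so the inputs can't be recovered from the result.
def or_gate(a: int, b: int) -> int:
    return a | b

collisions = [pair for pair in product([0, 1], repeat=2) if or_gate(*pair) == 1]
print(collisions)  # [(0, 1), (1, 0), (1, 1)] -- three inputs, one output

# Reversible: CNOT flips the second bit exactly when the first is 1.
# It permutes the four two-bit states, and it is its own inverse.
def cnot(a: int, b: int) -> tuple:
    return a, b ^ a

states = list(product([0, 1], repeat=2))
assert sorted(cnot(*s) for s in states) == states  # bijection: nothing lost
assert all(cnot(*cnot(*s)) == s for s in states)   # applying twice undoes it
```

This is why quantum circuits are built from gates like CNOT and Toffoli rather than plain AND/OR: every step has to be undoable.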
So for practical purposes a 250 qubit computer can start with 250 bits of data, ends up with 250 bits of data, and in the middle does an unbelievably parallel computation.
Not quite. The point is that the qubits also hold the information of all operations applied up until the point the sequence is completed. The limit on that is the time to decoherence, not the number of qubits (although the more you have, the harder it gets to stay coherent).
That does mean that the result you read out of the calculation can be no larger than the classical equivalent, but the number of inputs feeding into that result, and the resulting enormous explosion of processing you can do on it (each operation yielding potentially exponential growth in evaluated possibilities), depends on other factors.
The original IBM press release says "a single 250-qubit state contains more bits of information than there are atoms in the universe." Then the author tries to rewrite the press release, but what probably seem like small changes to him, like "contains" to "stores," turn the sentence from an essentially correct simplification into something utterly wrong. Maybe blatant press-release rewrites are a necessary evil in science journalism, but it's depressing to me that most publications don't even bother with a fact check.
Anyway, that's not quite true: you can get there by giving up the principle of locality or counterfactual definiteness. And in theory nothing requires the principle of locality to be true.
Because if you need to "store" the information, as the author stated, you would need some kind of mechanism for doing so. It would be impossible for such a mechanism to be smaller than an atom, otherwise the representation would have to be a superset of the thing represented (i.e., you can't represent more atoms than the universe contains if the cost of representing one atom is more than one atom).
In reality, what the author meant was "contains": you have a bunch of different bits with multiple simultaneous values (I believe these are distinct values along many different dimensions, but please correct me if I'm wrong), and the number of possible combinations of those values (since the combinations all exist "at once") can exceed the number of atoms in the universe. Hope that makes sense.
Exactly. It's not like Vinton Cerf and Bob Kahn et al. could foresee what they were enabling when they built the Internet. Bring on quantum computing, and let's see what awesome stuff the rest of us can build on top of it.
I'm pretty sure that once quantum computing takes off, engineers will be able to safely port an already existing encryption protocol onto it. It's not a major concern to me.
The point being made is that public-key encryption techniques assume that it is difficult to factor large numbers. Even with modern computers, this takes a very long time. But this wouldn't be true with quantum computers.
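A toy illustration with deliberately tiny primes (real RSA uses moduli of ~2048 bits; nothing here is remotely secure): the private key follows directly from the factors of the public modulus, which is exactly the step a quantum computer running Shor's algorithm would make fast.

```python
# Toy RSA. The private exponent d can be derived by anyone who can
# factor n = p * q, so fast factoring breaks the scheme outright.

p, q = 61, 53                    # secret primes
n = p * q                        # public modulus (3233)
e = 17                           # public exponent
phi = (p - 1) * (q - 1)          # computable only if you know the factors
d = pow(e, -1, phi)              # private exponent (Python 3.8+ modular inverse)

msg = 1234
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who factors n recovers p and q, hence phi, hence d:
d_attacker = pow(e, -1, (61 - 1) * (53 - 1))
assert d_attacker == d
```

Classically, factoring n is what takes super-polynomial time; with the factors in hand, everything else above is cheap arithmetic.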
The most popular and well-studied public-key encryption systems use factoring, but there are lots of other proposed schemes. The ability to quickly factor products of large primes would cause a temporary upset while alternative systems are brought in as replacements.
It's not quite as easy as just swapping out schemes. As I understand it, a practical quantum computer would break all proposed public-key encryption methods except those based on lattice problems. Lattice-based cryptography is still in the preliminary stages of research and is at the moment extremely slow compared to the currently popular encryption methods.
One of the things quantum computing might be applicable to is protein folding. If we can compute exactly how a protein would/should fold in linear time... well, that could potentially lead to a cure for EVERYTHING.
For most problems, quantum computing is not faster. But when it is (theoretically) faster, it is much faster. Perhaps we'll see quantum coprocessors one day.
We have already seen, with the slow adoption of multiprocessors and 64-bit computers, that being able to run current software is required for any successful PC architecture. Even the devices currently replacing PCs for specific tasks still run OSs and software much closer to what we ran on our Apple IIs decades ago than to whatever would orchestrate a purely quantum personal computer.
Devices like the one IBM demonstrated will probably start as peripherals to more familiar computer architectures, much as our PCs have GPUs that vastly outperform the processors that control them yet are no more than assistants in the general operation of the computer.