Microsoft’s Quantum Mechanics (technologyreview.com)
131 points by finisterre on Oct 10, 2014 | 37 comments



Some good news, the Majorana fermion has recently been independently observed.

http://www.sciencedaily.com/releases/2014/10/141002141757.ht...


How certain are the findings? It's so hard to tell with particle physics.


The Majorana discoveries are condensed matter physics, not particle physics.

In particular, when they say that they've discovered "a new particle" it's important to understand that this is not going to appear on the Standard Model of Particle Physics at any time soon (unless, say, neutrinos, which are already on that model, turn out to be Majoranas).

We've discovered excitations at the end of nanowires which appear to behave like Majoranas were supposed to, and we've worked out some theory which predicts that they were supposed to be there, and if they are there then they're "topologically" protected from a lot of the usual sources of noise, so we might be able to make long-lived qubits for a change. That's about what we know about Majorana fermions, minus some details.


They forgot the NSA under the Quantum Projects section.

http://www.wired.com/2014/03/quantum-crypto-google/


Only the lab in Silicon Valley was closed down. The main lab is in Redmond, and there are still labs in a variety of other locations:

http://research.microsoft.com/en-us/labs/default.aspx


FTA: "Freedman was 30 when he solved a version of one of the longest-standing problems in mathematics, the Poincaré conjecture."

Not true; he apparently made his contributions to the Poincaré conjecture in dimension 4. Also, the article's link from "Poincaré conjecture" to a Clay Institute webpage is broken.

Meh


This is what it says about him on the official list of Fields medalists: "Developed new methods for topological analysis of four-manifolds. One of his results is a proof of the four-dimensional Poincaré Conjecture".

The article's statement you quoted is accurate enough to not be "not true".


Ok, thanks for that. I think I'm done with Wikipedia.


Can anyone expound further on the practical applications of quantum computing? In my limited understanding, I think the following are definitely candidates (presented in no particular order), but I'm sure there are others:

1. Shor's algorithm could break all encryption schemes whose security rests on the hardness of integer factorization.

2. Quantum simulation could open new avenues of research into how our universe operates at the quantum level. This could lead to advancements in materials science, for example.

3. Quantum computing could open new avenues of research into the P versus NP problem.

4. Quantum computing could open the door to the possibility of instantaneous communication via an understanding of action at a distance / quantum entanglement.

Edit: Thanks for all of the great responses, clarifications and links to further reading.


3. P vs. NP is a problem phrased in the language of classical Turing machines. In that sense, quantum computing has no bearing on it. It is unknown whether NP problems can be solved efficiently on quantum computers.

4. Is a common misunderstanding about entanglement. Unfortunately, while entanglement does allow non-classical long-distance correlations, it does not allow for communication. Think of it this way: suppose we took two random number generators that start with the same seed, and thus produce the same stream of bits. Then you take yours over there while I keep mine over here. This doesn't help us communicate faster, because we don't get to pick the bits that come out of the generator. So too with quantum entanglement.
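That shared-seed analogy is easy to render literally in Python (the seed value here is arbitrary):

```python
import random

# Two generators initialized with the same seed...
alice = random.Random(42)
bob = random.Random(42)

# ...produce identical bit streams, no matter how far apart they run.
alice_bits = [alice.getrandbits(1) for _ in range(20)]
bob_bits = [bob.getrandbits(1) for _ in range(20)]
assert alice_bits == bob_bits

# The streams are perfectly correlated, but neither party chose the bits,
# so no information travels from one side to the other.
```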


Isn't one of the main ideas of entanglement/superposition that the qubits' values are not pre-determined? I.e. the two random generators don't share an actual seed in advance, but as soon as the first one is used, it generates a seed, and the second one will use that same seed from then on?

I still have some difficulty understanding how that doesn't rule out some undetectable determinism, and how/if it's possible to measure if the superposition has already collapsed (i.e. the seed has already been determined), something apparently useful in quantum encryption.


http://en.wikipedia.org/wiki/Bell%27s_theorem tells us that no matter how much classical information you have, you still can't get correlations that are as good as quantum entanglement.


Right, s/he's giving you a sense of "classical entanglement", but quantum entanglement can also do some interesting things which classical entanglement cannot. However, the scope of that is not as simple as "set those bits over there to an arbitrary state as seen from over here." In fact, if my bits over here are entangled with your bits over there, the entanglement manifests in spooky coincidences between our actions which you won't even notice unless you bring our records together and compare them. Individually, we each appear to be acting randomly; but when you bring the action lists back together, they match. Neither of us had the chance to predict or affect what the other one did, yet we agreed on what we did.

Here's a game for a three-person team: they are all cooperating, but the game is called "betrayal" because we will secretly force one of them to betray the other two and measure how gracefully they recover from it. We will put the team through many "tests", if they win all of them, they get lots of money; you don't get any money for being a traitor in any given test.

Here's how this goes. We put everybody in relativistically separated rooms; each room has a computer screen and two buttons labeled 0 and 1. Once they're all isolated we give the group a test. Then they can come back together and collaborate before the next test, as they see fit.

Some tests are "control" tests. We flash on the screen, simultaneously to all 3 of them, the command "make the sum of your numbers even." We collect the 1's and 0's together, add them together, and they pass the test if it's even.

Other tests, we randomly choose one as a traitor. We flash on the screen to the traitor the same "make the sum of your numbers even" prompt, but we flash to the other two, "traitor round! make the sum of your numbers odd" -- and they pass the test if, when we add together the three numbers they press, that sum is odd.

There's no classical probability distribution on the six random variables A_even, B_even, C_even, A_odd, B_odd, C_odd which satisfies all of those tests 100% of the time, so that A_even + B_even + C_even is even but A_odd + B_odd + C_even is odd and so forth. Just add all of the equations together; you'll get 2 * (A_e + B_e + C_e + A_o + B_o + C_o) on the left hand side, but (even + 3*odd) on the right hand side and thus even = odd, which is impossible. There is no classical 100%-solution.
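The add-all-the-equations argument above can also be checked by brute force over all 2^6 classical response tables (a quick sketch of mine, not part of the original comment):

```python
from itertools import product

# Each player pre-commits one response per prompt:
# bits = (A_even, B_even, C_even, A_odd, B_odd, C_odd)
solutions = [
    bits for bits in product((0, 1), repeat=6)
    if (bits[0] + bits[1] + bits[2]) % 2 == 0   # control test: sum even
    and (bits[0] + bits[4] + bits[5]) % 2 == 1  # A is the traitor: sum odd
    and (bits[3] + bits[1] + bits[5]) % 2 == 1  # B is the traitor: sum odd
    and (bits[3] + bits[4] + bits[2]) % 2 == 1  # C is the traitor: sum odd
]

# No assignment wins all four tests -- there is no classical 100% strategy.
assert solutions == []
```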

There is a quantum 100%-solution. If |+> is the state |0> + |1> and |−> is the state |0> − |1> then the (entangled, GHZ) state

    |+++> + |−−−> = |000> + |011> + |101> + |110>
guarantees that any measurement will have an even sum, while the state

    |+++> − |−−−> = |001> + |010> + |100> + |111>
guarantees that any measurement will have an odd sum. The two people who know there is a traitor in their midst can each apply the unitary transform which takes |+> to |+> and |−> to i |−> (a "controlled phase rotation" combined with some "Hadamard gates"), and the defining property of complex numbers, i^2 = -1, causes the total state to switch between those two states when any two people apply that transform.

So if they're playing with an entangled quantum state, two of them can make a local change to their own state which induces the right sort of change in the global state that, when we bring the data together and correlate, we find out that they can do something which classical observers cannot ever do: win a game with 100% probability. Each of them locally appears to be producing 0 or 1 with a 50/50 probability but globally when we compare their bits we find out that they could choose together whether that sum was odd or even in a crazy new way.
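For the skeptical, the state algebra above checks out numerically. Here's a small numpy sketch (variable names are mine) verifying both the parity structure of the two GHZ states and the sign flip produced by two players' local phase gates:

```python
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)    # |+> = |0> + |1>, normalized
minus = np.array([1, -1]) / np.sqrt(2)  # |-> = |0> - |1>, normalized

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

ghz_even = (kron3(plus, plus, plus) + kron3(minus, minus, minus)) / np.sqrt(2)
ghz_odd = (kron3(plus, plus, plus) - kron3(minus, minus, minus)) / np.sqrt(2)

# ghz_even is supported only on even-parity bitstrings: |000>, |011>, |101>, |110>
for idx in range(8):
    even_parity = bin(idx).count("1") % 2 == 0
    assert np.isclose(abs(ghz_even[idx]), 0.5 if even_parity else 0.0)

# The local gate taking |+> -> |+> and |-> -> i|->, written in the computational basis
U = np.outer(plus, plus) + 1j * np.outer(minus, minus)

# Two players apply U to their own qubits; i * i = -1 flips the relative sign,
# switching guaranteed-even measurement sums to guaranteed-odd ones.
after = kron3(U, U, np.eye(2)) @ ghz_even
assert np.allclose(after, ghz_odd)
```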


If we have quantum computers, then isn't P vs. NP the wrong question? With quantum computers, the more relevant problem to study should be BQP versus QMA (the quantum analogs of P and NP/MA, i.e., computable versus verifiable in polynomial time on a quantum computer). Are complexity theorists studying the wrong problem?


> Can anyone expound further on the practical applications of quantum computing?

My research is in molecular dynamics simulations and quantum chemistry. A lot of progress has been made in the field in the last few decades; however, there is a roadblock right now that no one really knows how to get around called the fermion sign problem. If we want to simulate atoms or molecules as accurately as we can, the number of resources we require scales exponentially with the number of particles. This is basically due to the Coulombic (electrostatic) interactions between the electrons in the atom. Accounting for these interactions gives many materials their interesting properties, but we've only been able to come up with approximations for solving these systems.

Quantum computing would allow these systems to be solved in polynomial time. The advantage would be enormous; in some cases, atomic simulation results would match experimental ones exactly†, and that is a very exciting notion.

(†glossing over a huge number of details here)


It seems clear that quantum computing would help with QM calculations (which I think you were mainly talking about), but it is less clear to me if it would help with things like MD (and perhaps it wouldn't matter if QM could be sped up enough).

Could quantum computing help with MD or other kinds of simulations (e.g. Monte Carlo) where we are essentially searching for a minimum? I have heard that quantum annealing, which some argue the first QC systems are really doing, would work for this. But it's all too confusing. It seems like half of what I read about quantum computing says "Yes!" to that, and half says "you don't understand".


It could help with MD. A lot of MD potentials are developed by kind of regressing results from QM calculations for different atomic configurations (e.g. some versions of ReaxFF are based off of DFT calculations).

I actually don't know a whole lot about how exactly QC speeds up computations for things like Quantum Monte Carlo, I just know that an algorithm exists for it (much like Shor's algorithm speeds up integer factorization, but the details are murky for me). I'll have to see if I can find which paper that is.



#4 shouldn't be possible according to the No-Communication Theorem.

http://en.wikipedia.org/wiki/No-communication_theorem


Quantum Machine Learning would be drastically sped up, as many linear operations could be done in log(N): http://arxiv.org/abs/1307.0411

A quantum machine learning computer could make scientific advances simply by processing large datasets at rates dramatically faster than humans and/or current machine learning systems.


This one is good too:

"Quantum algorithms for topological and geometric analysis of big data" http://arxiv.org/abs/1408.3106

It uses some of this "applied topology", which I only recently found out about. http://www.math.upenn.edu/~ghrist/notes.html



I think I have some facts twisted up. I thought Microsoft closed its R&D lab?

http://www.zdnet.com/microsoft-to-close-microsoft-research-l...

Do they have others?


Yes, there are MSR campuses around the world! Only the Silicon Valley satellite campus was closed. See http://research.microsoft.com/en-us/labs/ for the full list.


Yes, they only closed the Mountain View lab.


Does anyone know, or can anyone explain simplistically, why, if we can get supercomputer-like power from 100 qubits, we can't get desktop-computer-like power with 1 or 2 qubits? I assume that the computational capacity increases exponentially with the number of qubits, but how?


Well, we don't really know if it's exactly exponential; we also don't really know that a quantum computer is more powerful than a classical one (and we don't know whether P == NP, which is related).

That said, yes, the observed speedup on some [1] problems is exponential. It's easy to wave hands and say that when you place the qubits in a coherent state together, the operations you do on them apply to all of the values they can possibly carry, and those grow exponentially with the number of bits. But the long explanation makes things much more clear, and is worth reading if you want to really get it. Wikipedia is a good starting point [2][3].

[1] Not all by a wide margin. Quantum computers will be specialized machines for a long time, if not forever. But cryptography will take a huge hit.

[2] http://en.wikipedia.org/wiki/Quantum_computer

[3] http://en.wikipedia.org/wiki/Shor%27s_algorithm


A quantum computer isn't really much like a super-powered classical computer. There are certain specific things it's amazingly good at (most notably, factorizing large integers), some other things it's quite good at (e.g., some kinds of search operation), and lots of things for which no one knows how to make quantum computers any faster than classical ones.

So, if you have a quantum computer, you're probably going to want to use it for breaking RSA or similar cryptography by factorizing large numbers.

There are (ordinary, non-quantum) algorithms for factorizing large-ish numbers. They work pretty well for numbers of modest size. But they don't scale well as the numbers get very large. Something like exp((log n)^(1/3) (log log n)^(2/3)) to factorize a number near to n. Or, in terms of the number of bits, something like exp(b^(1/3) (log b)^(2/3)).
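To see how lopsided those growth rates are, here's a rough sketch (constants deliberately omitted, as in the scaling formulas above, so the numbers are only order-of-magnitude) comparing the classical sub-exponential cost to Shor's roughly b^3:

```python
import math

def classical_cost(b):
    # Rough GNFS-style scaling for a b-bit number (constants omitted)
    L = b * math.log(2)  # natural log of the number itself
    return math.exp(L ** (1 / 3) * math.log(L) ** (2 / 3))

def quantum_cost(b):
    # Shor's algorithm: roughly b^3 quantum gate operations
    return b ** 3

# Doubling the key size multiplies the quantum cost by just 8,
# but blows up the classical cost by over four orders of magnitude.
classical_ratio = classical_cost(2048) / classical_cost(1024)
quantum_ratio = quantum_cost(2048) / quantum_cost(1024)
assert quantum_ratio == 8
assert classical_ratio > 1e4
```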

So this works fine for numbers up to, let's say, 200 bits on a single machine or 1000 bits for a team with a substantial network. (Note: all numbers here are kinda made up but of the right order of magnitude.)

With a quantum computer, you can factorize a b-bit number with something like 2b qubits, b^3 quantum gates, and a runtime that scales more manageably with b.

So there's not much point in using a quantum computer until the number of qubits it has gets large enough to use for factorizing numbers bigger than you can do on your PC. Which means somewhere on the order of 100 qubits. As the number goes up, the advantage of the quantum computer over the classical computer increases. The transition from "slower than your PC" to "faster than any classical computer that will ever be built" takes place over a fairly small range of qubit-count.

With much smaller numbers of qubits, though, the quantum computer can maybe manage to find the factors of, say, 15. Which is, shall we say, quite a lot less than the computer on your desktop can do.


I'm confused, you say you could factor a b-bit number with "something like 2b qubits", but then go on to say you'd only need 100 qubits to factor bigger numbers than a PC.

Factoring a 50-bit number is not that big a deal. I used my Python command-line calculator program (all the hard stuff is done by mpmath, pyprimes and other open-source code I didn't write) to factor a 50-bit number just now:

c:\>rpn -t 2 50 7 + factor

[ 37, 30429727211963 ]

29.569 seconds

I'm sure Mathematica is much faster.

It seems to me several hundred qubits would be necessary to outperform classical algorithms on commodity hardware.


Why on earth would it take 30 seconds to factor a random 50-bit number? You just check all numbers between 2 and 2^25 (about 32 million). This should take less than a second, maybe two, in Python.

The parent poster likely was thinking 100 bit number (200 qubits), which, while technically tractable, is quite a bit of computing power (assuming that we don't end up with huge coefficients with quantum computers).
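For reference, a naive trial-division factorizer along those lines (my sketch, not the commenter's rpn tool) handles the same 2^50 + 7 in a couple of seconds of pure Python:

```python
def factorize(n):
    """Naive trial division -- fine for numbers up to ~50 bits."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1 if d == 2 else 2  # after 2, test odd candidates only
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

# Same number as in the rpn session above
assert factorize(2 ** 50 + 7) == [37, 30429727211963]
```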


As I said, all the concrete numbers in what I wrote are crude order-of-magnitude approximations. The original question was whether a quantum computer with just a few qubits would be comparable in power to a conventional desktop computer; the answer is a clear no whether the point at which quantum computers become genuinely useful is at (say) 100 or 1000 qubits.


With 1 or 2 qubits you wouldn't even be able to compute 2 + 2 because storing 4 in binary requires 3 bits. 100 qubits is also not particularly useful, not because they aren't fast but because most of the interesting stuff simply doesn't fit. RSA keys are thousands of bits long, for example.

You need enough qubits to store the problem, and the working state of the algorithm you're using, and the overhead introduced by error correcting codes, and the overhead from your gates not being ideal, and so on. Austin Fowler gave a talk a few months ago where he goes through the overheads in a particular problem (finding molecular ground state energies) and ends up optimistically requiring tens of millions of qubits [1].

1: https://www.youtube.com/watch?v=XbFoXT73xVQ&feature=youtu.be...


So, the deal is that quantum computers don't do everything for you. They can do some things faster than any classical algorithm, but they're not known to, say, do NP-complete problems in polynomial time. (Friendly reminder: we don't even know whether classical computers can do that or not.)

Supposing that you've got, say, three bits which can each be 0 or 1, a classical computer can be in one of eight states 000, 001, ... 110, 111, and we can easily store one of those numbers in memory. For a quantum computer with three qubits, we have to instead store 16 doubles in memory, because it is allowed to be in any state:

    |S> = a0 |000> + a1 |001> + a2 |010> + ... + a7 |111>
such that the complex numbers a0, a1, ... a7 square-sum to 1. In other words, for n bits you often need to store 2^n numbers.

There are rough-and-ready ways to get around a bit of this in simulation with a sort of 'lossy compression' approach. It goes something like this: you form the "state matrix" (basically, take your vector and form an outer product with its conjugate transpose), diagonalize it, and keep only the eigenvectors for the largest eigenvalues. If your system is simple enough you might be able to reduce the bulk of the calculation to "what's happening to these eigenvalues?" and leave out any explicit further representation of the vectors themselves.

Now, here's where this gets interesting: the world is quantum, especially when we're talking about molecular chemistry. So, we have these chemicals which obey quantum mechanics and they have this huge quantum state which is cumbersome to compute with. But with a quantum computer, we can just program those rules and that behavior into the computer and see what it does with the physics that our classical computers can only barely simulate. So that's why these quantum computers, when they're doing what they're good at, beat out our supercomputers.

But why can't they beat out our supercomputers unilaterally? Because that's just not what they do, and for the most part these 2^n complex numbers which tell us the quantum state are not directly observable.

Let me put it like this: at the end of the day, when you measure the |S> above in the computational basis, you will get a classical 8 bits of information and the quantum state will be destroyed: you measured, you found |010>, that is the state of the system now. Prepare millions of these and then you will find some statistics for |000> and |001> and those statistics will give you a few decimal places for |a0|, |a1|, |a2| respectively, with no information about the phase factors of the complex numbers involved. It's not terribly enlightening of itself.
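The "prepare millions and collect statistics" procedure can be sketched in numpy: sampling measurement outcomes recovers the amplitude magnitudes, but says nothing about their phases. (The state, seed, and sample count here are arbitrary choices of mine.)

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 3-qubit state: 8 complex amplitudes, square-summing to 1
amps = rng.normal(size=8) + 1j * rng.normal(size=8)
amps /= np.linalg.norm(amps)

# Each computational-basis measurement yields bitstring k with prob |a_k|^2
probs = np.abs(amps) ** 2
samples = rng.choice(8, size=200_000, p=probs)

# Frequencies estimate |a_k| to a couple of decimal places...
est = np.sqrt(np.bincount(samples, minlength=8) / len(samples))
assert np.allclose(est, np.abs(amps), atol=0.01)

# ...but the phases of the a_k are invisible to this procedure entirely.
```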

But then there are some generic problems, like "what is the only n-bit input to this set of logic gates which makes the result bit come out 1?" where you can do better. The answer to that is only n bits, but our only classical algorithm searches through all 2^n settings of those bits to find the answer. It turns out, if you can make your logic gates act their normal way on a quantum state, then you can pass one of these a-little-of-everything superpositions as input, then do a little magic to the input bits based on the resulting output bit, some 'yadda yadda' about a quantum fourier transform, and poof: when you measure the bits you will, with high probability, get the bits which set the logic gates to 1. That's with only the one query actually sent to the quantum versions of those logic gates. You peek inside it once and the resulting state knows enough about how it was created that you can extract out something interesting about the function.
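The textbook instance of this one-query trick is the Bernstein-Vazirani algorithm (my example; not necessarily the exact problem the paragraph above has in mind). A plain statevector simulation shows a single oracle query pinning down an n-bit hidden string:

```python
import numpy as np

n = 4
s = 0b1011  # hidden bitstring; classically this needs n oracle queries
N = 2 ** n

# Start in the uniform superposition (H^n applied to |0...0>)
state = np.full(N, 1 / np.sqrt(N))

# ONE oracle query: phase-flip every amplitude where s.x (mod 2) is odd
for x in range(N):
    if bin(s & x).count("1") % 2:
        state[x] *= -1

# Apply H^n again (the Walsh-Hadamard transform)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)
state = Hn @ state

# All the amplitude concentrates on |s>: measuring returns s with certainty
assert np.argmax(np.abs(state)) == s
assert np.isclose(abs(state[s]), 1.0)
```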

So for some algorithms, like factoring integers (a superpolynomial speedup over the best known classical method) and searching for special inputs to a process (a quadratic one), you really do get dramatic speedups from quantum computers. However, those algorithms are really special-purpose stuff; quantum computers are not known to make arbitrary computations faster. They will make some specific computations, like chemistry calculations, faster.

With four qubits we have been able to factor the 4-bit number 15 into 3 * 5. That's the biggest that I know we've built, although some companies (like D-Wave) claim that they have built bigger systems under a different model of quantum computation (and hence can't do the factoring algorithm straightforwardly). The problem right now is that you need these big experimental apparatuses to cool, manipulate, and read out values from these quantum computers, and while you're doing those tests the clock is ticking -- states don't stay 'coherent' for more than a fraction of a second, after which they act pretty classical. There's just not much we can do with these systems until we get them to be always-online and stuff.

Finally, 1 and 2 qubits don't really show off all of the diversity of quantum mechanics; for one thought-experiment that I like to use to show people the strangeness of quantum mechanics ("Betrayal"), I determined that you needed three people playing the game together, no less, to really get the paradoxical result.


Windows will only take 2 minutes to boot on a quantum computer then.


When was the last time you used a Windows PC? Windows 8 boots and shows you the login screen in under 10 seconds.


Why would you even do it? My Windows (same kernel) phone runs for months without reboots.


Compared to Linux it boots much faster and is about as stable, too.



