
I can't find mention of it online, but back in 2013 Lockheed Martin purchased a D-Wave machine because they wanted to use it for "AI", which it turned out meant software verification (of fighter jets?). I believe the idea was to search for the possibility of some kind of invalid program state in a large program, which IIRC they couldn't manage with standard solvers. But in that case the number of qubits in a D-Wave machine seems to me far too few for that to be possible, although I don't know the exact task.

If by "AI" you include operations research (as opposed to statistical machine learning), yes, adiabatic quantum annealing makes sense for certain optimisation problems which you can manage to naturally formulate as a QUBO problem. By 'naturally' I mean it won't blow up the number of variables/qubits, as otherwise a far simpler algorithm on classical computer would be more efficient. I know someone who published QUBO algorithms and ran them on a NASA D-Wave machine, while I myself was using a lot of annealing for optimisation, I didn't want to get involved in that field.

But if you want to do machine learning on a large amount of data using quantum annealing, no, that's a terrible match, because the number of physical qubits you need grows in proportion to the amount of data you want to feed in, while current annealers offer only a few thousand qubits.
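
To make that scaling concrete, here's a hypothetical sketch: even about the simplest ML-ish task, 2-way clustering, formulated as a QUBO (it reduces to weighted max-cut) needs one binary variable, i.e. one logical qubit, per data point, before any embedding overhead. The data and distance metric below are just illustrative:

  # 2-way clustering as a QUBO: x_i in {0, 1} is point i's cluster.
  # Pairs in the same cluster cost their distance d_ij; expanding the
  # same-cluster indicator (1 - x_i - x_j + 2*x_i*x_j) and dropping
  # the constant gives the Q below. Note: N data points -> N binary
  # variables, so the qubit count grows linearly with the data.
  import itertools
  import numpy as np

  rng = np.random.default_rng(0)
  points = rng.normal(size=(8, 2))      # N = 8 toy points in the plane
  N = len(points)
  Q = np.zeros((N, N))
  for i in range(N):
      for j in range(i + 1, N):
          d = np.linalg.norm(points[i] - points[j])
          Q[i, i] -= d
          Q[j, j] -= d
          Q[i, j] += 2 * d

  labels = min(itertools.product([0, 1], repeat=N),
               key=lambda x: np.array(x) @ Q @ np.array(x))
  print("cluster labels:", labels)      # already 8 qubits for 8 points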
