This article is pretty inaccurate in a number of ways.
The whole D-Wave thing is mostly irrelevant for cryptography, which the article doesn't mention. Even if the D-Wave devices turn out to be useful for some special algorithms, they can't run Shor's algorithm, which is what endangers crypto.
Also calling 1024 bit a "really long key" is a bit strange. They are already endangered by classical computers, no need for quantum computing here.
That said: The call for postquantum crypto is right.
The definitive test for quantum computer arrival is the claiming of Satoshi's 1 million bitcoins. Those coins were mined and paid to the pubkey, not the pubkey-hash. Thus they are vulnerable to a quantum computer.
I don't know about that; the moment those coins are suspected to be compromised, you'd get a near-immediate run on BTC. So yes, there are a lot of Bitcoins available, but stealing them would undermine the entire system their value is based on, rendering them worthless.
They wouldn't be rendered worthless, but the security would take a hit for sure. Any coins mined (and not moved) before 2012 would be at risk, in addition to any address that has been reused since then (revealing the pubkey, instead of just the pubkey hash). Addresses created post-2012 that have not been spent from (spending reveals the pubkey) would still be 100% secure.
Eh -- "secure-ish" at best. Spending those utxo's requires revealing the pubkey, at which point someone with a quantum computer can just break your pubkey and offer a new transaction with a higher fee while your transaction is still in the mempool.
Or an even bigger prize for keeping it secret. If you could break asymmetric encryption at will you could plunder much bigger prizes than Satoshi's coins.
Like what exactly? We know many things are secured that way, but few are secured in only that way. Everything else I can think of takes more effort to turn into profit (and generally yields less).
can someone explain why it can't run Shor's? I thought Shor's was an algorithm designed to be run on QC, and D-Wave is a QC. What makes D-Wave different from the intended hardware?
D-Wave doesn't have typical quantum logic gates. Their machines are a different kind of quantum computer, called "adiabatic quantum computers". Instead, they solve optimization problems using a technique called "quantum annealing". In essence, it's a tool that can very quickly find maxima and minima of certain cost functions you can program in (albeit tediously).
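To make "cost functions you can program in" a bit more concrete: the problems the hardware takes are in Ising/QUBO form, i.e. you pick coefficients and ask for a binary assignment that minimizes the resulting energy. Here's a toy brute-force version (my own illustrative coefficients, just to show the shape of the problem, not how the hardware solves it):

```python
# Toy QUBO: minimize x^T Q x over binary vectors x.
# Brute-forces a 4-variable instance; the hardware is only interesting
# because it targets instances with many more variables.
import itertools

Q = {  # illustrative coefficients, chosen arbitrarily
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0, (3, 3): -0.5,
    (0, 1): 2.0, (1, 2): -1.5, (2, 3): 1.0,
}

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

best = min(itertools.product([0, 1], repeat=4), key=energy)
print(best, energy(best))
```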
This is slightly inaccurate. "Adiabatic quantum computing" is not the term you want: adiabatic quantum computing is equivalent to the more traditional quantum-gate computing model (both are efficient at the same class of problems). D-Wave can solve one particular problem (one that can also be solved by classical computers in similar time); that problem is related to annealing and is not the problem of adiabatic quantum computing, which is much more general.
What they lack is the ability to preserve non-trivial information between their qubits.
I am not so sure. I don't understand this work well, but there have been a couple of papers saying they can get a quantum speedup on factoring using adiabatic quantum computers [1]. Further, Geordie Rose himself said that they have an algorithm that can run on a D-Wave system that can beat Shor's algorithm [2]. A more skeptical take is here [3].
Scott Aaronson does understand this work very well, and he mentions several times (with citations) that there is no widely accepted proof or disproof that a quantum speedup will ever exist for quantum annealers.
1024 bit keys are endangered? Only in some broken encryption schemes. You can't even break 256 bits, as the number of possibilities exceeds the number of atoms in the universe.
You're comparing keys from two different cryptosystems, which is a mistake. An RSA key ("1024 bit keys") needs to be longer than an AES key ("256 bits") to provide a given amount of security, since more conditions are placed on an RSA key in order to get its favorable properties.
When you pick a key length, you want it to be long enough to provide security for your communications as far into the future as practicable, and the classical efforts to break RSA are getting uncomfortably close to 1024-bits. Maybe in a few decades it would be practical for some adversaries to break even perfectly made 1024-bit keys.
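For a rough sense of why, here's a back-of-the-envelope calculation (mine, not from the article) using the heuristic running time of the general number field sieve, the best known classical factoring algorithm. It ignores the o(1) term and constant factors, so the numbers overshoot the commonly cited estimates (about 80 bits of security for RSA-1024, about 112 for RSA-2048) a little:

```python
# Approximate "security bits" of an n-bit RSA modulus against GNFS:
#   work ~ exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))
# Heuristic complexity only; o(1) term and constants are ignored.
import math

def rsa_security_bits(modulus_bits):
    ln_n = modulus_bits * math.log(2)
    work = (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return work / math.log(2)  # express exp(work) as a power of 2

for bits in (768, 1024, 2048, 3072):
    print(bits, round(rsa_security_bits(bits)))
```

The point is just that the cost grows sub-exponentially in the modulus size, which is why RSA keys have to be so much longer than symmetric keys for the same security level.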
However, I'm not a cryptographer, so someone please correct me if I'm wrong.
I mentioned your first link, the factoring of a 768-bit RSA key, in my first comment. RSA isn't a "broken encryption scheme": it's a perfectly fine one. It's just that you need to use key lengths long enough to defend against the computing power available to current and expected adversaries. That paper is just a demonstration of modern computing power, which just shows that you should use longer keys.
Your second link has a garbage, sensationalized headline. It actually describes a side-channel attack that has nothing to do with cryptographic algorithms. It's basically equivalent to a clever way of looking over someone's shoulder.
They'll all break eventually ;-) We know that; that's why cryptography keeps progressing. The question is only: how fast do we have to progress in the future?
The issue is not the 256-bit key, it's the password you used to encrypt with. With 12 characters drawn from lowercase letters, caps, symbols, and numbers, it would take an incredibly large budget to brute-force your password.
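Quick back-of-the-envelope (my own numbers, assuming roughly a 72-symbol alphabet and a truly random password): 12 characters is around 74 bits of entropy, which is big, but nowhere near the 256 bits of the key itself:

```python
# Rough entropy of a random password vs. the keyspace of the cipher.
# Only valid for randomly generated passwords; human-chosen ones are far weaker.
import math

alphabet = 26 + 26 + 10 + 10   # lower, upper, digits, ~10 common symbols (assumption)
length = 12
password_bits = length * math.log2(alphabet)

print(f"password entropy: ~{password_bits:.0f} bits")  # ~74 bits
print("AES key space:     256 bits")
```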
What should a dev do to stay current, given this is happening? Say you have an interest in algorithms in general, what do you need to read / what can you read that connects to existing knowledge?
TL;DR: if you're not retired before QC becomes useful (which is likely), the "API" will have changed so much from now till then that you shouldn't worry about it.
Think of this as an algorithm on specific hardware for solving a specific problem. You can encode other problems into that specific problem, at a cost. Currently this algorithm on special hardware is not as fast as state-of-the-art algorithms on ordinary hardware (namely Selby's). But the people making this special hardware claim (without proof) that when the special hardware (and the problem size) is scaled up, the special hardware will be much faster than ordinary hardware. So far no one has disproved this assertion, but no one has proven or demonstrated it either.
Meanwhile, many other scientists and some companies, who refer to this type of special hardware as "dirty QC", are working on another class of special hardware they call "clean QC". This is mathematically proven to be much faster than ordinary hardware when it is scaled up, but it's very very very difficult to scale it up. But they're working on it and showing progress.
Right now, no one knows whether the "dirty" or the "clean" special hardware will be the first to be actually useful in the real world. It's likely the first useful real-world application is more than ten years away. We don't know exactly what the application will be, or how the special hardware will be programmed by then. It's also quite likely that most of the people commenting here will be retired before access to the special hardware is given to anyone but scientists.
Yea, that must be what he's referring to. I know a lot of folks who work on this stuff and they never use "clean"/"dirty" terminology (although it's reasonably apt).
FYI, Scott was using the quotes to try and indicate that he was making those terms up on the spot (so it's not accurate to say that "many other scientists and some companies" use those terms). But I can see how it might not be clear from reading his comment.
If you want to access faster computing regimes that already exist, learn GPU programming or FPGA programming. Both provide extremely large proven speedups over CPU programming and you can start learning them now. Worrying about Quantum Computing is, imo, rather premature.
I think you have a wrong idea here about quantum computers. It is not likely that they will end up on your desk any time soon.
What is likely going to happen at first is that there will be large-scale quantum computers for very specialized problems. Think of them as complex physics experiments, occupying whole rooms with delicate equipment. That will be the reality of quantum computing for quite some time. Whether they ever end up on a desk is hard to tell, but that's really science fiction; it's like asking whether we'll visit foreign planets in the future.
That means that unless you are working in a very specialized environment, quantum computers are irrelevant for you as a developer, except that you'll likely use different crypto algorithms in the future. But that crypto will still run on classical computers.
Pardon my ignorance on the matter but, in regards to your last sentence, you mean it'll be possible for classical computer crypto to withstand QC brute force?
QC don't "brute-force", that's an inaccurate description of how they work.
But to answer your question, yes, it is possible to have classical computers with crypto that withstand QC attacks. There's a whole research field on postquantum crypto, e.g. have a look at
http://pqcrypto.eu.org/
I would suppose that not much will change. If you're already familiar with working with the cloud (like AWS) this is just going to be some new node, with some new API that you use when you want to run some optimization algorithm. In fact, I wouldn't be surprised if AWS/Azure buy a bunch of these, and just offer them as one more service.
As long as you don't write low-level algorithms, you will most likely not be affected as an engineer. Imagine you call list.sort() now and in the future.
I would expect, of all places, people on HN to be smart enough to help me understand these news stories. I have no idea what quantum blahblah really means and want to know if this is a real breakthrough or just a news story.
This really is a breakthrough in quantum computing, specifically for an approach called 'quantum annealing', which is one of many possible approaches to quantum computing (there is at least one other).
This result is a breakthrough in two ways.
1: It demonstrates real quantum effects in the D-Wave computer. It had been hotly debated, by experts in quantum computing, whether the D-Wave actually used quantum effects in its computations.
2: It shows a real speedup for a very specific instance of a very specific algorithm. This is great news, and a very good result, for the people working on the D-Wave.
This is just a news story because
1: The D-Wave is much faster (100 million times!) than the same algorithm running on a conventional computer. But there are equivalent algorithms you can run on a conventional computer (if someone could clarify how equivalent they are, that would be great) which are as fast as the D-Wave. So they chose an algorithm that is good for the D-Wave and terrible on a classical computer.
2: The problem instance they chose is very artificial and it isn't clear that the speed up wouldn't disappear if they tried to run actual real world instances of the problem.
I would conclude that this is a great result. It increases the understanding of quantum computing. That is very exciting. The D-Wave doesn't appear to be practically useful. Yet.
(I am not even the beginning of an expert in this, everything I wrote above comes from the link below)
My understanding: The D-Wave machine isn't general quantum computing, but hardware supporting one specific algorithm: quantum annealing. Quantum annealing is similar to Simulated annealing, and if you compare the D-Wave machine to simulated annealing implemented on normal, non specialized, hardware, the D-Wave is 100,000,000 times faster.
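For intuition about what that classical baseline actually does, here is a toy simulated-annealing loop (a minimal sketch, nothing like the tuned implementations used in the comparison):

```python
# Toy simulated annealing: flip random bits, always accept improvements,
# sometimes accept worse states (with probability falling as "temperature" drops).
import math
import random

def simulated_annealing(energy, n_bits, steps=10_000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n_bits)]
    e = energy(x)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling schedule
        i = random.randrange(n_bits)
        x[i] ^= 1                      # propose a single bit flip
        e_new = energy(x)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                  # accept the move
        else:
            x[i] ^= 1                  # reject: flip the bit back
    return x, e

# Example: minimize a made-up cost function over 8 bits.
cost = lambda x: (sum(x) - 3) ** 2 + x[0] * x[7]
print(simulated_annealing(cost, 8))
```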
In principle, even if the hardware only supports one algorithm, this could tell us something about whether quantum computers are really better than normal computers. Unfortunately the D-Wave machine/algorithm is still not much better than simply simulating a quantum machine on normal hardware (quantum Monte Carlo), and it's worse than the conventional algorithm by Alex Selby.
It's interesting, but it still doesn't shed much light on the important questions.
One more thing that you didn't mention, which I think is important: if you run simulated annealing on specialized hardware and compare it to the performance of D-Wave's machine, there is virtually no speedup for D-Wave's machine.
Sort of. A counterpoint I read on this topic indicated it might be rigged a bit. The comparison was to an emulation of D-Wave's own annealing process on a single-CPU machine. That's... already an unfair representation of simulated annealing's performance on classical hardware. I'd compare the solution of a problem to optimized algorithms on NUMA machines, GPU clusters, and FPGA clusters that cost whatever a D-Wave costs, given that's about all one really knows about how it works beyond their claims: input X dollars to get Y performance and Z energy use. ;)
Note: One could also compare the performance of a single machine with CPUs, GPUs, FPGAs, and/or ASICs to be more fair. It just needs to be an optimal classical implementation on hardware designed for it vs. optimal use of the D-Wave.
Ha, yes definitely. Thankfully the state of the art in hash-based crypto has progressed a fair bit since the original Lamport signature. The stateless "SPHINCS" signature scheme is looking quite mature already: http://sphincs.cr.yp.to
Although if you are working with an Event-Sourced architecture you may as well just implement with OTS (One-Time Signature) chains.
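For anyone curious what a hash-based one-time signature even looks like, here is a toy Lamport scheme (the great-grandparent of SPHINCS; each key signs exactly one message, keys and signatures are huge, but security rests only on the hash function). Educational sketch only:

```python
# Toy Lamport one-time signature over SHA-256.
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[H(a), H(b)] for a, b in sk]
    return sk, pk

def bits(msg):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret per message-hash bit (this is why the key is one-time).
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"hello")
print(verify(pk, b"hello", sig), verify(pk, b"goodbye", sig))  # True False
```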
TL;DR: Google researchers demonstrated that D-Wave 2X quantum annealing was asymptotically better (i.e. by more than just a constant factor) than one classical algorithm, simulated annealing, running on a single core of a normal computer, but not better than another classical algorithm, quantum Monte Carlo (QMC); there the difference was only a constant factor. In fact, it was 10^8 times faster, and that is what's generating all the hype.
Now, it was mentioned in the paper that Selby's algorithm on a classical computer would actually produce better results than the D-Wave 2X, but it was not compared against it (maybe because it would make the amazing 10^8 difference look minuscule).
Also, it's worth mentioning that specialized silicon (ASICs) has been demonstrated to produce roughly 10^6x speedups over the same algorithm on a single CPU core (though it does not follow that such a speedup is possible for every classical solution).
A mainstream media article about quantum computers that starts with a reference to D-Wave can be safely ignored.
Quantum computers (the real kind) are still impractical. People are still working hard to make them bigger, but it gets harder and harder as the computer size increases. And people still don't know if that's an inherent difficulty of the problem, or if some breakthrough will suddenly make quantum computers easy to make.
Or, in other words, there's no visible change for people who are not closely following the area.
The Google/D-Wave stuff is largely without substance, but makes for nice, breathless media reports. See http://www.scottaaronson.com/blog/?p=2555 for an explanation of the latest report.
People are not convinced that D-Wave is really quantum yet. I've seen D-Wave's presentations at the annual Supercomputing conference and I am not convinced yet either.
We have quantum-safe encryption now, just not quantum-safe PKI, although that is possible. We also have quantum key distribution, which could be a route to patch things if PKI is beaten out by quantum computing.
QKD solves no problem that quantum computers create.
At best you could replace a symmetric cipher with it (though at huge cost and very impractically), but we don't have a problem with symmetric ciphers.
We have a problem distributing the keys for symmetric ciphers, which is why qkd would be useful if we can't do pki. I'm interested in the huge costs and very impractical part of your comment - why do you think that?
No, it's not coming. It will stay where it is now: with huge corporations that rule the world, and governments. This will just create a huge gulf between small companies and people, who will run AMD64 for the next 20 years, and corporations that in 20 years will already have something that replaces quantum computers. We will all be ruled by a few computers. Google is technologically running away from its competitors. Google will gather more data, process it faster and more accurately, gather even more data... The poor will become poorer and the rich will become richer. It's not coming fast, and not in our direction.
We are not limited to running AMD64 cores. We can also use GPU cores, and when some problems are really important to solve, we can run them on FPGAs or create an ASIC for them.
Also, we can expect that there will be computing centers whose resources mere mortals can access as a service, just as they can access the resources of computer clusters today.
Very likely universities will have their own quantum computers. Bigger ones first, then smaller ones.
I think that the situation is not that depressing but it definitely will create some disparity in the beginning.
In 20 years we will have AMD128 and some ASICs for personal use; Google will have something 100M times faster than a quantum computer, so it will be 100M * 100M times faster overall. It will contain more computing power than the rest of the world put together. Each new generation of computers will be produced in a shorter time than the last one, and will be more powerful than the last one. The cost of producing the next-generation computer will be cheaper than producing ASICs for us. Aren't Bitcoin miners the most popular ASIC?
>Very likely universities
I don't mean that no one else will have or rent any; sure they will, there are a lot of projects that will need it. It's just that we, regular people, won't have any. I think I presented my opinion too negatively :<
> I don't mean that no one else will have or rent any; sure they will, there are a lot of projects that will need it. It's just that we, regular people, won't have any.
If we look at the history of computing, then yes, one could be so pessimistic, but would it be really true?
If there is a breakthrough in QC, then there will be a sudden economic motivation for people (companies) to use quantum computers.
This will generate the need for people who can work with such computers. That in turn will motivate universities to train such people, and create the need to access quantum computers.
I think that this will happen much much faster than it happened with the classical computing.
Of course I also believe that there would be no personal QC any time soon if this is what you had in mind.
Edit: I also do not understand why you are being downvoted. I think it is an important perspective.
Not a downvoter, but I think there's a practical limit to how useful the data collected by Google is - and the company is probably close to that limit already.
I'm not sure QC can change that.
The only way to get more value would be total 24/7 Orwellian surveillance, and I don't think that's going to be a popular option.
QC for crypto is a no-brainer. QC for anything else, including data mining/ML, is a much fuzzier prospect. I'm not sure anyone really understands what the practical applications could be, never mind how to use QC to make them possible.
Any suggestion that you can take a warehouse full of web logs and tracking stats, give it a quantum shake, and have a few million pre-qualified addicted customers fall out is likely nonsense.
> QC for crypto is a no-brainer. QC for anything else, including data mining/ML
Problems in ML can mostly be posed as optimization tasks, and if QC works out, ML would likely be its main application.
It would likely make a huge difference in solving classification problems, as you could train your ML model with much more data, much faster. It would also make clustering much faster, as it too can be expressed as an optimization problem.
Also, there are many other optimization problems that would benefit a lot from the speedup (regardless of whether it is a huge constant speedup or an asymptotic one).
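As a concrete (classical) example of "classification as optimization": training even the simplest model is just minimizing a loss function, which is exactly the kind of task you'd want to hand to faster optimization hardware. Here's a minimal logistic-regression sketch with plain gradient descent (toy data; no claim that this maps onto an annealer as-is):

```python
# Logistic regression trained by minimizing the average log-loss with gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # toy labels from a linear rule

w = np.zeros(2)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))     # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)            # gradient of the average log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                           # step downhill on the loss surface
    b -= lr * grad_b

accuracy = np.mean((p > 0.5) == y)
print(w, b, accuracy)
```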