Quantum Algorithm Implementations for Beginners (arxiv.org)
191 points by lainon on April 12, 2018 | 20 comments



The way this starts seems to tell a story that I feel is quite disconnected from reality:

> As quantum computers have become available to the general public, the need has arisen to train a cohort of quantum programmers

It seems to peddle the idea that in a few years we'll replace all normal computers with quantum computers. I don't think this is even remotely plausible.

It's still an open question whether quantum computers that do anything useful at all are even feasible. But even if they are, we can safely assume that for a long time they will be expensive devices, run as complex physical experiments and reserved for very special needs.

From all we know, QCs aren't magically faster computers that are suitable for doing everything better. There are just a few very special algorithms where QCs have an advantage. That's certainly interesting, but even if it were possible - and that's a big if - it's unlikely anyone will add them to our smartphones anytime soon, simply because there's no need for that.

If quantum computers arrive we'll need a handful of specialized programmers that are familiar with their algorithms. But not masses of them.


I think the relevance of potential quantum computer programming can be compared to that of GPU programming: it's useful for solving particular problems with much more power than classical approaches, but for most applications there's no gain. So most software will probably continue to run on classical computers, as today.


Sure, but then bear in mind that with the onset of deep learning (which seems poised to be the foundation for a lot of emerging technology for at least the next decade) GPU programming has suddenly become a lot more useful.

What if, just as deep learning brought life to GPUs decades after they were invented, some other algorithm or paradigm that we’re not paying attention to now becomes huge once QCs are available to test on?


I think use of QCs will be orders of magnitude lower than use of GPUs.


I suppose people thought the same of classical computers when the ordinary way to carry out computations was to deliver them to arrays of people sitting at desks. Now, history is not forced to repeat itself, but it has happened often enough that it might be premature to rule out QCs at some point becoming generally available and widely used. I agree that having a lot of programmers is not an immediate need, though.


I remember when the only useful thing you could do with a GPU was play videogames. Now you can do:

1. Deep Learning

2. Cryptocurrency Mining

3. CAD / CAM

4. Acceleration of certain computations like fluid dynamics


Quantum Computers will have some applications in Cryptography, I hope.

There are already some Quantum Key Exchange algorithms that can reliably detect wiretapping by a third party without requiring a PKI. If any third party is listening, the connection will simply fail as a fundamental property of nature. (MitM is still possible, but you have to go full man-in-the-middle; a simple wiretap is no longer sufficient.)
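
Roughly how the detection works, sketched as a toy simulation (this is just the BB84 bookkeeping in plain Python, no real physics, and all the names are made up for illustration):

    import random

    def bb84_sift(n_qubits=2000, eavesdrop=False):
        """Toy BB84 bookkeeping: return Alice's and Bob's sifted key bits."""
        alice_bits  = [random.randint(0, 1) for _ in range(n_qubits)]
        alice_bases = [random.choice('ZX') for _ in range(n_qubits)]
        bob_bases   = [random.choice('ZX') for _ in range(n_qubits)]

        alice_key, bob_key = [], []
        for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
            sent_bit, sent_basis = bit, a_basis

            if eavesdrop:
                # Eve measures in a random basis and resends what she saw.
                eve_basis = random.choice('ZX')
                if eve_basis != sent_basis:
                    sent_bit = random.randint(0, 1)  # wrong basis -> random outcome
                sent_basis = eve_basis

            # Bob measures; a basis mismatch gives a random outcome.
            measured = sent_bit if b_basis == sent_basis else random.randint(0, 1)

            # Sifting: keep only positions where Alice's and Bob's bases agree.
            if a_basis == b_basis:
                alice_key.append(bit)
                bob_key.append(measured)
        return alice_key, bob_key

    for eve in (False, True):
        a, b = bb84_sift(eavesdrop=eve)
        errors = sum(x != y for x, y in zip(a, b)) / len(a)
        print(f"eavesdropper={eve}: error rate in sifted key ~ {errors:.0%}")

Without Eve the error rate in the compared sample is ~0%; with Eve it jumps to ~25%, which is how Alice and Bob notice the wiretap and abort.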

Another possibility is quantum algorithms that let you perform cryptography without having to trust the underlying device: it could even be malicious, and everything would either work without leaking or simply fail without leaking. (Though IIRC research in that area is still ongoing.)


QKEs have nothing to do with quantum algorithms running on quantum computers.

Also they don't solve any real problem, but that's another topic. It's an extremely impractical and expensive way to implement a key exchange over short distances which, as you said, is vulnerable to mitm attacks. I don't know why anyone would want to have that.


This paper is a good example of how much progress we've made in quantum algorithm research since the dawn of quantum information science. Here is another good survey for people who want to learn more about quantum algorithms [0].

From the abstract, it seems like a resource I wish I had when I was starting to get into quantum computation.

[0]: https://math.nist.gov/quantum/zoo/


I like the idea, but I must admit that (with my background as a mathematics researcher, but almost completely ignorant of quantum physics and physicists' notation) it is rather difficult to understand. In particular, the bra-ket notation (which is already difficult for me because it breaks the obvious way of matching parentheses, though that may just take some time) seems to be used inconsistently across the paper (I believe different authors wrote about the different algorithms, and the variations are rather evident and unhelpful). There are some formulas I cannot make any sense of (the last example I found: in the definition of the QFT inside Shor's algorithm, the k inside the ket ranges from 0 to N-1, while according to the introduction it should range over all combinations of a certain number of qubits).
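
(For what it's worth, my best guess at reconciling the two descriptions: with N = 2^n, summing k from 0 to N-1 is the same as summing over all n-qubit basis strings, since |k> abbreviates the basis state whose n-bit binary expansion is k. In the notation I assume the paper intends:

    \mathrm{QFT}\,|x\rangle = \frac{1}{\sqrt{N}} \sum_{k=0}^{N-1} e^{2\pi i\,xk/N}\,|k\rangle, \qquad N = 2^n

But I may well be misreading it.)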

Also, some authors seem to take for granted some concepts (such as the properties of the quantum gates) that are not explained anywhere and not obvious to me.

Still, an interesting read.


I'm not sure if this is open to the public yet, but if you really are interested you could post your questions here:

https://quantumcomputing.stackexchange.com/


Try reading "Q is for Quantum", by the end of Part I you will understand one quantum algorithm thoroughly. All necessary gates are explained carefully and the linear algebra gets sneaked in without using bras and kets and so on...



I've started reading some quantum computing texts, and dabbled a bit with IBM's quantum computing experience, and this is where I get stuck:

I can follow the basics at the level of "gates" or circuits -- a lot of what's out there is essentially quantum assembly language or something of that sort.

But what about interfacing classical and quantum computing? Is the idea that you just run something from start to finish on a quantum computer? Or that you take the output from a classical system and input into a quantum system?

Some of the things I'd be most interested in with a quantum computer seem to require an interface with a classical computer, or at least raise the issue, which generally isn't addressed by these sorts of pieces. It's sort of assumed that you're programming whatever it is you're interested in at the level of machine/assembly code.


This is a great question! I've worked a lot on this topic and it's crucial to understanding how quantum computers will be used in the near term.

While I don't have an easy answer for the high level interface, the assembly level interface between quantum and classical computing is also important. If you're interested in that then you should check out this paper where we describe Quil, an instruction set architecture for hybrid quantum/classical computing based on shared memory: https://arxiv.org/abs/1608.03355

This instruction set is the basis for Rigetti Computing's Forest quantum programming environment: https://www.rigetti.com/forest
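
For a rough sense of what the hybrid model means in practice, here's a made-up toy in plain Python (not Quil and not the Forest API): a classical loop picks a circuit parameter, asks a "quantum" subroutine for measurement counts, and uses those counts to decide what to try next.

    import math
    import random

    def run_quantum_subroutine(theta, shots=200):
        """Stand-in for the QPU: prepare RY(theta)|0> and measure it `shots` times.

        Returns how many shots came out as 1. On real hardware this is the part
        that runs on the quantum device, with the counts written back into
        classical memory that the host program reads.
        """
        p_one = math.sin(theta / 2) ** 2
        return sum(random.random() < p_one for _ in range(shots))

    # Classical driver: a crude parameter sweep that keeps whichever
    # parameter produced the best measured counts.
    best_theta, best_count = 0.0, -1
    for theta in [i * math.pi / 16 for i in range(17)]:
        count = run_quantum_subroutine(theta)
        if count > best_count:
            best_theta, best_count = theta, count

    print(f"best theta ~ {best_theta:.3f} (expect ~pi = {math.pi:.3f})")

The point of a shared-memory instruction set like Quil is that this kind of feedback loop, measurement results coming back into classical memory and classical code deciding what circuit to run next, can be expressed in one program rather than stitched together by hand.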


Thanks for these links--these are interesting and probably come closest to addressing my questions of anything I've seen so far.


A few years ago I went to a seminar run by DWave and I really pressed them along similar lines to what you're asking. A few of their engineers basically walked me through the concept.

As I understand it - and someone please correct me where this is wrong:

If you have designed an algorithm/program that optimizes for some minimum (e.g., gradient descent), then you as the developer need to map that algo into their system, which basically creates a quantum topological map.

That topo map is loaded into the QC, which is then run and outputs the "coordinates" of the minimum.

Makes sense if you can:

1. Convert your algo into something that optimizes for a minimum

2. Convert that into the topo space

3. Have no need for real-time results
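
To make step 1 concrete, here's a toy of the "express your problem as the minimum of an energy function" part (a brute-forced two-variable QUBO in plain Python; the tiny problem and its coefficients are made up, and a real annealer would still need step 2, embedding it into the hardware graph):

    import itertools

    # Toy QUBO: pick binary x1, x2 to minimize
    #   E(x) = sum over (i, j) of Q[(i, j)] * x_i * x_j
    # (diagonal entries act as linear terms, since x_i * x_i = x_i for bits).
    Q = {
        (0, 0): -1.0,   # linear term for x1
        (1, 1): -1.0,   # linear term for x2
        (0, 1): -0.5,   # coupling between x1 and x2
    }

    def energy(x):
        return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

    # Brute force stands in for the annealer here: it returns the
    # "coordinates" of the minimum, which is what the QC hands back.
    best = min(itertools.product([0, 1], repeat=2), key=energy)
    print(best, energy(best))   # (1, 1) with energy -2.5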


Pages 9-10:

"Fig. 2 shows the circuit that was designed to fit the ibmqx4 quantum computer. The circuit consists of state preparation (first two time slots), a Toffoli gate (the next 13 time slots), followed by the 2 | ψ 〉〈 ψ |− I operator (7 time slots), and measurement (the final 2 time slots). We use q [0] (in the register notation from Fig. 2) as the ancillary bit, q , and q [1] and q [2] as x 1 and x 2 . Note that the quantum computer imposes constraints on the possible source and target of CNOT gates. Some care must be taken to choose the appropriate qubits to represent each variable so that the necessary CNOT gates can be implemented. It is possible to circumvent some of the limitations in the source and target of the CNOT gates (i.e., the source and target can be reversed), but this requires increasing the depth of the circuit. Our initial implementation utilized a Toffoli gate that used such an approach, but the circuit was significantly deeper and the results were inferior to the results with the circuit in Fig. 2.

"Using the simulator, this circuit produces the correct answer x = (1 , 1) every time. We executed 1,024 shots using the ibmqx4 and x = (1 , 1) was obtained 662 times with (0 , 0) , (0 , 1) , and (1 , 0) occurring 119, 101, and 142 times respectively. This indicates that the probability of obtaining the correct answer is approximately 65%. The deviation between the simulator and the quantum computer is apparently due to the depth of the circuit combined with the approximate nature of the quantum computer. We note that our first implementation of this algorithm which used a Toffoli gate with a depth of 23 (compared to a depth of 13 here) obtained the correct answer 48% of the time."

"We designed a circuit that implements an instance of Grover’s algorithm for an IBM 5-qubit quantum computer. The outcome was successful in the sense that the quantum computer successfully completed the search with a probability that is appreciably greater than 50%. However, the 65% success rate that was obtained is much lower than the 100% that is obtained by the simulator. Deeper and more complex oracles would likely produce less satisfactory results, and this is in line with our experience implementing the oracle with a deeper implementation of the Toffoli gate."

I'm still not sure how measurement of quantum computing results works, and this isn't helping.
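

Not an expert, but the way I understand the measurement part: at the end of the circuit each basis state (00, 01, 10, 11) has an amplitude, and "measurement" samples one bitstring with probability equal to the squared amplitude; the 1,024 "shots" in the quote are just 1,024 repeated runs of the whole circuit. A minimal numpy sketch of the ideal two-qubit Grover iteration the quote describes (no noise, no Toffoli decomposition, so it behaves like their simulator rather than the ibmqx4):

    import numpy as np

    # Uniform superposition over the 4 basis states |00>, |01>, |10>, |11>
    psi = np.full(4, 0.5)

    # Oracle for the marked item x = (1,1): flip the sign of |11>
    oracle = np.diag([1.0, 1.0, 1.0, -1.0])

    # Diffusion operator 2|s><s| - I about the uniform state |s>
    s = np.full(4, 0.5)
    diffusion = 2 * np.outer(s, s) - np.eye(4)

    # One Grover iteration (optimal for N = 4 with a single marked item)
    psi = diffusion @ (oracle @ psi)

    # "Measurement": sample a bitstring with probability |amplitude|^2
    probs = np.abs(psi) ** 2
    shots = np.random.choice(4, size=1024, p=probs)
    counts = {format(i, "02b"): int((shots == i).sum()) for i in range(4)}
    print(counts)   # ideally all 1024 shots land on '11'

On the ideal state all of the probability sits on 11, so every shot returns (1,1); the 662/1024 in the quote is the same sampling done on a noisy state, which is where the ~65% comes from.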


This isn't related to the paper's content, but why in the world are there so many authors? I mean, if it were an experimental paper I wouldn't be surprised, but this is theory (or at least exposition).


It’s a survey paper on many different algorithms, so they could have each written parts of it.



