Engineering tour de force births programmable optical quantum computer (arstechnica.com)
58 points by rbanffy on Sept 6, 2018 | 12 comments



> The [two qubit] gates are about as reliable as any others you will find in the quantum computing world, which is to say that operations complete successfully around 93 percent of the time. For comparison, ion-based quantum computers are at 95 to 99 percent and superconducting quantum computers are around 90 to 95 percent. [...]

Those comparison numbers are incorrect. Ion-based and superconducting-based groups have reported two-qubit gate fidelities of 99.9% [1] and 99.4% [2], respectively.

[1]: https://arxiv.org/abs/1512.04600

[2]: https://arxiv.org/abs/1402.4848
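
To put those numbers in context: per-gate error compounds over a circuit, so small fidelity differences matter a lot. A back-of-the-envelope sketch in Python (assuming, simplistically, independent per-gate errors):

    import math

    # Rough gate budget before compounded per-gate error drags the
    # whole circuit's success probability below 50% (assumes errors
    # are independent, which real devices only approximate).
    for fidelity in (0.93, 0.95, 0.99, 0.994, 0.999):
        n_half = math.log(0.5) / math.log(fidelity)
        print(f"fidelity {fidelity}: ~{n_half:.0f} gates to 50% success")

At 93% you get roughly ten gates before the coin flip; at 99.9% you get several hundred.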


I don't know if there's a sensible way to define "typical" gate fidelities, and I agree "90 to 99 percent" is misleadingly low, but picking some of the highest reported numbers (possibly achieved in specialized, non-scalable setups) is also not very representative. The engineering of gate quality is too messily intertwined with the rest of the device.


The article is also reporting numbers for a one-off setup that has scaling challenges, so it seems appropriate to compare against that type of number.


Fair enough.


I like optical quantum computing; it feels hands-on, like it's almost something you could build in your garage. Linear optical quantum computers just need waveguides, beam splitters, phase shifters and mirrors (no disrespect - I know it's not trivial to make work). You set up your circuit, fire your light through it, and measure the system at the end.

It's a nice way to think about quantum computing. You aren't allowed to do any measurements halfway through or you will destroy your superposition. It's a bit like a pure function: no IO allowed until the very end.
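
A toy numerical sketch of that picture (hypothetical Python/numpy, not from the paper): a beam splitter and a phase shifter are just 2x2 unitaries on the two mode amplitudes of a single photon, and you only look at probabilities at the very end.

    import numpy as np

    # Single photon in two modes (dual-rail): the state is a 2-vector
    # of mode amplitudes, and each optical element is a 2x2 unitary.
    def beam_splitter(theta):
        return np.array([[np.cos(theta), 1j * np.sin(theta)],
                         [1j * np.sin(theta), np.cos(theta)]])

    def phase_shifter(phi):
        return np.diag([1.0, np.exp(1j * phi)])

    state = np.array([1.0, 0.0])        # photon enters in mode 0
    circuit = (beam_splitter(np.pi / 4)
               @ phase_shifter(np.pi / 2)
               @ beam_splitter(np.pi / 4))
    state = circuit @ state

    # Measurement only at the end: output-mode probabilities.
    print(np.abs(state) ** 2)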

Also I know three of the people on the paper, good to see them getting some mainstream attention for a nice result.


> You aren't allowed to do any measurements half way or you will destroy your superposition.

This is true of all quantum computations. It doesn't matter what the underlying hardware is.


It's actually extremely common for quantum algorithms to have measurement operations halfway through. But they apply to individual qubits, not the whole system.

For example, error corrected quantum computation involves continuously measuring particular stabilizers in order to catch when they flip. Another example: measurement can reduce the cost of uncomputation (e.g. [1]).

[1]: https://quantum-journal.org/papers/q-2018-06-18-74/
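
A tiny statevector sketch of the distinction (hypothetical Python/numpy): measuring one qubit of an entangled pair collapses that qubit, but the rest of the state survives and later gates can still act on it.

    import numpy as np

    rng = np.random.default_rng()

    # Amplitude ordering |q1 q0>: indices [00, 01, 10, 11].
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

    # Mid-circuit measurement of qubit 0 alone.
    p1 = abs(bell[1]) ** 2 + abs(bell[3]) ** 2     # P(q0 = 1)
    outcome = int(rng.random() < p1)

    # Project onto the observed outcome and renormalize; qubit 1 is
    # still there for the remainder of the computation.
    keep = np.array([(b & 1) == outcome for b in range(4)])
    post = np.where(keep, bell, 0.0)
    post /= np.linalg.norm(post)
    print(outcome, post)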


Yes, you're right, I wasn't clear; I just meant that it seems nicer to visualise as a physical thing in an optical circuit.

Also remember that you have to perform the measurement multiple times. The output is a probability distribution, so you need enough repetitions to be confident you have the right answer.
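
Something like this, as a sketch (hypothetical Python/numpy, with made-up outcome probabilities):

    import numpy as np

    rng = np.random.default_rng()

    # Pretend the final state assigns these probabilities to outcomes.
    probs = {"00": 0.50, "01": 0.02, "10": 0.03, "11": 0.45}

    # Repeated runs ("shots") only ever sample from that distribution,
    # so you estimate it from counts and take enough shots to be sure.
    shots = rng.choice(list(probs), size=1000, p=list(probs.values()))
    print({k: int((shots == k).sum()) for k in probs})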


> This is true of all quantum computations.

And also for half of my non-quantum debugging...


I also like photonic quantum computing.

Unfortunately the success rate of two-photon quantum gates is crushingly low.
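
Back-of-the-envelope on how crushing it gets (the standard post-selected linear-optical CNOT succeeds with probability 1/9, and probabilistic gates compound):

    # Success probability of chaining post-selected two-photon gates,
    # assuming 1/9 per gate and no heralding/multiplexing tricks.
    p_gate = 1 / 9
    for n in (1, 2, 5, 10):
        print(f"{n} gates -> ~{p_gate ** n:.1e}")

Schemes like KLM trade this off with heralded ancilla photons and feed-forward, at the cost of a lot of extra resources.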


Bingo, straight across the middle. See that? I never thought I was going to get optical and quantum in the same headline, but there you go. You never know with bullshit bingo.


A cool implementation of a deep ANN using a similar system: https://arxiv.org/pdf/1610.02365.pdf

(the nonlinear units and the backprop are done on a classical computer).

The paper also points out that thermal crosstalk between the thermo-optic phase shifters can limit gate fidelities, or, conversely, sets a lower bound on the spacing between the waveguides.
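
A toy sketch of that split (hypothetical Python/numpy, not the paper's code): the linear layer is a unitary, as a programmable interferometer mesh would implement, and the nonlinearity is applied classically.

    import numpy as np

    rng = np.random.default_rng(0)

    def random_unitary(n):
        # QR of a random complex matrix gives a unitary, standing in
        # for a programmed mesh of beam splitters and phase shifters.
        m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        q, r = np.linalg.qr(m)
        return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

    def layer(x, u):
        # "Optical" unitary step, then a photodetection-like |.|
        # nonlinearity evaluated classically.
        return np.abs(u @ x)

    x = rng.normal(size=8)
    for u in (random_unitary(8), random_unitary(8)):
        x = layer(x, u)
    print(x)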



