Tangentially, I've noticed some good Python libs for quantum computing appearing lately (Google's Cirq, for example). Could be a natural progression from scientific computing. Good to see Python making a presence there as well. Looks like it's going to stay relevant for a very long time.
Quantum computing won't require much power beyond what Python can provide; the only heavy processing will be circuit generation, which is (as far as we can see at the moment) fine to do in Python.
In the short term, though, there's a big place for languages like C, C++, and Rust for things like simulations, which still need to be done.
Yes, there are two reasons that Python is an ideal tool for quantum computing libraries at the moment.
- In the NISQ era [1], circuits have limited depth and size. It doesn't matter so much which language (or even algorithm!) you use when N<1000.
- Simulating a circuit is expensive, but all the heavy lifting can be delegated to highly optimized C code. The most expensive part of Cirq's simulation is (or soon will be) a call to `numpy.einsum` [2].
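For a sense of what that delegation looks like, here's a minimal sketch (not Cirq's actual code; the function name and structure are just illustrative) of applying a single-qubit gate to an n-qubit state vector with `numpy.einsum`:

```python
import numpy as np

def apply_gate(state, gate, target, n):
    # Illustrative helper, not Cirq's API: apply a 2x2 `gate` to
    # qubit `target` of an n-qubit state vector of length 2**n.
    psi = state.reshape((2,) * n)
    in_axes = list(range(n))
    out_axes = in_axes.copy()
    out_axes[target] = n  # fresh label for the gate's output index
    # Contract the gate's input index with the target qubit's axis;
    # all the numerical work happens inside optimized C code.
    psi = np.einsum(gate, [n, target], psi, in_axes, out_axes)
    return psi.reshape(-1)

# Example: Hadamard on qubit 0 of a 3-qubit |000> state.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.zeros(8, dtype=complex)
state[0] = 1.0
print(apply_gate(state, H, target=0, n=3))
```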
I kinda feel like this has become the new "middlebrow dismissal". It seems superficially insightful, but doesn't actually say anything useful. Sure, many - hell, maybe even most - people don't need (machine learning | quantum computing | big data | a partridge in a pear tree | whatever). But plenty do.
Merely pointing out this dichotomy doesn't do anything to help people understand when they do, or don't, need those things.
How about a comment along the lines of "Here's how to know when you need to use quantum computing, and how to know when you shouldn't"?
I agree. I find comments like that one to be incredibly stifling to discussion. In my opinion there's a certain smugness to dismissals like this, and I think they don't contribute much for several reasons.
First, the fact that most of us don't need quantum computers doesn't mean we shouldn't feel inspired to learn about them. It's a thoroughly interesting subject. I don't believe I'll live to see practical quantum computers for most of the use cases they're hyped about now, but that didn't stop me from making them my research focus in graduate school.
Second, in many exceptionally well-moderated forums for critical discussion (e.g. /r/AskHistorians), there is a mandate that requires commenters to engage with their source material. It's not enough to link to something that's ostensibly accurate; you also need to critically clarify that material to make it accessible to other readers and contextually relevant. When a link is posted without that engagement, you force others to click through and decide for themselves not only whether it's relevant, but whether it's accurate.
Finally, it's not a novel insight. There are scores of comments repeating the same point for any number of hyped topics, from machine learning to blockchain to JavaScript frameworks to quantum computing. It's essentially a meme. But it's more insidious than a meme, because memes are obviously low-effort and insufficiently novel. This is a middlebrow dismissal precisely because it appears intellectual, yet makes no insightful contribution.
And this is the result: instead of discussing what might be a very interesting Python library for quantum computing, we're litigating the appropriateness of a dismissive top comment. What have we achieved?
I've also had fun playing around with StrawberryFields (https://github.com/XanaduAI/strawberryfields), a photonic quantum computing library with some good visualizations and simulators driven by NumPy and TensorFlow. Xanadu has even made Blackbird, a language specifically for quantum programming.
- Drag-and-drop what-you-see-is-what-you-get UI instead of script-based. Smooths out the learning curve.
- Supports putting state displays in the middle of the circuit, so you can directly view normally-inaccessible information instead of inferring it from experience or algebra.
- Fast. It updates all displays interactively as you edit the circuit. Very easy to experiment with, e.g. just drag a gate around to see what it does in different places.
This isn't quite a simulator, but a co-operative board game to understand quantum computing (featured on HN a few weeks back): https://entanglion.github.io/
I have a question. Please excuse my ignorance, but I thought that once the world has a working quantum computer, the world as we know it will be destroyed. Since a quantum computer can solve any NP-hard problem in polynomial time, it could break any kind of crypto, any kind of security, and brute-force anything. Why hasn't that happened yet, since it's 2018 and we already have quantum computers?
That's not how quantum computers work. If anything, quantum networking significantly increases security, since you can tell whether a qubit has already been observed (eavesdropped) or not. Also, there are asymmetric cryptographic algorithms that remain secure against quantum computers. Algorithms based on factorization (RSA) won't be safe any more, but you still need a large number of qubits (on the order of thousands - don't cite me on this please) to break them, which has not been achieved yet.
Definitely not the end of the world.
edit:// And they cannot solve any NP-complete problem in polynomial time. That is a common misconception and not based on facts.
edit2:// Researchers working on quantum computers actually don't believe they will go mainstream (partially due to their complexity, like cooling them to near absolute zero) but will instead be specialized systems available for rent via the internet - or something similar. More on the side of predicting complex systems like the weather than powering your smartphone. Then again, who thought the PC would go mainstream?
First, they only work on really specific problems. They aren’t just a magic tool for brute force.
Current public-key crypto (both RSA and elliptic curve) happens to be one of those problems. However, there are cryptosystems that we don't know how to break with quantum computers, and it probably isn't possible. These aren't in wide use but have been tested in production, e.g. by Google. If it becomes a problem, people can switch.
Second, actual existing quantum computers are too small to do much of anything. We are just hitting the point where they could start to become interesting. There are still engineering and theoretical challenges in making them really work.
All these quantum programming languages let you simulate a quantum computer, but doing so demands exponentially more resources as you add qubits. The advantage of a real quantum computer is that this would not be the case.
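To make that concrete, here's a back-of-the-envelope sketch (assuming a dense state-vector simulator with one 16-byte complex number per amplitude):

```python
# Memory needed to store an n-qubit state vector: 2**n amplitudes,
# each a complex number taking 16 bytes. It doubles with every qubit.
for n in (10, 20, 30, 40, 50):
    print(n, "qubits:", 2 ** n * 16, "bytes")
# 30 qubits is ~17 GB; 50 qubits is ~18 PB, more RAM than exists
# in any single machine.
```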
Also, keep in mind that NP-hard ≠ NP-complete. NP-hard has no upper bound, so by saying 'solve any NP-hard problem in polynomial time', you're also saying 'solve any NEXPTIME-hard problem in polynomial time', which is known to be false.
A quantum computer can efficiently solve certain problems within one computational complexity class that are believed to fall into a different, harder class on a classical computer!
Given enough time, a classical computer can calculate everything a quantum computer can.
It is a quantum computer emulator.
Regular binary computers can model quantum computers with a smallish number of qubits (dozens). By the time we get to a quantum computer with hundreds of qubits, all the regular computers on the planet together will not be able to simulate it.
Every Turing machine can emulate every other Turing machine. Different Turing machines are all equivalent in what they can compute, but they can compute things at different speeds.
Both binary computers and quantum computers are Turing machines.
Quantum computing (under the quantum circuit model) basically consists of repeatedly applying unitary matrices to complex vectors (the qubit states). They are like your Boolean logic gates, but they must be unitary, which implies they must be reversible: https://en.wikipedia.org/wiki/Quantum_logic_gate.
You can think of unitary matrices as the complex analogue of rotation matrices (https://en.wikipedia.org/wiki/Unitary_matrix), so what quantum logic gates are doing is "rotating" these vectors around in a complex space.
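As a concrete illustration (just a NumPy sketch, not any particular library's API):

```python
import numpy as np

# Two standard single-qubit gates written as unitary matrices.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard
X = np.array([[0, 1],
              [1, 0]])                # Pauli-X, the "quantum NOT"

# Unitary means U†U = I, which is what makes gates reversible.
for U in (H, X):
    assert np.allclose(U.conj().T @ U, np.eye(2))

# Applying a gate is just matrix-vector multiplication on the state,
# i.e. a norm-preserving "rotation" of the state vector.
ket0 = np.array([1, 0], dtype=complex)  # |0>
print(H @ ket0)  # [0.707..., 0.707...]: an equal superposition
```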
In classical computing nobody thinks of programming as consisting of a series of logic gates, though :-).
Should we expect more abstractions from quantum computing models in the future? Quantum data structures, common quantum operations etc? Or are quantum algorithms too different from one another, or are "the quantum parts" of most quantum algorithms very compact? (Or do we try to keep them as compact as possible because of hardware constraints?)
You'd better hope we can figure out some higher abstractions for QM, or you can kiss the even-harder problem of biological computing goodbye.
I think there will be, though. I think 99% of the problem is just that we have no hardware yet. It's historically just been too hard to develop algorithms for things we don't have the ability to run yet. I've seen it personally in evolutionary computation [1], and I believe the recent renaissance in AI and ML was largely driven by reaching the point where we could actually run these computations in a reasonable period of time. The breakthroughs probably could have happened sooner, except that even if you had theorized about Deep Learning, nobody would have been able to actually run it.
[1]: A professor of mine told a heartbreaking (to me) story of carrying around a deck of cards between several institutions as he progressed through his early academic career, running an evolutionary computation in whatever spare time he could get over the literal years. My crappy, grad-student-grade personal laptop, a cheap piece of crap even for the time (I had to permanently clock the nominally 1GHz CPU down to 500MHz just to keep the thing from burning itself up), could have done the whole thing in minutes, if not seconds. Per Dijkstra, computer science may be about computers as much as astronomy is about telescopes, but I would observe astronomy is pretty hard to work on without telescopes in the end.
The quantum computers to come in the next decade or two may eventually be regarded much the way we currently regard the first mechanical and electronic computers: as only marginally sophisticated.
The first practical applications will be those that can be readily represented by simple operations on a modest number of qubits. As people practice and learn, sophistication will grow.
Great comment. Circuits are particularly unhelpful in describing quantum algorithms because they usually indicate a fixed problem size, and they do not provide an insight into entanglement, one of the fundamental "resources" of QC.
We need new representations and much better abstractions to get away from the low-level thinking we are currently promoting.
> [Quantum circuits] do not provide an insight into entanglement
What do you mean? Entanglement occurs whenever the state of a system cannot be factored into a product of the states of its components, and quantum circuits can definitely produce that. Just take a qubit, apply the Hadamard gate to it, then CNOT it with a second qubit to get an entangled Bell state. You can see it in action here: http://demonstrations.wolfram.com/GeneratingEntangledQubits/.
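Since the thread is about Cirq, here's that exact construction as a sketch (attribute names may differ between Cirq versions):

```python
import cirq

# Hadamard on one qubit, then CNOT with a second: a Bell state.
q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([cirq.H(q0), cirq.CNOT(q0, q1)])

result = cirq.Simulator().simulate(circuit)
print(result.final_state_vector)
# ~[0.707, 0, 0, 0.707], i.e. (|00> + |11>)/sqrt(2): this vector
# cannot be factored into a product of two single-qubit states.
```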
Quantum circuits are also used to describe all kinds of quantum algorithms (quantum Fourier transform, Grover’s algorithm, quantum teleportation, etc).
I'm not sure about the actual intent of the grandparent, but I think they were referring to insight that a larger audience can understand.
I have a CS degree, and I don't know what "Hadamard", "Bell state", etc. are. Also, your link doesn't load on Firefox Mobile on Android, so it didn't help :-)
Proposing a Python library for quantum computing is an attempt at explaining these mechanisms to a larger audience (surely not an easy task).
Edit: the link finally loaded in another tab while writing this comment, but you have to pay for a Wolfram license to run it, I guess. That arguably can't be considered "accessible" knowledge.
Entanglement is fundamental to QC, right? So to be a useful visualisation of an algorithm, a pictorial representation like a circuit should give some intuition to aid understanding, such as which qubits can potentially be entangled at a given stage in the circuit.