
> It heralds the point in hardware development where quantum circuits cannot be reliably simulated exactly on a classical computer.



I mean, that was practically already the case with the Google Sycamore processor. IBM claimed that it could simulate the 53-qubit circuit in roughly 2.5 days on a supercomputer (with a bunch of bespoke optimizations), but a 54-qubit version would have been classically intractable. We didn't need to get to 100+ qubits, and those came out before now.
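(For scale, here is a back-of-the-envelope sketch, not from the thread, of why one extra qubit matters for exact simulation: a brute-force state-vector simulator stores 2^n complex amplitudes at roughly 16 bytes each, so the footprint doubles with every added qubit.)

    # Hypothetical back-of-the-envelope: memory for a full state vector.
    BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

    for n_qubits in (53, 54):
        n_amplitudes = 2 ** n_qubits
        petabytes = n_amplitudes * BYTES_PER_AMPLITUDE / 1e15
        print(f"{n_qubits} qubits: {n_amplitudes:.2e} amplitudes, ~{petabytes:.0f} PB")

That prints roughly 144 PB for 53 qubits and 288 PB for 54. IBM's proposed simulation sidestepped RAM limits by spilling to disk, and later tensor-network methods avoid storing the full vector at all, but the doubling per qubit is the wall everyone is working around.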


Google's claim was widely criticized because the benchmark was unimportant and uninteresting, making the quantum supremacy claim nonsense.

Later research suggests that an exascale supercomputer could simulate it in "dozens of seconds".

https://arxiv.org/abs/2111.03011


That's such a stupid criticism. Quantum supremacy is an arbitrary benchmark by definition.


This is a funny comparison.

"Here is this device with 50 components. It can be simulated by a device with 1000000000000 components, so we should not really be impressed."


Truth. Not to mention the stunning asymmetry in energy usage. At a minimum, if we can solve an equivalent computational problem using a quantum computer with orders of magnitude less energy than a classical HPC, QC merits consideration. The carbon footprint of data centers is far from negligible.

Computational advantages aren't the only type of advantage we should care about.


We can barely simulate a rough approximation of the brain of a worm. The world is filled with many things more impressive than early examples of quantum computing.


That is a very good counter-argument, in the style of this comic I really like: https://www.smbc-comics.com/comic/2013-07-19

The big difference is that these early quantum devices (non-scalable noisy quantum computers) are *programmable* and *universal*. It is the difference between an analog computer that can simulate one thing of fundamentally bounded size and a digital computer that can simulate "anything" of *in principle* unbounded size.


When it costs a few dollars to make the classical chip and tens of millions to make the quantum device, it's fair to not be super impressed.


You are off by 9 orders of magnitude at least. The classical supercomputer in these comparisons costs 0.3 billion dollars, and that does not count the many trillions it took to develop the tech.

Even on this measure, the (useless for now) quantum tech wins.


I was assuming the comparison was transistor count to qubit count. I didn't count the zeros, but either way I'm not 9 orders of magnitude off, so I'm not sure what exactly you are saying. In any case, all I am saying is that it isn't impressive to a lot of people precisely because it is useless right now. So you are off by 100% in my opinion, which is way more than 9 orders of magnitude ;)


The people who created these devices never claimed that they can be used for any useful computation, and neither did the people you are talking to here. However, these technology demonstrators do show a programmable computation (sampling from a particular probability distribution; see the sketch below) that is infeasible on anything but a supercomputer and becomes simply impossible once you add a couple more qubits.

Sure, we do believe these devices, when made more reliable, will also do "useful" computations that are infeasible on supercomputers, but we are aware that we need to build the devices first in order to convince you.

9 orders of magnitude is the difference between a dollar and a billion dollars.
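(To make "sampling from a particular probability distribution" concrete, here is a minimal, hypothetical sketch of random-circuit sampling by brute-force state-vector simulation in NumPy; the function names and gate set are my own illustration, not Google's or IBM's. The state array has 2^n entries, so each added qubit doubles both the memory and the work.)

    import numpy as np

    rng = np.random.default_rng(0)

    def apply_1q(state, gate, q, n):
        # Apply a 2x2 unitary to qubit q of an n-qubit state vector.
        psi = state.reshape([2] * n)
        psi = np.moveaxis(psi, q, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        psi = np.moveaxis(psi, 0, q)
        return psi.reshape(2 ** n)

    def apply_cz(state, q1, q2, n):
        # Controlled-Z: flip the sign of amplitudes where both qubits are 1.
        idx = np.arange(2 ** n)
        both_one = ((idx >> (n - 1 - q1)) & 1) & ((idx >> (n - 1 - q2)) & 1)
        out = state.copy()
        out[both_one == 1] *= -1
        return out

    def sample_random_circuit(n, depth, shots):
        state = np.zeros(2 ** n, dtype=complex)  # 2**n amplitudes: doubles per qubit
        state[0] = 1.0
        for _ in range(depth):
            for q in range(n):
                # Random single-qubit unitary via QR of a complex Gaussian matrix.
                m = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
                gate, _ = np.linalg.qr(m)
                state = apply_1q(state, gate, q, n)
            for q in range(0, n - 1, 2):
                state = apply_cz(state, q, q + 1, n)
        probs = np.abs(state) ** 2
        probs /= probs.sum()  # guard against floating-point drift
        return rng.choice(2 ** n, size=shots, p=probs)

    print(sample_random_circuit(n=10, depth=8, shots=5))

The real experiments and the classical simulations that compete with them use far more sophisticated techniques (2D qubit layouts, fidelity estimation, tensor-network contraction), but the exponential growth of that state array is exactly why brute force stops being an option a couple of qubits later.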



