> Our work establishes an unambiguous quantum computational advantage that is infeasible for classical computation in a reasonable amount of time.
Bold statement for a paper that then only provides a guesstimate of the time needed for classical algorithms.
The argument in the paper essentially begs the question: the authors presuppose that direct Schrödinger-Feynman simulation is the optimal classical way to simulate their quantum circuits. They then show that S-F is significantly slower than their quantum machine, "proving" quantum supremacy. They fail, of course, to argue that no alternate classical algorithm can accomplish the same thing; a pitfall that has trapped multiple similar papers in the recent past.
In essence, this is just a new-age variant of the P != NP "proof" that says "well there can't be any solution faster than trying every path". Quantum computers may down the line prove themselves to be superior to classical ones, but failure of imagination is on its own no proof of classical inadequacy.
For example, take this line from IBM's refutation of Google's quantum supremacy experiment: "64 PiB of disk space are required for 53-qubit circuits, and 128 PiB for 54-qubit circuits. Both fit within the 250 PiB available on Summit." If you have to use over 50% of the total disk space on one of the world's strongest supercomputers, you're probably going to run into some major issues once we start hitting 55-, 56-, 57-, 58-qubit territory. And although this is a moving goalpost, supercomputers clearly aren't going to improve faster than single qubits can be added to current chips. Rigetti is already announcing an 80-qubit quantum computer, and IBM is building a 127-qubit one.
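A quick sanity check on those figures (a sketch; I'm assuming single-precision complex amplitudes at 8 bytes each, since that's the encoding that matches the quoted 64 PiB for 53 qubits): a full state-vector simulation stores 2^n amplitudes, so storage doubles with every added qubit, and Summit's 250 PiB runs out at 55 qubits.

```python
# Back-of-the-envelope check of IBM's numbers: full state-vector
# simulation of n qubits stores 2**n complex amplitudes.
# Assumption: single-precision complex, 8 bytes per amplitude
# (the only size consistent with "64 PiB for 53-qubit circuits").
PIB = 2**50  # bytes in a pebibyte

def state_vector_pib(n_qubits: int, bytes_per_amplitude: int = 8) -> int:
    """Storage for a full 2**n_qubits state vector, in PiB."""
    return (2**n_qubits * bytes_per_amplitude) // PIB

for n in range(53, 58):
    print(f"{n} qubits: {state_vector_pib(n)} PiB")
# Doubles with each qubit: 53 -> 64 PiB, 54 -> 128 PiB,
# 55 -> 256 PiB (already past Summit's 250 PiB), 56 -> 512, 57 -> 1024.
```

So the exponential wall is one extra qubit away from IBM's own numbers, which is the whole point of the "moving goalpost" worry.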
So, IMHO we are far past the point where quantum computers have to prove themselves superior to classical ones, at least on designed tasks. I don't think the comparison to failed P vs NP proofs holds (P vs NP being, IMHO, a problem we're anywhere from 5 to 10,000 years out from solving).
All this being said, I do have to agree with IBM on the value of the term "quantum advantage" versus "quantum supremacy". Quantum computers have not yet demonstrated a significant cost advantage on any problem, but I'm optimistic they will in about a decade; probably on problems that are already naturally suited to quantum computers, like studying interactions between particles in quantum physics. Also, for the record, I'm a layperson with just an unusual interest in this field, so it's entirely possible I've missed some incredibly important central problem that everyone is being quiet about.
i think calling them "designed tasks" is laundering what's going on. call a spade a spade - it's simulating completely random circuits. so what's the point?
>but I'm optimistic they will in about a decade.
you and every other researcher/vc/funding agency.
That doesn't mean the same thing will happen with QC, it just means that maybe we should cut QC some slack, and keep cheering for the incremental improvements that arrive every couple of years?
This kind of progress is amazing. I think we should start the clock on QC from either Shor's paper or maybe from the first 2-qubit QC in 1998 - so that takes us to, say, 2060?
So well, it depends on your perspective I guess.
It also solves no problem at all. They tried to shoehorn a "CSRNG" application into the paper, but it's dubious at best under any kind of cost constraint.
I believe there was also a second successful, but somewhat more controversial, experiment by a Chinese team.
That's a debatable claim. IBM's QC team certainly doesn't accept it, and has demonstrated that what Google did could be done by a conventional computer:
Oh, and you need a lot of storage: with just a couple more qubits it would not have fit in the 250 PiB on Summit, so IBM's argument would have been void if Google had used 55 qubits in their work.
There is no I in FAANG.