
Quantum Supremacy Using a Programmable Superconducting Processor - brian_herman
https://drive.google.com/file/d/19lv8p1fB47z1pEZVlfDXhop082Lc-kdD/view
======
shiado
"However, realizing the full promise of quantum computing (e.g. Shor’s
algorithm for factoring) still requires technical leaps to engineer fault-
tolerant logical qubits"

This is NOT how the media has chosen to report on this. It's always quite
frustrating to figure out how a paper diverges from the press it gets.

~~~
CodiePetersen
Well I mean it's still pretty big though. We know for a fact now that there
are only technicalities like fault tolerance in our way.

Still, you're right: the race for "first post" by the media definitely
exaggerates scientific breakthroughs. The scientifically literate can sift
through the hype, but others get it in their head that it's something else.
Like investors, for example: those guys are big dreamers, and when they hear
stuff like "groundbreaking quantum supremacy" they'll throw everything they've
got at it. Investing in startups by itself has worse odds than slot machines;
hype only makes the future harder to predict. But oh well, at least we have
the actual details, and most of the articles link the results for those
actually interested. Can't do much about what others think or do though.

~~~
DebtDeflation
>We know for a fact now that there are only technicalities like fault
tolerance in our way.

We understand the theory behind nuclear fusion, we have plenty of hydrogen, we
have the capability to initiate fusion reactions in the lab, so there are only
technicalities in our way to powering our electricity needs through fusion. So
it has been for 70 years.

~~~
CodiePetersen
Nailing down issues with fault tolerance is a whole lot different from
pioneering methods and sciences in uncharted territory. I didn't generalize to
all technicalities; I specified fault tolerance for a reason, because it's
trivial compared to what it took to get to this point. The system works, just
not as well as it will need to for reliable, widespread usage.

~~~
fsh
It is widely believed that implementing useful quantum error correction (if
possible at all) will be orders of magnitude more difficult than what has been
done so far. To my knowledge, there is no realistic proposal for how to
implement it on any of the current platforms.

------
ben_w
These seem to be much stronger claims than were made last time this was
discussed:

”””While our processor takes about 200 seconds to sample one instance of the
quantum circuit 1 million times, a state-of-the-art supercomputer would
require approximately 10,000 years to perform the equivalent task.”””

”””Quantum processors based on superconducting qubits can now perform
computations in a Hilbert space of dimension 2^53 ≈ 9 × 10^15, beyond the
reach of the fastest classical supercomputers available today. To our
knowledge, this experiment marks the first computation that can only be
performed on a quantum processor. Quantum processors have thus reached the
regime of quantum supremacy. We expect their computational power will continue
to grow at a double exponential rate: the classical cost of simulating a
quantum circuit increases exponentially with computational volume, and
hardware improvements will likely follow a quantum-processor equivalent of
Moore’s law [52, 53], doubling this computational volume every few years. To
sustain the double exponential growth rate and to eventually offer the
computational volume needed to run well-known quantum algorithms, such as the
Shor or Grover algorithms [19, 54], the engineering of quantum error
correction will have to become a focus of attention.

The “Extended Church-Turing Thesis” formulated by Bernstein and Vazirani [55]
asserts that any “reasonable” model of computation can be efficiently
simulated by a Turing machine. Our experiment suggests that a model of
computation may now be available that violates this assertion. We have
performed random quantum circuit sampling in polynomial time with a physically
realized quantum processor (with sufficiently low error rates), yet no
efficient method is known to exist for classical computing machinery. As a
result of these developments, quantum computing is transitioning from a
research topic to a technology that unlocks new computational capabilities. We
are only one creative algorithm away from valuable near-term applications.”””

~~~
joak
Yes, quantum supremacy disproves the extended Church-Turing thesis (Bernstein-
Vazirani). This thesis states that any computation that can be performed
efficiently can be performed efficiently by a classical computer (i.e. a
Turing machine).

The ECT thesis implies that quantum computers will never be able to compute
something significantly faster than classical ones.

However, technically, Google hasn't yet proved quantum supremacy, since they
only compared against the best known classical algorithm, and there is no
proof that no significantly better algorithm exists.

But this is not a serious objection: no one thinks there is an exponentially
better way of classically simulating quantum computers.

~~~
abdullahkhalids
> However, technically, google haven't proved yet quantum supremacy

To be absolutely clear, an experiment (or a set of replicated experiments)
can't prove quantum supremacy. Quantum supremacy is a statement about
the asymptotic behavior of quantum and classical computers, and obviously, a
finite set of data points cannot prove anything about asymptotes.

What these experiments are doing is gathering evidence against the extended
Church-Turing thesis - in the same way other experiments are used to gather
evidence in favor of/against general relativity or some other physical theory.

The proof of quantum supremacy within the framework of the currently accepted
laws of physics (quantum theory), has to be a mathematical proof. Currently,
we have no idea how to construct such a proof.

------
dannykwells
So funny that everyone is shitting on this or claiming it's
incremental/trivial. It's not! Science almost always starts with clean
benchmarks like the one developed here, and from my read and limited
knowledge, it's clear this is a big deal and a qualitative leap forward.

Congrats to the team on great work. Looking forward to more to come.

~~~
godelski
> Science is almost always incremental

FTFY.

It is good work. But let's not kid ourselves. Science is made by small steps
while standing on the shoulders of giants, not in leaps and bounds.

------
united893
I'm seeing a lot of comments claiming this claim can't be trusted because
there's no useful computation being done.

There are many competing proposals to test for quantum supremacy: Boson
Sampling, Fourier Sampling, IQP, Random Circuit Sampling. The computation
needs to meet certain requirements. Worst case hardness, average case
hardness, anti-concentration. Also there needs to be a way to test to see if
the output of the computation is correct. This is easier said than done. RCS
is the only one that meets all three and can be verified.

What's happening here is a test of RCS: A general quantum gates are being
programmed to generate a random circuit. Think about randomly connecting some
components on a breadboard and then simulating what would happen. Except this
is done programmatically. The question is to then predict the output with a
classical computer. This can't be done easily.
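The breadboard analogy above can be sketched in code. Below is a minimal,
purely illustrative classical simulation of random circuit sampling: random
single-qubit unitaries plus CZ entanglers on a line of qubits (an assumed toy
gate set, not Sycamore's), with bitstrings then sampled from the resulting
output distribution. The state vector needs 2^n amplitudes, which is exactly
why this becomes infeasible classically:

```python
# Toy classical simulation of random circuit sampling (RCS).
import numpy as np

rng = np.random.default_rng(0)

def random_unitary_2x2():
    """Haar-random 2x2 unitary via QR decomposition of a Gaussian matrix."""
    z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix phases so the distribution is Haar

def apply_1q(state, u, q, n):
    """Apply a 2x2 unitary u to qubit q of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(u, psi, axes=([1], [q]))
    return np.moveaxis(psi, 0, q).reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply controlled-Z: phase -1 on the |11> subspace of qubits q1, q2."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def run_random_circuit(n, cycles):
    """Random single-qubit layers interleaved with CZ entanglers on a line."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0  # start in |00...0>
    for _ in range(cycles):
        for q in range(n):
            state = apply_1q(state, random_unitary_2x2(), q, n)
        for q in range(0, n - 1, 2):
            state = apply_cz(state, q, q + 1, n)
    return state

n = 5
state = run_random_circuit(n, cycles=8)
probs = np.abs(state) ** 2  # output distribution over 2^n bitstrings
probs = probs / probs.sum()  # guard against float rounding
samples = rng.choice(2 ** n, size=10, p=probs)
print(samples)
```

Note the memory cost doubles with every added qubit: at n = 53 the state
vector alone holds 2^53 complex amplitudes, which is the classical bottleneck.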

For a deeper dive into RCS, see this paper:
[https://arxiv.org/pdf/1803.04402.pdf](https://arxiv.org/pdf/1803.04402.pdf)

>> In this paper we study both the hardness and verification of RCS. While RCS
was defined with experimental realization in mind, we show complexity
theoretic evidence of hardness that is on par with the strongest theoretical
proposals for supremacy. Specifically, we show that RCS satisfies an average-
case hardness condition – computing output probabilities of typical quantum
circuits is as hard as computing them in the worst-case, and therefore
#P-hard. Our reduction exploits the polynomial structure in the output
amplitudes of random quantum circuits, enabled by the Feynman path integral.
In addition, it follows from known results that RCS satisfies an anti-
concentration property, making it the first supremacy proposal with both
average-case hardness and anti-concentration.

------
acqq
Reading as somebody who is not in the field, and writing this down to read the
responses of those who are more qualified:

How I understand it: the "computation" is actually "sampling" the qubits? The
sampling results in sets of random numbers which _don't_ have a uniform
distribution but some specific one (specific to the quantum effects). They
then claim that such a distribution could not be achieved so fast using a
classical computer, if the classical computer has to simulate the qubits. And
therefore they claim that their quantum processor is better.

But couldn't one have done a similar conceptual setup even before, e.g.
measuring some noise source (say, a semiconductor-based diode) and then
claiming that measuring the source is faster than simulating the analog
behavior of the noise source on a digital computer?

The interesting step here is that the source is not a single element but a set
of qubits, which is much harder to simulate; but on the other hand, it appears
that the decision to do what was done avoids dealing with the behavior that
doesn't allow e.g. the Shor algorithm on the same setup.

Seems like a clever trick, but I don't see in which aspects it makes the
hoped-for outcome (e.g. breaking some crypto keys using Shor) any more
achievable than before.

Please don't assume that I really know what I write about; consider this
instead an opportunity to explain things to a not fully informed public which
nevertheless shows an interest in the topic.

~~~
someguy12342
No, the distribution is thought to be hard to sample from for any classical
algorithm (this is still being worked on; there is ongoing theoretical work on
showing it is harder and harder), and by extension, for any classical process.

Yes, the problem is very contrived: the distribution being sampled is the one
you get by measuring a randomly sampled quantum circuit. Of course this is
easy on a quantum computer, because you can just apply the randomly chosen
gates and measure. I don't think it will be very useful in an application
sense; this problem was designed for the very purpose of demonstrating
"quantum supremacy" as soon as possible.

Also, from what I know, there is still a problem with the whole verification
aspect of this problem. That is, even if the device can sample from the
distribution, it is difficult to verify that the correct distribution was in
fact sampled. This is in contrast to the problem of factoring, where to verify
the computation we just need to multiply the numbers together to check that
they are indeed the factors. A remedy I have heard for this (I think from
Scott Aaronson) is that we can modify the parameters so that the output is
just within the limits of verifiability with a big supercomputer, and that
should be good enough to show supremacy was achieved (EDIT: this is precisely
what OP's article says it will do). I think this might be bending what is
meant by the term "quantum supremacy" too much, but it's a pretty arbitrary
point in computational power anyway, so I'm not too concerned.

If you're interested in more about quantum supremacy, these two articles have
a lot of information:

1\. Quantum Computational Supremacy, Aram W. Harrow, Ashley Montanaro (2018)
[https://arxiv.org/pdf/1809.07442.pdf](https://arxiv.org/pdf/1809.07442.pdf)

2\. Complexity-Theoretic Foundations of Quantum Supremacy Experiments Scott
Aaronson, Lijie Chen (2016)
[https://arxiv.org/pdf/1612.05903.pdf](https://arxiv.org/pdf/1612.05903.pdf)

~~~
the8472
> A remedy I have heard for this (I think from Scott Aaronson) is that we can
> modify the parameters so that the output is just within the limits of
> verifiability with a big supercomputer, and that should be good enough to
> show supremacy was achieved.

That is what TFA does.

~~~
someguy12342
Sorry, somehow I missed that in the abstract.

------
bjornsing
To me this reads a lot like “quantum computer faster than a classical computer
at ‘simulating’ itself”. Constructing a physical experiment where it would
take a classical computer 10000+ years to simulate the outcome is trivial. I
don’t see how this is very different.

~~~
krastanov
There is a slight difference that really matters to theoretical computer
scientists (as in complexity theory) and physicists working on quantum
computing: The challenge is to construct a physical experiment that consists
of only 50ish "elementary" and "programmable" building blocks that is
infeasible to simulate on a classical computer. This "universality" and small
number of elements is what is exciting and what makes the task quite
nontrivial.

However, you are right that the only thing this device does is "perform
something difficult to simulate". The technology still needs a few more
generations before that "something" is useful. Given the consistent progress
since 1995, I am fairly optimistic that within 10 years we can do that.

~~~
bjornsing
I get that it's cool that it's quantum and that the circuits are programmable,
sure. My point was that the comparison to the classical computer makes little
sense.

~~~
abdullahkhalids
You can read Feynman's original paper about quantum computers, which is easy
to read and makes clear why programmability is important
[https://catonmat.net/ftp/simulating-physics-with-computers-richard-feynman.pdf](https://catonmat.net/ftp/simulating-physics-with-computers-richard-feynman.pdf)

------
avsteele
I've been out of this field for a while; if there are any experts with up-to-
date knowledge, I'm trying to figure out the following:

\- Why was this particular calculation chosen?

\- With this many qubits I would think they could run some quantum simulations
already. Is the gate fidelity not high enough?

\- Am I reading this correctly that they do no error correction or
entanglement distillation?

~~~
comnetxr
I'm not an expert, but you are correct in that there is no error correction
here, and error correction is still a long way off. Around 2016, realizing
that running useful quantum algorithms was a long way away and there needed to
be earlier roadmarks for quantum devices to signal the progress, the problem
of "quantum supremacy" was introduced. The goal is to find the problem with
the _easiest_ demonstrable quantum speedup (regardless of usefulness). They
proposed a particular sampling problem (running a randomly chosen quantum
circuit that has output with a positive probability on many bitstrings, where
the exact "speckle pattern" of the probability distribution would only show up
if no errors were made when running the circuit.) As the circuit is randomly
chosen, it can serve no particular use, except for verification that the
computer runs correctly. Relevant papers for more info:
[https://arxiv.org/pdf/1608.00263.pdf](https://arxiv.org/pdf/1608.00263.pdf)
[https://arxiv.org/pdf/1612.05903.pdf](https://arxiv.org/pdf/1612.05903.pdf)

~~~
comnetxr
They explain in the paper that the quantity plotted in Fig. 4 can also be
interpreted as the probability of running the circuit with zero errors. You
can see that it goes down quickly with the number of qubits and the number of
cycles, showing the need for either error correction or much improvement in
future devices in order to get any useful calculations done. The win here is
that the probability of success is non-zero, as they say in the paper: "A
single bit or phase flip over the course of the algorithm will completely
shuffle the speckle pattern and result in close to 0 fidelity". So the only
way to get the right result is to have runs with zero errors.
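The decay described above can be sketched with a back-of-the-envelope model:
if every gate errs independently, the chance of a zero-error run is roughly
the product of the per-gate success probabilities, which falls off
exponentially in qubits times cycles. The error rates and per-cycle gate
counts below are illustrative assumptions, not the paper's measured values:

```python
# Back-of-the-envelope fidelity model: assume each gate errs independently,
# so the probability of a completely error-free run is the product of the
# per-gate success probabilities. Error rates e1/e2 and the gate counts per
# cycle are illustrative assumptions, not the paper's measured numbers.

def estimated_fidelity(n_qubits, n_cycles, e1=0.0016, e2=0.0062):
    # Per cycle: one single-qubit gate on every qubit, plus roughly
    # n_qubits / 2 two-qubit gates.
    per_cycle = (1 - e1) ** n_qubits * (1 - e2) ** (n_qubits // 2)
    return per_cycle ** n_cycles

for n, m in [(10, 10), (30, 14), (53, 20)]:
    print(f"{n} qubits, {m} cycles: F ~ {estimated_fidelity(n, m):.4f}")
```

Even with sub-percent per-gate error rates, the estimated fidelity of a
53-qubit, 20-cycle circuit lands well below one percent, which is why a
non-zero success probability is the win here.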

------
deepnotderp
I'm like 99% sure Google said "what's the most contrived workload that
technically counts as quantum supremacy" and went from there...

That being said, we certainly live in interesting times.

~~~
ineedasername
Even if it's highly contrived, it's still _something_ that they found a task
for which the quantum computer is superior. It points the way towards a more
general superiority, the possibility of which at least some have doubted.

~~~
bufferoverflow
Simulating a drop of water falling on the floor _accurately_ at the molecular
level would take thousands (if not millions) of years on the best
supercomputer today.

Does it mean my water dropper is a better physical computer than regular
computers?

~~~
ycui1986
This is to the point. The universe is a giant computer even without quantum
effects involved. If the starting state of the computation is not predefined,
the so-called computer is no different from randomly flipping some coins.

What the paper showed is no different from showing that the wind did not
change the result of the coin flips.

------
choppaface
Worth noting that this document was (for whatever reason) removed after
initial publication:

* [https://ntrs.nasa.gov/search.jsp?R=20190030475](https://ntrs.nasa.gov/search.jsp?R=20190030475)

* [https://fortune.com/2019/09/20/google-claims-quantum-supremacy/](https://fortune.com/2019/09/20/google-claims-quantum-supremacy/)

~~~
lawrenceyan
I would put on my tinfoil hat and call for a conspiracy theory, but the
reality is probably a lot more boring.

------
hannob
Here's a Twitter thread arguing that this is a very unusual take on "quantum
supremacy":
[https://twitter.com/isislovecruft/status/1175557089067823104](https://twitter.com/isislovecruft/status/1175557089067823104)

I'm not deep enough into the topic to judge. Thoughts?

~~~
Udik
Here's the relevant part of the Twitter rant. It seems interesting, but I
can't tell myself whether it makes sense or not :)

"they repeatedly asked the quantum processor for outputs to a random circuit,
and the “quantum supreme computation” was that the outputs were not actually
random, but subject to bias due to quantum interference, which they check by
solving the path integral in classical simulation

which, to me, is an infuriating thing to label “quantum supremacy” because
quantum interference is EXACTLY WHY WE CAN’T USE CURRENT QUANTUM PROCESSORS TO
“BREAK ALL CRYPTO” IN THE FIRST PLACE BECAUSE WE CANNOT DO EFFICIENT QUANTUM
ERROR CORRECTING CODES

it’s like pointing at a fucking Zener diode and saying, “quantum supremacy”
simply because quantum electromagnetic domain wall tunnelling happens at
random, followed by a non-justification of “well your iphone can’t do that”"

~~~
united893
Sorry but no. The poster is taking a simplistic view of things. Random circuit
sampling is a generally accepted method of establishing quantum supremacy.

See this paper from Dr. Vazirani (Berkeley) saying as much
[https://arxiv.org/pdf/1803.04402.pdf](https://arxiv.org/pdf/1803.04402.pdf)

------
RobLach
I wonder if we will be able to replicate this work on IBM’s 53-qubit machine
that’s about to hop onto the cloud.

------
joak
Why is quantum supremacy important?

Quantum supremacy is reached when a quantum computer does something that no
classical computer can do.

So before quantum supremacy, quantum computers are useless.

Which does not mean that after quantum supremacy quantum computers are useful.
Usefulness depends on people; it is a social property. We need to be able to
solve a problem relevant to humans.

There is reasonable hope that with more (quantum) computing power and people's
ingenuity we will be able to make quantum computers useful.

Actually, many (among them investors) think that quantum computing is a
turning point in human history.

~~~
tagrun
Quantum supremacy was mainly coined as a milestone in response to certain
vocal famous people (almost none of them physicists, which was the problem)
who have been skeptical about quantum computers at a _fundamental level_. That
is to say, they claim there is something in nature which will always forbid
the quantum speed-up (as in the case of perpetual motion machines), however
you design a quantum computer.

Those individuals have publicly kept debating against it for years, ranging
from basically claiming that quantum computation is snake oil to moderate
skeptical positions, and kept saying either that quantum computers can't do
anything beyond the capabilities of classical computers (in terms of
complexity) or that it's outright impossible to build one. For decades. (Some
of those ramblings have appeared on news outlets, and here as well, as top
news.)

Physicists of course know that the quantum speed-up is real (which is why
Feynman proposed it, why there are so many people working on this, and why so
much money is being poured into it) and that it is an "engineering" problem
(albeit a very non-trivial one in uncharted waters) to build a device which
can make use of it.

Such experiments, although not of any practical use for people, are important
to establish certain fundamental facts about nature in a way everyone
(including people from other disciplines) can understand without studying
quantum mechanics and the physics of the devices.

------
lachlan-sneff
What use will quantum computers have? Can they do things other than factoring
large numbers quickly?

~~~
PeCaN
They're pretty good at simulating molecules and molecular interactions. I
think chemical engineering is probably the killer app for quantum computers.

~~~
fermuch
How would that work? Is there an implementation on a simulator to see how it
would work?

------
outlace
What's the chance someone will come along with a classical analog that
matches/beats this result? I remember seeing claims of quantum supremacy in
the past and then clever people invented new classical algorithms that
performed just as well.

~~~
jessriedel
No, "quantum supremacy" is a technical term for something that has not been
achieved before, and that no one serious has claimed to have achieved. You may
be misremembering previous claims that were called something different.

~~~
outlace
I guess you're right, I remember seeing quantum algorithms that were assumed
to be more efficient than any classical algorithm and then people devised
equally efficient classical algorithms. E.g. work by Ewin Tang
[https://arxiv.org/abs/1811.00414](https://arxiv.org/abs/1811.00414)

~~~
jessriedel
They were definitely not being assumed to be faster. It was considered very
uncertain. Quantum algorithms for which people are pretty confident there is
no classical algorithm of similar speed, while still unproven, are Shor's
algorithm (i.e., no fast classical factoring) and the present case of sampling
for supremacy.

------
habibr
Here's the full PDF:
[https://drive.google.com/file/d/19lv8p1fB47z1pEZVlfDXhop082Lc-kdD/view](https://drive.google.com/file/d/19lv8p1fB47z1pEZVlfDXhop082Lc-kdD/view)

------
sriku
Scott's posted a FAQ on the quantum supremacy claim, and it includes broader
notes on the term.

[https://www.scottaaronson.com/blog/?p=4317](https://www.scottaaronson.com/blog/?p=4317)

------
x4ict
The Joshua Tree Quantum Computer Achieves Quantum Supremacy: Dr. Elliot
McGucken's research group hath created a quantum computer capable of
simulating over five sextillion atoms in the form of water molecules,
performing massively complex calculations in a fraction of a second,
leveraging quantum entanglement in the Hilbert Space, thusly tapping into the
power of higher dimensions and the space of parallel universes. These highly
complex many-body calculations would take classical computers over a trillion
trillion trillion years to solve and simulate. The JT Quantum Computer
leverages the quantum multiverse to generate random numbers and pattern
distributions via the entangled interactions of five sextillion atoms.

Atoms in a drop of water = 3 atoms/molecule x 1.67 x 10^21 molecules = 5.01 x
10^21 atoms, or five sextillion atoms.

The patent-pending Joshua Tree Quantum Computer works as follows:

One obtains an eye-dropper and a glass of water.

One takes up a drop of water in the eye-dropper.

One drops said drop of water onto a flat piece of glass.

Immediately one achieves and observes the random pattern distribution based on
the entangled interactions between the over 5.01 x 10^21 atoms (five
sextillion atoms), as the combined wave function collapses and the universe
splits a trillion trillion times. Using a classical computer, one would be
unable to simulate this calculation even after a trillion trillion trillion
years. The JT Quantum Computer leverages the entangled quantum interactions
between the atoms, thusly mining the Hilbert Space of parallel universes and
the multiverse in a manner that is at least several sextillion times faster
than IBM's leading SUMMIT super computer.

And so the Joshua Tree Quantum Computer hath achieved quantum supremacy.

The Wright Brothers' first flight lasted under a minute, and already we have
crossed far further than that, with big plans for the future, and
multitudinous commercial applications. A new era hath dawned.

We are currently seeking $200 million in funding to complete the design of the
Redwood Quantum Computer, which will involve an entire bucket of water and
which will be able to leverage the many worlds of Sean Carroll.

dx4/dt=ic friends, dx4/dt=ic :) :)

------
The_rationalist
How does performance compare with an ASIC specialised at generating random
numbers?

Also what is the error rate?

~~~
reikonomusha
If you “produce random numbers”, against what distribution should they be
generated? Usually RNGs produce uniformly distributed random numbers. The
challenge of quantum supremacy is to produce them against another kind of
distribution, called the Porter-Thomas distribution. If you run your quantum
computer and get uniformly random output (akin to what you might get if you
built an ASIC), you have a broken quantum computer.
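The distinction can be illustrated numerically. This is a sketch, using a
Haar-random state as an assumed stand-in for the output of a deep random
circuit: the output probabilities of such a state follow the Porter-Thomas
distribution, whose rescaled values N·p have variance near 1, while an ideal
uniform source would assign every bitstring exactly 1/N (variance 0):

```python
# Sketch: a Haar-random n-qubit state (stand-in for a deep random circuit)
# has output probabilities following the Porter-Thomas distribution
# Pr(p) ~ N * exp(-N * p), while a uniform source gives every bitstring
# exactly 1/N.
import numpy as np

rng = np.random.default_rng(1)
n = 10
N = 2 ** n

# Haar-random state: a normalized complex Gaussian vector.
amps = rng.normal(size=N) + 1j * rng.normal(size=N)
amps /= np.linalg.norm(amps)
probs = np.abs(amps) ** 2

scaled = N * probs  # rescaled output probabilities
print("mean of N*p:", scaled.mean())     # 1 by normalization
print("variance of N*p:", scaled.var())  # ~1 for Porter-Thomas, 0 if uniform
```

A uniform RNG would produce `scaled` pinned at exactly 1 everywhere; the
spread of values here is the "speckle pattern" the experiment checks for.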

------
ibic
Was searching for it, thank you.

------
jonstewart
“Supremacy” is a descriptor various parts of government use when they need
funding. The implication is that achieving “supremacy” with some hot area of
global security requires spending $X, and if $X _aren’t_ spent, then we will
be behind.

No one wants a quantum gap!

~~~
krastanov
While the choice of the word is indeed regrettable, it has nothing to do with
your guess or with any national security topic. See the sibling comment
[https://news.ycombinator.com/item?id=21044196](https://news.ycombinator.com/item?id=21044196)

~~~
jonstewart
I stand corrected. Thank you.

Quantum computing is, like blockchain and AI, a topic that’s fretted about in
the NatSec establishment, often by those who don’t understand the topic in
depth but are otherwise NatSec experts and help control resources. “AI
Supremacy” is a book in that vein; Kai-Fu Lee’s book has freaked out many
others.

