
Quantum Supremacy Using a Programmable Superconducting Processor - ilamont
https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html
======
dang
There are three threads. The others are:

IBM's critique
[https://news.ycombinator.com/item?id=21333105](https://news.ycombinator.com/item?id=21333105)

Scott Aaronson's blog
[https://news.ycombinator.com/item?id=21335907](https://news.ycombinator.com/item?id=21335907)

~~~
jonbaer
rdv's critique [http://rdvlivefromtokyo.blogspot.com/2019/10/googles-quantum...](http://rdvlivefromtokyo.blogspot.com/2019/10/googles-quantum-supremacy-paper.html)

------
rkrzr
> This result is the first experimental challenge against the extended Church-
> Turing thesis, which states that classical computers can efficiently
> implement any “reasonable” model of computation. With the first quantum
> computation that cannot reasonably be emulated on a classical computer, we
> have opened up a new realm of computing to be explored.

Note that the "extended Church-Turing thesis"[1] says something very
different from the actual Church-Turing thesis (and is also not due to
either Church or Turing).

The original Church-Turing thesis is not challenged at all by these
experiments. Indeed, it has been shown that quantum Turing machines are not
computationally more powerful than deterministic Turing machines - as in: any
program that can be computed by a quantum Turing machine can in principle also
be computed by a deterministic Turing machine (it just might take a little/a
lot longer).

The extended Church-Turing thesis is concerned with the performance of a
computation and says something about the "efficiency" of the computation.

The original thesis is concerned with what's _possible_ , while the extended
thesis is concerned with what's _feasible_.

[1]
[https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis#V...](https://en.wikipedia.org/wiki/Church%E2%80%93Turing_thesis#Variations)

------
oliveremberton
Interesting refutation by IBM here:

 _In the preprint, it is argued that their device reached “quantum supremacy”
and that “a state-of-the-art supercomputer would require approximately 10,000
years to perform the equivalent task.” We argue that an ideal simulation of
the same task can be performed on a classical system in 2.5 days and with far
greater fidelity._

[https://www.ibm.com/blogs/research/2019/10/on-quantum-suprem...](https://www.ibm.com/blogs/research/2019/10/on-quantum-supremacy/)

~~~
phh
IMO the whole IBM blog post is interesting; your quote alone doesn't summarize it.

Basically, what IBM is claiming is that Google's circuit doesn't really do
anything useful; it is just designed to be very hard to run on traditional
computing systems. And even in this idealized setting, Google's quantum
computer doesn't turn an unsolvable problem into a solvable one.

According to IBM, "quantum supremacy" means accomplishing something useful to
society that simply wouldn't be possible to do otherwise. Google's article
doesn't show any sign of that, IBM claims.

~~~
afthonos
"Quantum Supremacy" is a technical term, not a colloquial one. It refers to
showing that _there exists_ a problem that a real, physical quantum computer
can solve quickly that a classical computer cannot.

Reading the IBM article, they are fully aware of what "quantum supremacy"
means in a technical sense, and they are urging _the media_ not to use that
term, since it will be misunderstood by the public. Their claim that Google
has failed to achieve supremacy rests solely on their claim that they can
simulate the circuit far faster (and scale the simulation linearly) using
better classical algorithms.

That's a strong claim, and I'm interested in seeing what Google responds with.

Disclosure: I work at Google, but hahaha, no, I'm not cool enough to work on
this.

~~~
marktangotango
> Disclosure: I work at Google, but hahaha, no, I'm not cool enough to work on
> this.

I often wonder why people feel compelled to write this. There's nothing in
your profile or in your post history to prove unequivocally that you do or
don't work anywhere in particular, so why even mention it?

Edit: responders are citing ethics and company policy; but does anyone really
think vague, hand-wavy speculation on a public message board is relevant to
anything?

Sure, if you work in the ads group and you posted "wow, our division is in
trouble, competitor Z is really killing it, and our quarterly earnings are
going to miss big time!", then yeah, that matters. But GP? Really?

~~~
otterley
First, it’s a professional ethics requirement. If there is the possibility of
a conflict of interest in our statement, we need to disclose it to avoid
misleading anyone.

In addition, there are plenty of ways to figure out where an HN member is
employed besides looking at their post history. It’s not necessarily found in
publicly available data, but it can be found.

Also, you have to account for future acts. Disclosing early can help insure
against accusations about misconduct if you were to out yourself — or someone
were to out you — later.

------
comnetxr
There seems to be a lot of confusion here about the complexity, for example
calling the simulation here "in the linear regime" due to IBM's result. This
is inaccurate.

A source of this confusion is that we need to discuss space and time
complexity simultaneously. In the algorithms for quantum simulation (and many
other algorithms), there is a trade off between space complexity and time
complexity. ELI5: You don't have to store intermediate results if you can
recompute them when you need them, but you may end up recomputing them a huge
number of times.

For the quantum circuit, the standard method of computation gives exponential
memory complexity in the number of qubits (store 2^N amplitudes for an
N-qubit wavefunction) and time complexity D·2^N, i.e. linear in circuit depth
and exponential in the number of qubits. For example, under the IBM
calculation, 53 qubits at depth 30 use 64 PB of storage and a few days of
calculation time, while 54 qubits use 128 PB and a week of calculation time.
Adding a qubit doubles the storage requirements AND the time requirements.
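
As a sanity check on those numbers, the state-vector storage cost can be
computed directly (a back-of-the-envelope sketch of my own, assuming 8 bytes
per amplitude, i.e. single-precision complex, as in IBM's estimate):

```python
# Full state-vector memory for an N-qubit system: 2^N complex amplitudes.
# 8 bytes/amplitude (single-precision complex) matches IBM's estimate.
def state_vector_pib(n_qubits: int, bytes_per_amp: int = 8) -> float:
    return (2 ** n_qubits) * bytes_per_amp / 2 ** 50  # bytes -> PiB

for n in (53, 54, 55):
    print(n, "qubits:", state_vector_pib(n), "PiB")
# 53 qubits -> 64 PiB, 54 -> 128 PiB: each added qubit doubles the storage.
```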

Under Google's estimation of the run time, they were using a space-time
tradeoff. There is a continuous range of space-time tradeoffs: MAX MEMORY as
IBM does, MAX RECOMPUTATION (store almost no intermediate results, just add
each contribution to the final answer and recompute everything), and a range
of in-between strategies. While I don't know the precise complexities, the
time-heavy strategies will have time complexity exponential in both N and D
(2^(ND)) and constant space complexity. That's why Google's estimate for
time complexity is so drastically different from IBM's.
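
To make the MAX RECOMPUTATION end of the tradeoff concrete, here is a toy
sketch (my own illustration, not the actual simulators used): one output
amplitude of a layered circuit computed recursively, storing only the
recursion stack and recomputing everything else.

```python
import numpy as np

def amplitude(x_out, gates, x_in=0):
    """<x_out| U_D ... U_1 |x_in> as a Feynman-style path sum.

    Memory is only the recursion stack (~depth D), but subtrees are
    recomputed for every path, so time blows up exponentially with depth.
    """
    if not gates:
        return 1.0 if x_out == x_in else 0.0
    *earlier, last = gates  # peel off the final layer U_D
    return sum(last[x_out, k] * amplitude(k, earlier, x_in)
               for k in range(last.shape[0]))

# Toy check against the MAX MEMORY strategy (multiply the matrices out).
rng = np.random.default_rng(0)
dim, depth = 4, 3
gates = [np.linalg.qr(rng.normal(size=(dim, dim))
                      + 1j * rng.normal(size=(dim, dim)))[0]
         for _ in range(depth)]
full = gates[2] @ gates[1] @ gates[0]
print(np.isclose(amplitude(2, gates), full[2, 0]))  # True
```

Both strategies agree on the answer; they only differ in where the
exponential cost lands (storage versus recomputation).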

Side note: IBM also uses a non-standard evaluation order for the quantum
circuit, which exploits a trade-off between depth and number of qubits. In
the regime of a large number of qubits but relatively small depth, you can
again classically simulate using an algorithm that scales as N·2^D rather
than D·2^N, using a method that turns the quantum circuit on its side and
contracts the tensors that way. In the regime of N comparable to D, the
optimal tensor contraction corresponds to neither running the circuit the
usual way nor sideways, but something in between. None of these tricks
fundamentally changes the ultimate exponential scaling, however.

As an extra step, you could also run compression techniques on the tensors
(i.e. SVD, throwing away small singular values), to make the space-time
complexity tradeoff into a three-way space-time-accuracy tradeoff. You
wouldn't expect too much gain by compression, and your accuracy would quickly
go to zero if you tried to do more and more qubits or longer depth circuits
with constant space and time requirements. However, the _real_ quantum
computer (as Google has it now, with no error correction) also has accuracy
that goes to zero with larger depth and number of qubits. Thus, one can
imagine that the next steps in this battle are as following: If we say that
Google's computer has not at this moment beaten classical computers with 128PB
of memory to work with, then google will respond with a bigger and more
accurate machine that will again claim to beat all classical computers. Then
IBM will add in compression for the accuracy tradeoff and perhaps again will
still beat the quantum machine.

So this back and forth can continue for a while - but the classical computers
are ultimately fighting a losing battle, and the quantum machine will triumph
in the end, as exponentials are a bitch.

------
OskarS
I don't understand nearly enough of this stuff, but this section sounds...
weird:

 _Each run of a random quantum circuit on a quantum computer produces a
bitstring, for example 0000101. Owing to quantum interference, some bitstrings
are much more likely to occur than others when we repeat the experiment many
times. However, finding the most likely bitstrings for a random quantum
circuit on a classical computer becomes exponentially more difficult as the
number of qubits (width) and number of gate cycles (depth) grow._

It sounds like they're saying "simulating this physical process accurately is
extremely computationally expensive, but just running the physical process in
the real world is MUCH faster". But that's true of basically any physical
process!

Simulating turning on a single lamp in a room and figuring out how much light
falls on what walls is an enormously computationally expensive process, which
can take immense amount of time to figure out (depending on your desired level
of accuracy). Movie studios, CG artists and game companies build entire render
farms to solve this problem. But the physical process itself (just turning on
the light) happens instantly. Just replace "turning on the light" with
"running a quantum circuit".

~~~
taejo
The lamp is not a programmable light-processor that can perform arbitrary
computations using light, but Sycamore is (though it's very small).

The trouble is we don't know many (any?) interesting algorithms to run on
such processors. We can factor integers using Shor's algorithm, but the
numbers we can factor with 54 qubits are so small that our best non-quantum
algorithms on our best non-quantum computers far outperform them. No
supremacy is demonstrated. It's only when one takes problems that are ideally
suited to run on this processor that it outperforms classical computers.

Even with the deck stacked this way, it's taken until 2019 for it to be done
for the first time! It's a very, very early milestone.

~~~
OskarS
That does make sense, but given that the task is not solving anything
"outside" the physical process itself, it still feels unfulfilling. Using my
(maybe wrongheaded) analogy: it feels like "we built a lamp to simulate what
happens when we turn on a lamp".

Quantum "supremacy" only feels like a real thing when a quantum computer can
solve a problem that isn't directly tied to the simulation of itself. That's
my feeling, anyway.

BTW: this is not to disparage the work itself. This seems like great research,
and great progress in this field. Good on Google and the researchers!

~~~
taejo
I agree that the term "quantum supremacy" is overly grandiose, but it was
clearly defined _before_ Google published this work.

I just want to emphasise that there were people who believed that _even this
small step_ was impossible. Prominent quantum-computing skeptic Gil Kalai
believes that scalable quantum computing is impossible in principle, and
believes that this paper didn't achieve what it claims, _but_ he says that his
view would be refuted if they did! (This is in contrast to IBM, who isn't
criticizing the quantum side of the experiment, but the comparison to
classical computing; and to the Internet commentariat who are saying that even
if they did everything they claim, it's not interesting).

------
ArtWomb
Last line of the paper is destined to resonate into history: "We’re only one
creative algorithm away from valuable near-term applications"

Reminds me of the last line from Watson and Crick's 1953 DNA-structure
paper: "It has not escaped our notice that the specific pairing we have
postulated immediately suggests a possible copying mechanism for the genetic
material" ;)

I bring up genomics specifically because the main profit driver appears to be
biotech. Predicting near-instantaneously whether regulatory proteins bind to a
target genome site could revolutionize drug discovery.

And while full "programmatic" control of transcription factors may be decades
away, in the near term annealing-based ML can speed up analysis of
genome-wide datasets:

Unconventional machine learning of genome-wide human cancer data

[https://arxiv.org/abs/1909.06206](https://arxiv.org/abs/1909.06206)

------
charriu
Relevant blog post about the leaked preview by Scott Aaronson:
[https://www.scottaaronson.com/blog/?p=4317](https://www.scottaaronson.com/blog/?p=4317)

~~~
m3adow
Wayback Machine link, as the site seems to be down:
[https://web.archive.org/web/20191015112241/https://www.scott...](https://web.archive.org/web/20191015112241/https://www.scottaaronson.com/blog/?p=4317)

------
brown9-2
Do I understand correctly that the experiment itself has to do with simulating
quantum circuits using a classical computer, and showing that a real quantum
computer can produce the result much faster than a classical computer
simulating a quantum one?

It’s hard to parse what the actual thing being computed is.

~~~
Frost1x
That is my interpretation but I need a more detailed paper because everything
is quite vague. Based on what I read, the claim isn't that great (yet).
Everyone is gunning to be the historical first here.

"Hey, we have a new fluid-flow simulation processor that reaches CFD
supremacy over classical computers... we pour water across the surface of
things and the results are indistinguishable from reality, compared to the
days, weeks, or months it would take to simulate on a classical computer,
which still wouldn't reach the same resolution of accuracy."

~~~
comicjk
It's not quite as bad as an analog water simulator made of water. They have
built a programmable machine that can run any quantum problem it can load into
its qubits. The problem is, it has so few qubits that it can only load tiny
problems. The claim is that it can still beat classical computers on some
subset of these tiny problems.

~~~
brown9-2
Is it correct that the claim "that it can still beat classical computers on
some subset of these tiny problems" is based on classical computers
simulating quantum circuitry to compute those problems?

I’m wondering if there is some other way the problem could be computed without
having to simulate qubits.

~~~
comicjk
A lot of work has gone into looking for classical algorithms for these
sampling problems since the mid-2000s, and we have found some speedups, but
everything still suggests that they are truly exponential on classical
hardware. The classical algorithms for these problems don't involve modeling
qubits directly, but they do involve sampling exponentially many states and
averaging over the results, which is a lot like modeling qubits.

------
vtomole
Cirq has most of the experiments that were used to demonstrate quantum
supremacy[0]. Here is an example of cross-entropy benchmarking, the tool used
to benchmark this processor:
[https://github.com/quantumlib/Cirq/blob/master/examples/cros...](https://github.com/quantumlib/Cirq/blob/master/examples/cross_entropy_benchmarking_example.py)

[0]:
[https://github.com/quantumlib/Cirq/tree/master/cirq/experime...](https://github.com/quantumlib/Cirq/tree/master/cirq/experiments)

------
legatus
Associated video:
[https://www.youtube.com/watch?v=-ZNEzzDcllU](https://www.youtube.com/watch?v=-ZNEzzDcllU)
[4:42 mins]

------
sphix0r
The paper:
[https://www.nature.com/articles/s41586-019-1666-5](https://www.nature.com/articles/s41586-019-1666-5)

------
radarsat1
I just posted this on reddit, but I'm curious about Hacker News' take on it,
so apologies for the cross-posting:

Can I ask an honest question? Shouldn't people be scared shitless of quantum
computing?

If a single organization develops the ability to factor large numbers at
unheard-of speeds, doesn't this put in jeopardy the mathematical backbone of
all encryption, and therefore don't they pose a threat to all financial
institutions? Don't they then have the power to hold the world at ransom?

I don't mean this as a conspiracy theory -- I don't think an actual "event" is
even necessary to pose such a threat. If it's proven that theoretically sound
classical encryption can be theoretically cracked in record time by quantum
computers, I am afraid of what will happen to the world financial markets due
to panic. Nevermind the first time it is actually used to steal billions of
dollars, or some other large-scale event.

I'm no expert on quantum computing; please enlighten me if indeed Shor's
algorithm does not pose a threat to privacy or to the financial markets. It
would help me sleep at night. For now, I do think the idea is not totally
unfounded [1], and therefore quantum computing, for all the positives (e.g.
fast solutions to hard optimization problems, which would have massive
applicability), scares the living daylights out of me.

[1]
[https://web.archive.org/web/20121115112940/http://people.ccm...](https://web.archive.org/web/20121115112940/http://people.ccmr.cornell.edu/%7Emermin/qcomp/chap3.pdf)

~~~
hannob
> If a single organization develops the ability to factor large numbers at
> unheard-of speeds, doesn't this put in jeopardy the mathematical backbone of
> all encryption, and therefore don't they pose a threat to all financial
> institutions?

Two things:

1. It's extremely unlikely that this will happen overnight; practical
factoring is still pretty far away from today's quantum computing state of
the art.

2. There's work being done on developing post-quantum cryptography (which is,
in essence, all cryptography that we believe cannot be broken by a QC). NIST,
the US standardization organization, is currently running a competition for
future standards. It has some challenges, but it's doable.

That still leaves the issue that past communications may be decrypted in the
future. But other than that, we will almost certainly have alternatives by
the time QC factoring becomes real.

~~~
radarsat1
Very interesting, thanks, I didn't know it was an active research area,
although that completely makes sense.

> That still leaves the issue that past-time communication may be decrypted in
> the future.

Honestly I didn't even think of this. Pretty freaky nonetheless.

~~~
dodobirdlord
Indeed. The best time to stop using RSA was decades ago, but now's also a good
time. It's speculated that one of the reasons the NSA is storing so much
encrypted data is that they think the contents will still be worth knowing
many years from now when they can finally decrypt it.

~~~
yyyk
"The best time to stop using RSA was decades ago..."

There are arguments against using RSA, but I don't see how Quantum Computing
is one of those. The common elliptic curve alternatives are even more
vulnerable to QC than RSA, while the new QC-resistant algorithms aren't
accepted yet.

So if QC is your concern, you have algorithms worse than RSA in one corner,
and not-entirely-tested algorithms in the other.

~~~
hannob
There's an argument to be made that maybe we'd be using NTRU by now if its
inventors hadn't decided to file patents for it. Luckily for us, the patents
have expired now.

Patents are the most reliable way to make sure your crypto isn't used widely
for 20 years...

~~~
yyyk
Not just crypto. Arithmetic coding, known to achieve better compression than
Huffman coding, went almost unused for years due to patents.

------
bjornsing
Anybody understand the details of how they compute the F_XEB fidelity
estimator?

I'm very perplexed that the main article states that "𝑃(𝑥_𝑖) is the
probability of bitstring 𝑥_𝑖 computed for the ideal quantum circuit", but
Supplementary Information section IV.C seems to argue that if the "qubits are
in the maximally mixed state" (i.e. the quantum computer doesn't work) then
"the estimator [𝐹_𝑋𝐸𝐵] yields zero fidelity" since 𝑃(𝑥_𝑖)=1/2^𝑛 for every 𝑖 in
this case. To me this doesn't make sense, since P(x_i)=1/2^n is clearly the
probability mass function of the empirical non-ideal quantum circuit output
distribution (and not the ideal one).

I've posted a question on the QC stack exchange:
[https://quantumcomputing.stackexchange.com/questions/8427/qu...](https://quantumcomputing.stackexchange.com/questions/8427/quantum-supremacy-some-questions-on-cross-entropy-benchmarking)

~~~
bjornsing
Turns out the reasoning in supplementary information IV.C is kind of flawed,
but the results follow trivially from the definition of expectation and
probability mass function. Wrote it up here:
[https://quantumcomputing.stackexchange.com/a/8565/8704](https://quantumcomputing.stackexchange.com/a/8565/8704)
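
For anyone else puzzled by this, here is a toy numerical check of both limits
(my own sketch, not the paper's code), using the linear version of the
estimator, F_XEB = 2^n · mean(P(x_i)) - 1, where P is the ideal circuit's
output distribution and the x_i are the observed bitstrings:

```python
import numpy as np

def f_xeb(ideal_probs, samples, n):
    # Linear cross-entropy fidelity: F = 2^n * mean_i P(x_i) - 1, with P
    # the *ideal* output distribution evaluated at the sampled bitstrings.
    return (2 ** n) * ideal_probs[samples].mean() - 1

rng = np.random.default_rng(42)
n = 10
dim = 2 ** n

# Stand-in for a random circuit's ideal output distribution: a Haar-random
# state gives Porter-Thomas-distributed probabilities.
amps = rng.normal(size=dim) + 1j * rng.normal(size=dim)
ideal = np.abs(amps) ** 2
ideal /= ideal.sum()

perfect = rng.choice(dim, size=200_000, p=ideal)  # ideal quantum computer
mixed = rng.integers(dim, size=200_000)           # maximally mixed state

print(round(f_xeb(ideal, perfect, n), 2))  # close to 1
print(round(f_xeb(ideal, mixed, n), 2))    # close to 0
```

The maximally mixed case lands at zero not because P(x_i) = 1/2^n, but
because the uniform samples average the ideal P over all bitstrings, which
gives exactly 1/2^n in expectation.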

------
Beltiras
I've never ever seen such a brutal sentence in science before:

"In summary, unlike the objections of the IBM group, so far I’ve found Gil’s
objections to be utterly devoid of scientific interest or merit."

Yikes.

------
vasili111
"Quantum supremacy is the potential ability of quantum computing devices to
solve problems that classical computers practically cannot." From:
[https://en.wikipedia.org/wiki/Quantum_supremacy](https://en.wikipedia.org/wiki/Quantum_supremacy)

IBM's response: "We argue that an ideal simulation of the same task can be
performed on a classical system in 2.5 days and with far greater fidelity.
This is in fact a conservative, worst-case estimate, and we expect that with
additional refinements the classical cost of the simulation can be further
reduced." From: [https://www.ibm.com/blogs/research/2019/10/on-quantum-suprem...](https://www.ibm.com/blogs/research/2019/10/on-quantum-supremacy/)

~~~
j1vms
> same task can be performed on a classical system in 2.5 days and with far
> greater fidelity.

Currently at a _much_ greater cost in time and space, which is why they are
saying "can be performed" not "has been performed".

> we expect that with additional refinements the classical cost of the
> simulation can be further reduced

Any such refinements would be certainly interesting to see, as they would
probably be very hard won.

------
riazrizvi
Worst case scenario, my password is going to be about 24 characters long, and
we’ll all be switching to 4096-bit encryption, right? Is this technology
really going to functionally change encryption or is it just going to be like
the switch to 64-bit computing, mostly a side-story?

------
jwilk
Archived copy, which can be read without JS enabled:

[https://archive.is/Mly19](https://archive.is/Mly19)

------
breatheoften
The Linear Cross-Entropy Benchmark is the framework used to estimate the
accuracy of the random-circuit sampling, i.e. to determine how likely it is
that the random circuits are being executed correctly on the quantum
computer.

In Scott Aaronson's blog post about this paper,
[https://www.scottaaronson.com/blog/?p=4372](https://www.scottaaronson.com/blog/?p=4372),
he shares some thoughts on whether this benchmark is itself easy or hard to
simulate on a classical computer.

> Ah, but at the end of the day, we only believe that Google’s Sycamore chip
> is solving a classically hard problem because of the statistical test that
> Google applies to its outputs: the so-called “Linear Cross-Entropy
> Benchmark,” which I described in Q3 of my FAQ. And even if we grant that
> calculating the output probabilities for a random quantum circuit is almost
> certainly classically hard, and sampling the output distribution of a random
> quantum circuit is almost certainly classically hard—still, couldn’t
> spoofing Google’s benchmark be classically easy?

He goes on to supply an argument that it's a hard test to spoof classically.

I'm wondering, though: has anyone addressed how hard it would be to spoof
this benchmark using a collection of smaller (potentially non-scalable)
quantum computers? E.g., if it were possible to spoof the benchmark with a
smaller quantum computer, would that undermine the argument in the paper that
this benchmark is evidence of quantum computing scalability?

------
derpherpsson
LOL.

I wrote about half of the Wikipedia article that Google is using to explain
how their quantum computer works
([https://en.wikipedia.org/wiki/Quantum_logic_gate](https://en.wikipedia.org/wiki/Quantum_logic_gate)).

I should probably have done that under a pseudonym, and not just anonymously
using my IP address.

At least the IP address I used for last weekend's work is my home's static
IP, which almost never changes... But still, I need a Wikipedia account!

------
ouid
I'm a little worried about something in quantum computing.

If I treat my quantum computer as just a channel for sending a message to
myself I think I can invoke the channel capacity theorem. Furthermore, I think
that if I try to read an entangled state, it's as if I am reading the entire
state all at once, so the maximum number of bits I should be able to read is
bounded by the logarithm of the ratio of the signal to the noise, where noise
scales with temperature. That number doesn't scale well. It suggests that if I
want to read an additional bit from a quantum computation, I need to drop the
temperature by half.

Furthermore, I introduce heat to my quantum computer when I input the state.
I must construct the input to my quantum computer from a uniform mixture, by
the no-cloning principle. This is going to generate heat, and the rate at
which I can dissipate that heat is bounded by the speed of light. This gives
me an upper bound on my temperature of O(1/t^3), assuming extremely optimal
heat dissipation. This means that the time required to cool my state to the
point where I can read n entangled bits is exponential in n, right?

~~~
comnetxr
No. I think this is where your logic is off: "Furthermore, I think that if I
try to read an entangled state, it's as if I am reading the entire state all
at once"

You can't read the entire state. A quantum channel where you send n-bits lets
you read n-bits, not exponential in n-bits.

There is indeed reduction of signal as you increase temperature. Under the
error correction theorem, you can amplify that signal back to full strength as
long as you are under a threshold temperature (but with ever-increasing
resources as you approach the threshold temperature.)

Also this: "This is going to generate heat, and the rate at which I can
dissipate that heat is bounded by the speed of light". Sure, the heat has to
be transferred out of the system, which takes time that is bounded by some
distance divided by the speed of light. But the qubit is on a 2D chip
surrounded by a 3D refrigerator. The distance doesn't necessarily increase
with increasing qubits.

~~~
ouid
>You can't read the entire state. A quantum channel where you send n-bits lets
you read n-bits, not exponential in n-bits.

An entangled state is different. Since I can read all of the bits outside of
each other's light cones, they cannot causally influence each other, which is
why I said _as if_. There is no such thing as "all at once", but I can
properly read each qubit before all of the others, as far as that qubit is
concerned, and it would still obey the correlation. The reading of an
entangled quantum state is indeed a single measurement.

>Under the error correction theorem, you can amplify that signal back to full
strength as long as you are under a threshold temperature (but with ever-
increasing resources as you approach the threshold temperature.)

2) I don't think that quantum error correction ultimately solves the problem
of heat. It tries to solve the problem of single bit/sign flips from not doing
your computation in a closed system. My argument assumes that the only things
that exist in the entire universe are the computer and the heat.

>"This is going to generate heat, and the rate at which I can dissipate that
heat is bounded by the speed of light". Sure, the heat has to be transferred
out of the system, which takes time that is bounded by some distance divided
by the speed of light. But the qubit is on a 2D chip surrounded by a 3D
refrigerator. The distance doesn't necessarily increase with increasing
qubits.

3) you're misinterpreting my statement of the problem. I don't have to get
each additional qubit just as cold as the original. Every qubit I add requires
me to get the entire quantum state to half the temperature that I had
previously. The geometry of your refrigerator here only matters in that it's
finite dimensional.

------
RivieraKid
I've just skimmed the Wikipedia article about quantum computing... and am I
wrong in thinking that it is not a big deal from a purely practical
perspective? Some cryptography would have to be updated and a few specialized
problems would become feasible.

To be honest, after the hype I was expecting there would be more applications.

~~~
mirekrusin
Optimising neural networks has great potential for quantum computers.

~~~
vamos_davai
Wouldn't it make more sense to optimize memristive technologies?

------
jaakl
I see it as very similar to 20th-century space tech. Yes, it technically
provides something unthinkable otherwise. But what is its real utility? Most
if not all of the biggest civil utilities, like remote sensing, radio
communication and navigation, are quite doable without it. It fails to
deliver the biggest dream of trans-galactic travel, and there is not even a
roadmap to it. It does not help much with our everyday ground-based
transportation. So in this century we start to think: maybe this is just a
big waste of energy, which now will need to be spent resolving far bigger
real issues. Like keeping our current planet habitable.

------
readams
Scott Aaronson's post on the latest announcement. Also covers the IBM and Gil
Kalai objections.

[https://www.scottaaronson.com/blog/?p=4372](https://www.scottaaronson.com/blog/?p=4372)

------
kensai
I kind of thought of Civilization: Beyond Earth when I first heard the term
"Quantum SUPREMACY".

[https://civilization.fandom.com/wiki/Supremacy_(CivBE)](https://civilization.fandom.com/wiki/Supremacy_\(CivBE\))

------
dooglius
There was some criticism of an early leaked draft (e.g. [0]); can anyone
comment on whether the issues raised have been addressed?

[0]
[https://news.ycombinator.com/item?id=21167368](https://news.ycombinator.com/item?id=21167368)

~~~
sanxiyn
The most important update is this, which was not in the draft: "The datasets
generated and analysed for this study are available at our public Dryad
repository".

In the coming days, I am sure many will perform independent analyses of these
datasets.

------
puranjay
Is there any site where I can learn about quantum computing without feeling
like a dumbass?

~~~
dawg-
[https://www.ibm.com/quantum-computing/learn/what-is-quantum-...](https://www.ibm.com/quantum-computing/learn/what-is-quantum-computing/)

------
hhs
Useful, thanks. Does anyone know if the authors will do a Q&A or AMA?

------
lanestp
For people more plugged in to quantum, I have a genuine question. Where did
the term "supremacy regime" come from?

------
person_of_color
Looks like I've missed the boat on this one. :)

------
kerng
Didn't IBM just dispute these claims?

~~~
317070
Yes, but a lot of people consider the disputes correct yet irrelevant to the
main point, especially since IBM's argument is that the computation could in
fact be executed on present-day hardware, although they have not actually
done it themselves (as it would be prohibitively expensive, which kind of
confirms Google's point).

See e.g.
[https://www.scottaaronson.com/blog/?p=4372](https://www.scottaaronson.com/blog/?p=4372)

------
pknerd
OK but `npm install` will still take a century to install dependencies.

------
toroszo
It's not "computing" if it's not Turing complete

------
Kaibeezy
_Our machine performed the target computation in 200 seconds, and from
measurements in our experiment we determined that it would take the world’s
fastest supercomputer 10,000 years to produce a similar output._

HFS!

~~~
josh2600
Sounds like that’s something that would be really hard to prove :/.

~~~
Kaibeezy
Another article said IBM scientists were not impressed and estimated it would
only take 2.5 years. But still, right?

~~~
ForHackernews
IBM is claiming 2.5 _days_ not years, apparently.

~~~
Kaibeezy
days, comma, one of those _sigh_

------
nickthemagicman
How fucked is https and cryptography if this turns out to be true?

~~~
skywhopper
This is nowhere close to breaking PKI algorithms. The quantum algorithm that
can break PKI (Shor’s algorithm) will require processors of hundreds or
possibly thousands of qubits. This 53-qubit processor is the current state of
the art, and there are huge engineering and interference challenges to
overcome as the qubit count increases.

As for where we are with Shor's algorithm, it works by figuring out the prime
factors of an integer (which is how RSA keys are constructed). In 2001, IBM
factored the number 15 (into 3 x 5) using a quantum computer with 7 qubits.
In 2012, researchers were able to factor 21 into 3 x 7, which is currently
the largest number factored using Shor's algorithm.
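
For intuition about what those demonstrations actually computed, the
reduction at the heart of Shor's algorithm can be sketched classically. Only
the order-finding step gets the exponential quantum speedup; the brute-force
loop below, and the function name, are my own illustration:

```python
from math import gcd

def shor_reduction(N, a):
    """Factor N via the order of a mod N (the core of Shor's algorithm).

    A quantum computer finds the order r exponentially faster; the
    brute-force loop below is exactly the part Shor's algorithm replaces.
    """
    r = 1
    while pow(a, r, N) != 1:  # smallest r with a^r = 1 (mod N)
        r += 1
    if r % 2:                 # need an even order; retry with another a
        return None
    x = pow(a, r // 2, N)
    if x == N - 1:            # trivial square root; retry with another a
        return None
    return sorted({gcd(x - 1, N), gcd(x + 1, N)})

print(shor_reduction(15, 7))  # [3, 5]
print(shor_reduction(21, 2))  # [3, 7]
```

For a 2048-bit RSA modulus the order r is astronomically large, which is why
the quantum period-finding step is essential.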

Given that cryptographic keys run into the hundreds and thousands of bits,
Shor’s algorithm isn’t going to be a threat anytime soon. But note, too, that
it’s focused on RSA, which is the original PKI algorithm. Newer algorithms
exist which aren’t threatened by Shor’s algorithm, and work is already
underway to develop new quantum-resistant PKI algorithms.

We’ll hear about it when low-bit RSA is easily cracked. And for several years
it will be good enough to just increase bit lengths to stay ahead of it. Given
all of that, I’d expect it’ll be 20-30 years before quantum computers are a
threat to today’s PKI. And by then, PKI will be 20-30 years on from what it is
now.

~~~
fsh
It is also worth noting that breaking RSA with Shor's algorithm requires
thousands of _error-corrected_ qubits. This corresponds to millions of
physical qubits with typical fidelities and nobody knows how to scale a system
to that size.
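
A rough feel for that multiplier, with every number an illustrative
assumption of mine (a surface-code-style cost of ~2·d² physical qubits per
logical qubit, and ballpark values for the logical-qubit count and code
distance d):

```python
# All numbers are illustrative assumptions, just to show the multiplier.
logical_qubits = 4000        # assumed order of magnitude for breaking RSA
code_distance = 27           # assumed error-correcting code distance d
physical_per_logical = 2 * code_distance ** 2  # ~ surface-code scaling

print(logical_qubits * physical_per_logical)   # millions of physical qubits
```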

------
hervature
Without having read the paper, is the claim that it would take 10,000 years to
emulate the results on a classical computer?

~~~
lake_vincent
I wouldn't say "emulate", because a classical computer wouldn't be trying to
solve it the same way as the quantum computer. The point (which would be
amazing if true) is that a classical supercomputer would take 10,000 years to
solve the same problem, no matter what method/algorithm it used.

~~~
sanxiyn
Of course the "any algorithm" part can't be proved, because that is
equivalent to resolving P vs PSPACE. (PSPACE is at least as powerful as any
quantum computer.) Just like everything in computational complexity, the
proof holds only under "standard assumptions".

------
ktamiola
I do remain skeptical.

~~~
onetimemanytime
Google is sending the computer in question and a team of their best scientists
to convince you. Hang tight!

------
lawlessone
We can't give everyone a quantum computer (yet).

But could we use this quantum processor to aid the rapid design of far more
powerful classical computers?

~~~
sanxiyn
Current quantum processors are not useful for anything, so they can't aid the
design of classical computers (among other things).

