
An Argument Against Quantum Computers - amaks
https://www.quantamagazine.org/gil-kalais-argument-against-quantum-computers-20180207/
======
stcredzero
A note for the savvy: A quantum computer _is not_ a magic bit-string that
mysteriously flips to the correct answer. An n-qubit quantum computer _is not_
like 2^n phantom computers running at the same time in some quantum
superposition phantom-zone. That's the popular misconception, but it's
effectively ignorant techno-woo.

Here's what really happens. If you have a string of n-qubits, when you measure
them, they might end up randomly in one of the 2^n possible configurations.
However, if you apply some operations to your string of n-qubits using
_quantum gates,_ you can usefully bias their wavefunctions, such that the
probabilities of certain configurations are much more likely to appear. (You
can't have too many of these operations, however, as that runs the risk of
_decoherence_.) Hopefully, you can do this in such a way that the biased
configurations are the answer to a problem you want to solve.

So then, if you have a quantum computer in such a setup, you can run it a
bunch of times, and if everything goes well, after enough iterations you will
be able to notice a bias towards certain configurations of the string of bits.
If you can do this often enough to get statistical significance, then you can
be pretty confident you've found your answers.
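A minimal sketch of that readout process in plain NumPy. The biased state
here is made up for illustration, standing in for the output of a real gate
sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical 2-qubit state whose amplitudes have already been biased
# by some gate sequence so that the configuration |11> dominates.
# (These amplitude values are invented for illustration.)
state = np.array([0.1, 0.2, 0.2, 0.95], dtype=complex)
state /= np.linalg.norm(state)      # quantum states are unit vectors

# Born rule: the probability of measuring a configuration is |amplitude|^2.
probs = np.abs(state) ** 2

# "Run the computer" many times: each run collapses to one of the
# 2^n configurations at random, weighted by probs.
samples = rng.choice(4, size=10_000, p=probs)
counts = np.bincount(samples, minlength=4)

for config, c in zip(["00", "01", "10", "11"], counts):
    print(config, c)
# |11> dominates the counts: that statistical bias is the "answer".
```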

[https://www.youtube.com/watch?v=IrbJYsep45E](https://www.youtube.com/watch?v=IrbJYsep45E)

[https://www.youtube.com/watch?v=wUwZZaI5u0c](https://www.youtube.com/watch?v=wUwZZaI5u0c)

EDIT: I rather like Isaac Arthur, but unfortunately, his Quantum Computing
episode is an example of exactly this kind of popular misconception. I've
called him out on it in comments.

[https://www.youtube.com/watch?v=wgCuKTN8sX0](https://www.youtube.com/watch?v=wgCuKTN8sX0)

EDIT: I can't find my comment anymore, and I've also discovered that I'm now
banned from the public Facebook group! Hmmm.

EDIT: It seems that Isaac did correct his video, kind of. He still seems to
advocate the 2^n parallelism, but then explains why that can't work around 18
minutes in.

~~~
wyager
Your answer doesn’t actually explain what we _can_ do on a quantum computer
that we can’t on a traditional computer.

The answer: we don’t know. We don’t even know if BQP is actually any bigger
than P.

However, we’ve figured out how to do some things quickly on quantum computers
that we haven’t ever figured out how to do quickly on classical computers,
like certain mathematically useful operations on abelian groups.

One specific example is that you can do Fourier transforms in
O(log(n)log(log(n))) rather than O(n log(n)), which is pretty cool. If your
vector space is too big to even represent in classical memory, you might still
be able to work with it on a quantum computer.
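To make the comparison concrete, here's a small NumPy sketch: the QFT
unitary on n qubits is just the 2^n-point DFT matrix (normalized so it's
unitary), so we can check what it computes against a classical FFT. The
O(n^2)-gate circuit itself is not shown; this only verifies the transform:

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Unitary of the quantum Fourier transform on n_qubits qubits.

    This is exactly the N x N discrete Fourier transform matrix
    (N = 2^n), scaled by 1/sqrt(N) so it is unitary. The quantum circuit
    realizes it with O(n^2) gates (O(n log n) for the approximate
    version), while a classical FFT on the same N-point vector costs
    O(N log N).
    """
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

U = qft_matrix(3)                    # acts on a 3-qubit (8-dim) state
assert np.allclose(U.conj().T @ U, np.eye(8))   # unitary, as required

# Applying the QFT to a state vector matches a normalized inverse FFT:
state = np.random.default_rng(1).normal(size=8) + 0j
state /= np.linalg.norm(state)
assert np.allclose(U @ state, np.fft.ifft(state, norm="ortho"))
print("QFT matrix agrees with the classical DFT")
```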

~~~
goatlover
> Your answer doesn’t actually explain what we can do on a quantum computer
> that we can’t on a traditional computer.

Why would there be anything you could compute on a quantum computer that you
can't compute on a classical one (provided the classical one is powerful
enough, or given enough time for the algorithms quantum computers are merely
more efficient at)?

Is there speculation that a quantum computer could be used for hypercomputing?

~~~
btilly
The fundamental observation behind quantum computing is that an irreversible
bit flip releases a minimum amount of heat, while a reversible one releases
none. See
[https://en.wikipedia.org/wiki/Landauer%27s_principle](https://en.wikipedia.org/wiki/Landauer%27s_principle)
for the theoretical limits to classical computing based on this. But there
are, to the best of our current knowledge, no theoretical limits to how much
computation can happen reversibly.
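A quick back-of-the-envelope on the Landauer bound (the constants are
standard; the bit-erase rate is a made-up illustration):

```python
import math

# Landauer's principle: erasing one bit irreversibly dissipates at least
# k_B * T * ln(2) of heat. Reversible operations can, in principle,
# avoid this floor entirely.
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T = 300.0            # room temperature, kelvin

e_bit = K_B * T * math.log(2)      # joules per erased bit
print(f"{e_bit:.3e} J per bit")    # ~2.9e-21 J

# Scale up: a hypothetical chip erasing 10^18 bits per second would
# dissipate this much power purely from Landauer heat:
power = e_bit * 1e18
print(f"{power:.4f} W")            # ~3 mW, far below real chips
```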

However the requirement that the operations all be completely reversible is
very strict. For example logic operations like "and" and "or" are off the
table. BUT when physicists like Richard Feynman looked into it, what DOES
happen is that your computation can progress in a quantum superposition of
states.
And there is no upper limit to how many states are in the quantum
superposition.
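For a concrete sense of how "and" gets rescued: the Toffoli (CCNOT) gate
embeds AND in a reversible three-bit operation. The gate itself is standard;
this tiny Python check is just an illustration:

```python
# Classical AND is irreversible: inputs (0,1) and (1,0) both produce 0,
# so the input can't be recovered from the output and Landauer heat is
# unavoidable. The Toffoli (CCNOT) gate instead maps
#   (a, b, c) -> (a, b, c XOR (a AND b)),
# which computes AND reversibly when the ancilla c starts at 0.
def toffoli(a: int, b: int, c: int) -> tuple:
    return (a, b, c ^ (a & b))

# With c = 0, the third output bit IS a AND b; a and b pass through.
for a in (0, 1):
    for b in (0, 1):
        assert toffoli(a, b, 0) == (a, b, a & b)

# Reversibility: the gate is a permutation of the 8 three-bit states
# and is its own inverse, so no information is ever erased.
states = {(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)}
assert {toffoli(*s) for s in states} == states           # bijection
assert all(toffoli(*toffoli(*s)) == s for s in states)   # self-inverse
print("Toffoli implements AND with zero erasure")
```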

So in essence what you get should be a very hard to program but unbelievably
parallel computer with no upper limit on computational speed.

The first demonstration that this could be useful for problems that people
care about was
[https://en.wikipedia.org/wiki/Shor%27s_algorithm](https://en.wikipedia.org/wiki/Shor%27s_algorithm)
for factoring integers. The fact that we do not know how to factor integers
quickly is an underpinning of public key cryptography algorithms like RSA.
However we know how to solve that if we had a quantum computer. And this is
just engineering, right?

Everyone recognizes that it is a very hard engineering problem. But the
minority opinion laid out in this article is that the engineering problem is
not just a practical problem, but the physics makes it intrinsically hard in a
way that cannot ever be worked around. No matter how well a quantum computer
works in theory, it is physically impossible for us to build and operate one
that does better than the classical computers that we already know how to
build.

~~~
deepnotderp
Okay, first of all, we are nowhere near Landauer's Limit right now, and will
likely never reach that point because we'll run into thermal noise "danger
zones" far before then.

Second of all, there are plenty of non-quantum forms of reversible logic (and
they are probably the _primary_ ones).

In the slightly less exotic category there is also adiabatic computing which
can actually be done in standard CMOS in most flavors.

------
tagrun
This illustrates the disconnect of mathematicians trying to do physics. Not
only is his error model unrealistic (you essentially need a frequency-
dependent power spectral density function to represent fluctuations in
control, one that decays according to a power law at high frequencies,
sometimes called 1/f noise even though the power doesn't have to be 1; you
also don't get that kind of strong spatial correlation in real systems,
where cross-talk among spins is a more realistic problem, but that is not
what he's modeling either), his argument is based on something that is just
wrong:

> that the effort required to obtain a low enough error level for any
> implementation of universal quantum circuits increases exponentially with
> the number of qubits, and thus, quantum computers are not possible.

That's what dynamical decoupling (DD) or pulse sequences are for: you can get
_arbitrarily_ high-quality quantum gates (typically, but not always, at the
cost of increasing the gate times; removing the most significant first-order
error typically increases the gate time by less than an order of magnitude,
think 2x-4x) without increasing the number of physical qubits at all. People
don't just rely on surface codes; anyone serious about implementing a quantum
computer uses surface codes _after_ DD to reduce the infidelity to the
threshold required for them. Which is why you don't need hundreds of physical
qubits to have a single robust logical qubit.

For those who are not familiar, DD is one of the oldest tricks in the bag;
it's nothing like a new, cutting-edge quantum error correction code. In fact,
the oldest form of DD, spin echo, precedes any real discussion of quantum
computers by a decade.

DD is possible essentially because quantum operations don't commute so errors
don't simply add up as they do with classical errors; this makes it possible
to obtain a better gate by carefully combining noisy gates such that
(significant) errors cancel.
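A toy illustration of that cancellation, using the oldest DD sequence (the
Hahn spin echo) against a static, unknown Z error. The matrices are the
standard ones; the simulation itself is just a sketch:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # pi pulse (bit flip)

def dephase(theta: float) -> np.ndarray:
    """Free evolution under an unknown static detuning: a Z rotation by
    an angle theta that we can neither control nor measure directly."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

rng = np.random.default_rng(2)
for _ in range(100):
    theta = rng.uniform(-np.pi, np.pi)   # a different error every run
    # Hahn echo: evolve, apply the pi pulse, evolve again. Because X and
    # Z rotations don't commute, the flip turns the second half of the
    # dephasing into the inverse of the first half.
    echo = dephase(theta) @ X @ dephase(theta)
    assert np.allclose(echo, X)   # net result: a clean X, for any theta
print("spin echo cancels the static dephasing error")
```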

~~~
stickfigure
Geezus. I have a degree in CSC from an engineering college. I did well in both
math and physics; hell, I even enjoyed them. I've been working professionally
for 25 years (with considerable success!).

I'm not sure if I could distinguish what you just said from a markov-chain-
based paper generator.

I feel old.

~~~
tagrun
You're probably around the same age as me then.

I wouldn't expect you to be familiar with it unless you've actually worked in
academia doing research in quantum information. This isn't stuff we teach in
class at all, even to PhD students; it's a part of research.

It'll also probably read like "from a markov-chain-based paper generator" to
you too, but if you're interested in how actual noise behaves, you can read
this for example:
[https://www.nature.com/articles/nphys1994](https://www.nature.com/articles/nphys1994)
(arxiv link if you don't have access:
[https://arxiv.org/abs/1101.4707](https://arxiv.org/abs/1101.4707))

While it's about flux qubits, the 1/f behavior is almost universal in all
current promising candidates, and DD works in virtually all quantum computers.

------
GuiA
I love that the journalist asked _”What do your critics say to that?”_ , and
that the researcher gave a meaningful, non-disparaging reply. This should be
expected from everyone doing research work, but is sadly far from the norm.

It’s along the same lines as, in a debate, asking each participant to sum up
their opponent’s point of view in a way that they agree with. A fundamental
tool for productive discourse.

Really enjoyed the interview.

------
kirrent
Scott Aaronson still has his $100,000 wager available for an actual refutation
of scalable QC. The impetus for the bet actually came from arguing with Gil
Kalai in the comments on a blog.

[https://www.scottaaronson.com/blog/?p=902](https://www.scottaaronson.com/blog/?p=902)

~~~
smaddox
Whether or not you can make a scalable quantum computer is beside the point,
though. The question is, can you make a quantum computer that is competitive
with a classical computer, speed- and/or cost-wise.

I suspect that any speedup imparted by clever uses of entanglement will be
counteracted by the fundamental need for error correction and averaging over
several runs. I look forward to being proven either right or wrong, and I
suspect I will be in my lifetime.

~~~
kirrent
> I suspect that any speedup imparted by clever uses of entanglement will be
> counteracted by the fundamental need for error correction and averaging
> over several runs.

Perhaps you don't understand what people mean when they say scalable quantum
computing? They mean that it scales with the error correction included.
Needing to average over several runs is already factored into the complexity
of quantum algorithms.

Also, whether you can make a scalable quantum computer isn't beside the point.
It's the entire point Kalai's trying to make. If you can make a scalable
quantum computer, we already know there are applications on which it will
outperform a classical computer in both performance and cost.

~~~
smaddox
Perhaps I don't, but your description doesn't seem to contradict my point that
scalability is distinct from being competitive with classical computing.
Perhaps people also implicitly mean "scales faster than a classical computer",
in which case I did indeed misunderstand.

On the topic of whether or not the need for averaging is already accounted for
in algorithmic complexity, my understanding is that it is not. This [1]
preprint from 2006 seems to support my understanding. If you have evidence to
the contrary, I would greatly appreciate a link.

[1] [https://arxiv.org/pdf/quant-ph/0612077](https://arxiv.org/pdf/quant-ph/0612077)

~~~
kirrent
Shor's original paper discusses the fundamental need for several runs and
gives the time complexity.

[https://arxiv.org/pdf/quant-ph/9508027.pdf](https://arxiv.org/pdf/quant-ph/9508027.pdf)

The paper you linked to is just saying that the need for several runs and 2006
readout error rates together led to awful scaling (though still better than
classical factorisation). That isn't surprising. Error rates in all parts of
even modern quantum computers are still too high to scale well. Especially
when repetition is called for.

Ultimately, complexity is calculated using logical qubits. Kalai doesn't think
that even with error correction codes we can get close enough. Most other
people think we can. The paper you linked to is just making the point that
we're definitely not there yet. When we do get there, many algorithms in BQP
will still need several runs (the exact same way algorithms in BPP do), but
that is already incorporated into those algorithms' complexity.
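A quick sketch of why repetition is only a modest cost: with per-run success
probability 2/3 (the standard BQP/BPP-style promise), the majority-vote error
shrinks exponentially in the number of runs. Plain Python, numbers purely
illustrative:

```python
from math import comb

def majority_error(p_correct: float, runs: int) -> float:
    """Probability that a majority vote over independent runs is wrong,
    given each run is independently correct with probability p_correct
    (runs should be odd so there are no ties)."""
    return sum(comb(runs, k) * p_correct**k * (1 - p_correct)**(runs - k)
               for k in range(runs // 2 + 1))

# BQP/BPP-style promise: each run is right with probability >= 2/3.
for runs in (1, 11, 51, 101):
    print(runs, majority_error(2 / 3, runs))
# The failure probability falls exponentially with the number of runs,
# so repetition multiplies the algorithm's cost by only a modest factor
# and is absorbed into the stated complexity.
```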

~~~
kirrent
*Kalai doesn't think that even with error correction codes we can get sufficiently close to a logical qubit.

------
Robotbeat
The flip side of this argument is that if a quantum computer can never show
"quantum supremacy" over a classical computer, then perhaps there are much
better ways of simulating quantum phenomena with classical computers than
we've yet discovered.

It's important to remember that the motivation for quantum computing came from
the realization that it's intractable to simulate quantum phenomena on
classical computers. So any claim that classical computers can do just as well
as quantum ones needs to butt up against this realization at some point (IMHO).

~~~
AlexCoventry
I think the flipside is, if we can't do it, we discover interesting new
physics.

I'm looking forward to it either way.

------
sgillen
Here is a paper Kalai published that obviously goes into a lot more depth than
the interview.
[https://arxiv.org/abs/1605.00992](https://arxiv.org/abs/1605.00992)

------
philipov
It's annoying how they refer obliquely to theorems without ever citing what
they are talking about. Is he referring to the Church-Turing thesis?

~~~
sgillen
I think he is referencing several theorems about fault-tolerant computing. I
think the reasoning goes that if we assume some level of correlated noise,
then these theorems tell us we need an error correcting code whose cost grows
as O(e^n), where n is the number of qubits in the program you want to check
for errors. I could be wrong though; see my comment above where I link to the
paper.

------
pankajdoharey
I don't buy this argument, for one reason and one reason alone. When we
started integrating silicon we had similar problems of noise and distortion
due to circuits packed so close together. There was always scope for
corruption of signals, and the problem got amplified when these circuits
started working at higher frequencies. A high-end processor these days can
run at 4GHz; such high-frequency data transactions already create distortion,
so we employed many schemes to circumvent these things, most notably using
different wire interconnects, going smaller, and using various chemical
shielding. His key argument revolves around noise/distortion, which is not a
good argument. If the math doesn't work, then it's a real problem. The
massiveness argument is also not correct, because it was made for computers
as well: we were told we would never be able to build smaller computers, and
yet we did.

------
roywiggins
I haven't read it yet, but this looks like a fairly accessible (but still
technical) explanation of the argument:

[http://www.ams.org/journals/notices/201605/rnoti-p508.pdf](http://www.ams.org/journals/notices/201605/rnoti-p508.pdf)

------
socalnate1
Relevant SMBC: [https://www.smbc-comics.com/comic/the-talk-3](https://www.smbc-comics.com/comic/the-talk-3)

------
reacweb
"There is a Hebrew proverb that says that trouble comes in clusters." In
France, we have a famous quote from Jacques Chirac "La merde, ça vole toujours
en escadrille !" (Shit always flies in squadrons! ).

------
lavamantis
Is there a distinction being made between the quantum computer being
impossible, and it being impossible without a future breakthrough in
engineering or physics?

~~~
jerf
If it really is exponentially more difficult to correct for noise as the
number of qubits increases, then it will be impossible to build a useful
quantum computer. We may not be able to say whether it's impossible after 24
qubits, or after 29, or after 32, etc., but we may be able to know that it's
impossible to build anything useful, if someone can rigorously mathematically
prove this.

On the other hand, if nobody can provide a mathematical proof that it's
impossible, it could just be engineering. Then again, "just engineering"
doesn't rule out that a useful quantum computer may require practically
impossible configurations (i.e., having to be in the middle of a sphere of
lead dozens of miles in radius cooled to near absolute zero is not
necessarily _impossible_, but isn't something we're going to build anytime
soon), or that we simply won't be able to figure out the requisite
configurations.

Alternatively, this guy will prove to be wrong and someone will just build a
quantum computer with a couple hundred qubits, at which point we may have
enough data to draw the curve observationally rather than via math and the
simplifications required to make it tractable, and maybe it'll be fine.

------
frgtpsswrdlame
Really interesting contrarian opinion although I wish he had expanded a bit
when he said this:

 _I agree that these are unusual and perhaps even strange lines of analysis._

------
zitterbewegung
Gil Kalai can have his contrarian opinion on quantum computers. But there is a
much simpler argument against quantum computers: that they will remain
infeasible long enough for traditional computers to catch up.

What I mean by traditional computers catching up is that we will see
innovations in traditional algorithms that either keep quantum computers
extremely pricey relative to traditional computers for a very long time, or
that bypass the things quantum computers are good at without violating any of
our current theories.

Currently, we have an enormous effort invested in our silicon architecture. We
seem to be hitting the limitations of silicon, and the industry seems to be
making a large bet on quantum computers to eventually allow it to proceed.
But if quantum algorithms take a long time to have useful implementations,
then we have a local maximum where quantum computers aren't close enough to
implement without investing a massive amount of money, while silicon just
becomes cheaper and cheaper. When Intel finally reaches the limit of
traditional computing, that doesn't mean silicon will stop getting cheaper.

Intel, IBM, Microsoft, D-Wave, Google, and other companies have small bets on
quantum computer engineering strategies that may or may not pay off. There
also seems to be a growing number of researchers publishing papers on the
theory of quantum algorithms, and there are cross-collaborations with
academia on quantum computation (research institutions basically work with
the quantum engineering departments in those companies).

If we find that quantum computers can only speed up a limited set of
algorithms, then they will probably end up as accelerator cards attached to
traditional computers. I expect an order of magnitude of inefficiency to
translate problems to, or do work on, quantum computers, whether due to
interconnects or the way we end up operating them. So we will see these
systems relegated to supercomputer workloads.

On the algorithms side, funding the research of quantum algorithms will
probably dry up in the way that string theory research has.

Hopefully we will see a large breakthrough that will make quantum computers
feasible. From my reading of the research, it seems like topological quantum
computers have the best chance.

~~~
cbHXBY1D
Why are topological quantum computers more resilient? Do they handle the
problem of decoherence better?

~~~
vtomole
Yes, they do handle decoherence better. Here is a video explaining the theory
behind the particles that are used to perform topological quantum computation
[0]. I emphasize that unlike superconducting qubits, quantum dots, ion traps,
photonics, etc., topological quantum computers have not been physically
realized, because non-abelian anyons have not been detected.

[0]:
[https://www.youtube.com/watch?v=hKFecm9NKbM](https://www.youtube.com/watch?v=hKFecm9NKbM)

~~~
Robotbeat
...they have, if you believe this (fairly convincing!) set of evidence showing
perfect Andreev Reflection (robust to changes in gate potential), which should
only be possible (in a way robust to changes in gate potential) in the
presence of Majorana states (i.e. a type of non-abelian anyon):
[https://quantumfrontiers.com/2017/11/21/majorana-update/](https://quantumfrontiers.com/2017/11/21/majorana-update/)

