
The Case Against Quantum Computing - niccl
https://spectrum.ieee.org/computing/hardware/the-case-against-quantum-computing
======
catpolice
The core argument here, concerning the number of variables that need to be
manipulated, isn't very clear to me - I can't tell if that's because it's an
unclear argument in general, or if the author tried to simplify it for
laypeople to the extent that it became unconvincingly vague for people who
know a thing or two about the subject area. Given the status of the author,
I'm willing to give him the benefit of the doubt (and also assume that some
clarity was lost in translation).

Here's an easier-to-follow argument I've heard before (also simplified for
laypeople) that might be close enough to the one the author intends to make to
be useful:

Quantum algorithms are supposed to be much more efficient than conventional
algorithms at certain kinds of tasks. That efficiency is calculated with the
assumption that each system starts in a state where it's prepared to execute
the algorithm. For conventional computers, that makes sense for a number of
reasons (e.g. data/program input is linear). For quantum computers, this
assumption doesn't necessarily make sense because it takes a significant
amount of work (in terms of measurements, etc.) to get the system (the
machinery and physical representation of the initial data) into the right
quantum state. In fact, the amount of work required scales horribly with the
size of the input (in terms of qubits). If you think of those state
preparation steps as part of the cost of the algorithm, the quantum algorithms
won't, in general, outperform the classical ones.

I've been out of the quantum computation loop for nearly a decade, so I have
no idea whether that's a fair way of presenting the problem, but hopefully it
helps someone understand the claims being made in the article.

~~~
greeneggs
The argument here is nonsensical.

For example, "With what exactitude must, say, the square root of 2 (an
irrational number that enters into many of the relevant quantum operations) be
experimentally realized? Should it be approximated as 1.41 or as
1.41421356237? Or is even more precision needed? Amazingly, not only are there
no clear answers to these crucial questions, but they were never even
discussed!" This is completely wrong. Of course the field has studied this
problem; people aren't idiots.

But your concern is also wrong. Preparing the initial state is easy; quantum
algorithms generally start in the all-0s state. Even quantum devices like the
D-Wave machine, for which there is considerable skepticism about its workings
and power, can easily be initialized in the all-0s state.

(With error correction, one needs to prepare the _encoded_ all-0s state, which
is indeed difficult to prepare. But in principle it is doable.)

~~~
resource0x
Doable in principle? My impression was that the author doesn't argue against
this. He claims it's unlikely to be doable in practice. That's the whole point
of the article.

~~~
danishcrows
His argument is pretty poor. He refers to the threshold theorem, which does
rely on certain assumptions, and then claims that these assumptions cannot be
satisfied in practice. This is apparently because "continuous quantities can
be neither measured nor manipulated exactly"; however, this ISN'T an
assumption of the threshold theorem. In fact, the whole premise of the
threshold theorem is that they can't be. It's a straw man argument, the same
one he's been making since the 1990s.

------
tezthenerd
Here is a way to see the fallacy of the _OMG, it's 10^300 variables, that's
crazy_ style of “argument”.

Consider a probabilistic _classical_ algorithm on 500 bits. Perhaps a Monte
Carlo simulation of an Ising model for example.

Note first that the most general probability distribution over the 500
classical bits takes 2^500 real numbers to specify. (You have to specify
P(000…0) and P(000…1) and… P(111…1)). [You should compare this to the 2^501
real parameters it takes to specify the quantum state of 500 qubits.]

To generate the most general such distribution, perhaps you are restricted to
using circuits where the gates act on at most n bits at a time. Each gate can
be described by a 2^n x 2^n bistochastic matrix comprised of (2^n - 1)^2 real
parameters. [You should compare this to the (2^n)^2 real parameters it takes
to specify a 2^n x 2^n unitary matrix for a quantum gate acting on n qubits.]

Obviously it's nuts to imagine you can generate all classical probability
distributions described by those 2^500 real parameters, particularly if you're
so mad as to think you are going to do it using a circuit comprised only of
these n-bit gates!

Therefore useful classical monte carlo computing is obviously _decades_ away.

(Oh, and please trust me, I'm well known in stuff that isn't quantum
computing.)

~~~
tezthenerd
Let me present it a different way.

Someone comes to you with two formal models of computing. Both models involve
representing the state of the computer as a vector of real numbers, they both
involve finite dimensional subsystems combined with a tensor product, both
involve gates defined over the reals also combined via the tensor product and
so on. That is, both models are just about evolution of a vector in some (very
high) dimensional real vector space according to gates acting on a small
number of subsystems. In fact these two models are _identical_, except for
the fact that in model A the readout procedure involves computing a property
of the output vector with the 1-norm, while in model B the 2-norm is used.

This is not an analogy, these are valid mathematical formulations of classical
and quantum computing, the correspondences (and differences!) are well
understood and rigorous.

Now you read an IEEE article that vociferously objects to the feasibility of
building a computer based on model B, but all the objections concern
properties of model B that it fully shares with model A. And you know model A
can already be very well approximated in the physical world, which means
reality was somehow not inhibited by those objections. To try to refute the
physicality of model B with an argument based on premises already satisfied by
model A is silly.

(Note that even if it were the case that complex numbers were necessary for
quantum computing, which they are not - see e.g. my book Q is for Quantum - you
can map the quantum density matrix on n qubits to a real vector over the basis
of Hermitian matrices).
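To make the 1-norm/2-norm distinction concrete, here is a minimal numeric sketch in Python (my own illustration, not from the thread): a stochastic gate in model A preserves the 1-norm of a probability vector, while a unitary gate in model B preserves the 2-norm of an amplitude vector.

```python
import math

# Model A: one probabilistic bit. State = probability vector; readout uses
# the 1-norm (probabilities sum to 1).
p = [0.3, 0.7]
stochastic = [[0.9, 0.2],
              [0.1, 0.8]]  # columns sum to 1
p2 = [sum(stochastic[i][j] * p[j] for j in range(2)) for i in range(2)]
assert abs(sum(p2) - 1.0) < 1e-12  # 1-norm preserved

# Model B: one qubit. State = amplitude vector; readout uses the 2-norm
# (squared amplitudes sum to 1).
h = 1 / math.sqrt(2)
a = [h, h]                  # uniform superposition
hadamard = [[h, h],
            [h, -h]]        # unitary
a2 = [sum(hadamard[i][j] * a[j] for j in range(2)) for i in range(2)]
assert abs(sum(x * x for x in a2) - 1.0) < 1e-12  # 2-norm preserved
```

Same vector-and-gates structure in both cases; only the norm used at readout differs.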

~~~
dhammack
Hey I'm not familiar with this way of comparing classical and quantum
computation. Can you point me to some more details? I have Nielsen's book but
don't remember seeing this analogy before!

~~~
tezthenerd
I presume it's explained in Scott Aaronson's book; it's implicitly there in
Nielsen and Chuang. But the best way to understand it is by example - try to
write out how you would describe classical probabilistic computation on two
classical bits to mimic the quantum circuit type of picture, and if you
succeed the generalization will be obvious.
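Working that exercise out in code (a sketch of my own, in pure Python): the state of two probabilistic classical bits is a length-4 probability vector over the outcomes 00, 01, 10, 11, and a gate is a stochastic matrix acting on it, exactly mirroring the quantum circuit picture.

```python
# State of 2 probabilistic classical bits: a probability vector over the
# 4 outcomes 00, 01, 10, 11 (entries sum to 1). Index = 2*bit1 + bit2.
state = [1.0, 0.0, 0.0, 0.0]  # both bits definitely 0

def apply_gate(matrix, vec):
    """Apply a stochastic matrix (columns sum to 1) to a probability vector."""
    return [sum(matrix[i][j] * vec[j] for j in range(len(vec)))
            for i in range(len(matrix))]

# A fair coin flip on the first bit: the classical cousin of a Hadamard gate.
flip_first = [
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
]

# A CNOT (control = first bit, target = second): a deterministic permutation.
cnot = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

state = apply_gate(flip_first, state)  # -> [0.5, 0, 0.5, 0]
state = apply_gate(cnot, state)        # -> [0.5, 0, 0, 0.5]: perfectly
# correlated bits, the classical shadow of a Bell state.
```

The quantum version swaps the probability vector for complex amplitudes and the stochastic matrices for unitaries; everything else, including the tensor-product structure, is the same.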

------
mcguire
" _While a conventional computer with N bits at any given moment must be in
one of its 2N possible states, the state of a quantum computer with N qubits
is described by the values of the 2N quantum amplitudes, which are continuous
parameters (ones that can take on any value, not just a 0 or a 1)._ "

If nothing else, that seems to me to be the clearest description of quantum
computing I've seen.

~~~
tezthenerd
[Typo note: that's 2^N quantum amplitudes.]

A clear description but potentially very misleading and one that will
certainly screw up your intuition about what to expect from quantum computers.

As pointed out below, a classical probability distribution of N bits is a 2^N
dimensional vector of real numbers. There are many very good reasons to think
of the quantum state as "more like" this classical vector than the physical
state of the "N classical bits" themselves.

Here is one:

\- If I send you the physical systems which encode N classical bits then sure
enough, I can communicate to you only N classical bits of information despite
it taking exponentially many real parameters to specify the distribution/state
I prepared them in.

\- If I send you the physical systems which encode N quantum bits then sure
enough, I can communicate to you only N classical bits of information despite
it taking exponentially many real parameters to specify the distribution/state
I prepared them in.

There are many other reasons to think of quantum states as not "inherently
real" and more like the classical probability distribution. A key one is that
both "instantaneously collapse" when you get information about the outcome of
an observation.

The issue of course is that while we know what the "real states of the world"
are for a conventional computer, nobody agrees on what (if any) they are for
quantum systems (though many constraints on such purported real states are
known; I write about some of them in _Q is for Quantum_).

~~~
wjhuggins
Actually, if you send a physical system which encodes N quantum bits and the
two of you share entanglement in advance, you can send up to 2N classical
bits.

(check out
[https://en.wikipedia.org/wiki/Superdense_coding](https://en.wikipedia.org/wiki/Superdense_coding))
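For the curious, the orthogonality at the heart of superdense coding can be checked numerically (a sketch of my own in plain Python, for the N = 1 case: one shared Bell pair, one transmitted qubit, two classical bits):

```python
import math

s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]  # (|00> + |11>)/sqrt(2), basis order 00, 01, 10, 11

def on_first_qubit(u, psi):
    """Apply a 2x2 matrix u to the first qubit of a 2-qubit state vector."""
    out = [0.0] * 4
    for b1 in range(2):
        for b2 in range(2):
            out[2 * b1 + b2] = sum(u[b1][j] * psi[2 * j + b2] for j in range(2))
    return out

I = [[1, 0], [0, 1]]
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]
XZ = [[0, -1], [1, 0]]  # the matrix product X times Z

# Alice encodes 2 classical bits by choosing one of four local operations.
encoded = [on_first_qubit(u, bell) for u in (I, X, Z, XZ)]

def inner(a, b):
    return sum(x * y for x, y in zip(a, b))

# The four resulting states are mutually orthogonal, so Bob (holding both
# qubits after Alice sends hers) can distinguish them perfectly.
for i in range(4):
    for j in range(i + 1, 4):
        assert abs(inner(encoded[i], encoded[j])) < 1e-12
```

Because the four encoded states are orthogonal, a Bell-basis measurement distinguishes them, recovering Alice's 2-bit choice from a single transmitted qubit plus the pre-shared one.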

------
lewisinc
Shame on this article. We should be researching everything within moral
limits. Everything. Research all of it so hard that there's no longer any
debate about whether this point or that has any standing, because it's so well
understood that we can just fucking move on. Regardless of the topic, if we
aren't sure whether it's of value, we learn as much as we can until we can
allow ourselves as a species to move on. That's the only way to prove any
point in the realm of curiosity that satisfies me. Articles that argue against
investigating an unknown for any reason other than "it would cause significant
harm" don't hold any weight and should be disregarded as being based in fear,
self-interest, or simple ignorance.

~~~
kisstheblade
He said that the experimental research was useful, so shame on you. What kind
of a silly strawman are you trying to build?

I can see from the responses here that the field is obviously quite volatile
and full of big egos, with practitioners taking criticisms so personally :)

------
nixpulvis
I still need to gain the intuition for why a quantum computer can operate on
some kinds of things "faster". I've read some of the math, but that did little
to satisfy me (I need to study it more closely).

But all of this seems to be in a tragic state at the moment, allowing rampant
misinformation and hype as to what these machines are actually capable of.

~~~
jcranmer
I'm going to butcher a lot of the physics here, but here's the gist of it that
should avoid some quantum myths:

The quantum state of a system is not described by a simple real probability
but by a complex number. This means that there is an additional degree of
freedom that you can adjust in a quantum state that doesn't change its
probability of occurring. The nature of entanglement means that there is
constructive and destructive interference when you go through certain quantum
operations.

Traditionally, a quantum word can be thought of as being a probability
distribution over the various possible values of the word. You can build
quantum gates that act like classical gates on the binary values, and it's
this property that often causes people to describe a quantum computer as
executing every possibility in parallel. The difficulty is that the result you
get is sampled from the probability distribution, which means you need some
sort of interference between the results to allow you to boost the probability
of picking the answer you want. A quantum speedup is only possible when you
can find the interference. In the case of Shor's algorithm (for factoring),
the interference is basically a quantum variant of the FFT; in the case of
Grover's algorithm (finding x such that f(x) = y), the interference is a
rotation that redistributes probability within a plane spanned by the uniform
state and the target state.
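The interference in Grover's algorithm can be watched directly in a toy simulation (my own sketch, not from the thread; for 2 qubits a single Grover iteration already concentrates all probability on the marked item):

```python
# Grover search over N = 4 items (2 qubits), marked item at index 2.
n = 4
marked = 2
state = [1 / n ** 0.5] * n  # uniform superposition: amplitude 1/2 each

# Oracle: flip the sign of the marked item's amplitude.
state[marked] = -state[marked]

# Diffusion ("inversion about the mean"): a[i] -> 2*mean - a[i].
mean = sum(state) / n
state = [2 * mean - a for a in state]

probs = [a ** 2 for a in state]
# The marked amplitude interferes constructively (to 1) and the others
# destructively (to 0), so a measurement returns the marked index.
```

For larger N the same two steps are repeated roughly (pi/4)*sqrt(N) times, which is where the quadratic speedup comes from.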

~~~
nixpulvis
OK, yes. This builds on the parts of my intuition that already exist. It's
about building a system (from combinations of quantum operators) that
maximizes the probability of the result you want. I'm still unsure of the
timeline of any quantum operator, and "finding the interference" seems
equivalent to saying something like "find the function" or something more
classical.

~~~
The_rationalist
Yes, I would really like to understand if this is different than curve
fitting. How isn't that classical? What about causality? Quantum mechanics to
me is just an effective model for dealing with uncertainty, AKA
probabilistic/meta-probabilistic reasoning. But it is about statistically
predicting the state of a particle or particles (position, spin, mass,
momentum, etc.). How this can model uncertainty in algorithms is something I
don't understand at all. My brain is very tempted to claim that quantum
computing is bullshit, and that is what it seems to a layman like me, because
I've never read a great explanation. Even if quantum computing worked, would
it be soft computing? By having a set of probabilities as results, that would
be an approximation, thus effectively allowing it to be less than exponential
for NP-hard problems and less correct. How does that differ from classical
heuristic algorithms that already have massive speed, like SAT solvers?

~~~
maththrowawtays
Firstly, you're going to run into walls if you try to fully understand why the
mathematics of quantum mechanics are how they are. Physicists have been going
at it for decades and from what I can tell the picture is not so clear yet.
What has been done though is many thousands of experiments showing that the
predictions the mathematical theory makes (even the wild, unintuitive ones)
are correct.

Secondly, I think you're going too deep on this whole 'prediction' thing. The
algorithms in quantum computing aren't trying to model something any more than
a classical algorithm does. Currently, the notion of a quantum algorithm is
just a series of special logic gates that act on qubits instead of bits. What
the previous poster said above is correct. Quantum computers are like
classical computers with randomization, except that instead of real
probabilities they have what are called complex amplitudes. The way the
complex amplitudes behave allows the states to interact slightly differently,
since they can now 'interfere' with each other and cancel each other out in
very specific situations. This gives quantum computers only slightly more
power than classical ones (with randomization). (Most) experts don't think
quantum computers can solve NP-hard problems. This power from interference
only helps in very specific problems.

------
zeroxfe
Had to read to the bottom to get to the (really weak) "case against Quantum
Computing":

> I believe that, appearances to the contrary, the quantum computing fervor is
> nearing its end. That’s because a few decades is the maximum lifetime of any
> big bubble in technology or science. After a certain period, too many
> unfulfilled promises have been made, and anyone who has been following the
> topic starts to get annoyed by further announcements of impending
> breakthroughs. What’s more, by that time all the tenured faculty positions
> in the field are already occupied. The proponents have grown older and less
> zealous, while the younger generation seeks something completely new and
> more likely to succeed.

I was really hoping for a real technical argument, not speculation about
bubbles or politics in academia. (Especially given that most of the real
research is done in industry.)

~~~
TheOtherHobbes
The technical argument is absolutely clear: quantum computing cannot work
because it relies on manipulating and measuring an absolutely astronomical
number of continuous variables with near-infinite precision.

The argument may or may not be correct, but it deserves something more
thoughtful than a dismissive response that doesn't even recognise the basic
point the author is making.

~~~
krastanov
This is one of the standard complaints (the gist of it being that quantum
computing is some form of analog computing, i.e. requiring near-infinite
precision). For researchers in the field it becomes rather frustrating to have
to repeat the same response without being heard, so I can understand the
annoyance expressed in the parent comment. For what it's worth, here is a good
explanation of how this complaint misrepresents the whole premise of quantum
computation (in particular, point 6):
[https://www.scottaaronson.com/democritus/lec14.html](https://www.scottaaronson.com/democritus/lec14.html)

~~~
core-questions
> the gist of it being that quantum computing is some form of analog computing

It absolutely is - at least, in the only practical, real-today, working
instantiation of it which is in the form of quantum annealing.

~~~
krastanov
This is an extremely misleading statement. Quantum annealing is most certainly
not what quantum computing is about (it is not particularly "quantum" either).
At most, you can argue that it is an important first step (which is also
doubtful). Most of the interesting hardware currently being developed has
nothing to do with quantum annealing and most researchers are fairly annoyed
at D-wave for originally pushing this nonsense (D-wave is the company that
started selling quantum annealing machines, which are little more than a
classical analog computer).

See the ion traps developed in the UK/Maryland or the transmon qubits at
Google/IBM/Yale for some well known prototypes of quantum computing hardware
(which are admittedly quite far from being practical or usable).

~~~
jackfraser
> Quantum annealing is most certainly not what quantum computing is about

Near as I can tell, it's the only form of practical quantum computation that's
available to end-users in any meaningful way. Everything else so far is
vaporware with ridiculously high error rates.

> (it is not particularly "quantum" either)

The qubits - Josephson junctions and superconducting loops - are indeed in
quantum superposition with each other if their claims are true. This is a
macroscopic quantum-mechanical effect; to call it "not particularly quantum"
belies a lack of understanding as to what they're doing.

> most researchers are fairly annoyed at D-wave for originally pushing this
> nonsense

D-Wave is making their system publicly available for free. Have you signed up
and taken a look at their demos? Admittedly it's a bit beyond my mathematical
ability but it's clear the basic seed is there for something that works; they
just need a bigger graph and more connections between qubits. There's no
reason to assume they won't have their own equivalent of Moore's Law at some
point where these things grow year over year.

> transmon qubits at Google/IBM/Yale for some well known prototypes of quantum
> computing hardware (which are admittedly quite far from being practical or
> usable).

Do they have _any_ results worth caring about yet? I don't think they're
anywhere near something working on a practical scale. You can buy a D-Wave
machine, or time on it - it's not vaporware, even if most people don't find it
useful. For the first few years of gate model machines people aren't really
going to find them all that useful for day to day things either - just like
computers in the 1940s and 50s, it was very specific, high-dollar applications
first, and things for the common man decades later.

~~~
krastanov
You mentioned that some of the details are beyond your math skills. The reason
I am so vocal against claims that D-Wave has a quantum computer is that I work
in this field (Yale Quantum Institute) and these details are what my work is
about.

The "qubits" Dwave has most certainly are not entangled and even they have
never claimed it. Dwave themselves have stopped claiming they have a quantum
computer, and have introduced this term "quantum annealing" which is something
that you can more efficiently do on a classical computer.

Yes, you can buy something from D-Wave, so it is not vaporware; it is just
snake oil. This is why it is frustrating for me to respond to your comments:
you are dismissing the very measurable progress my colleagues have made over
the last two decades (just look at qubit lifetimes, "break-even" error
correction procedures, generation of entangled pairs, etc.) while claiming
that a device incapable of sustaining any particularly interesting quantum
state should be held in high regard.

And while D-Wave's engineering efforts should be praised, their misleading PR
has led to misleading claims like yours, and for that at least their PR
department deserves to be shunned.

------
amelius
I'm wondering which will come first: practical fusion reactors or practical
QC.

Also, will we at some point be able to linearly convert energy into compute
power? In that case, would we still need QC?

(You might say that this conversion is already possible, but it requires human
interaction.)

~~~
TomMarius
Fusion reactor development was slowed down by the problem of
superconductivity, which seems to be close to solved now (theoretically, of
course).

------
maththrowawtays
Many of the comments have pointed out this flaw in the argument. I will try to
post my own summary of this flaw:

The problem is that this case against quantum computing due to the sheer scale
of the number of parameters is also a case against randomized computing, but
randomized computing certainly exists in the real world (a computer with a
coin flipper). A 'state' in randomized computing over N bits is specified by a
probability being assigned to each of the 2^N possible states of the N bits.

For example, to describe the distribution after flipping 100 coins, you
require 2^100 real numbers to specify the probabilities for each outcome,
since there are 2^100 combinations of heads and tails you can get. But it is
very clear that we can do this in real life. The point is, you don't need
2^100 parameters with full precision to flip a coin 100 times.
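To make the distinction explicit (a sketch of my own): the 2^N numbers live in the *description* of the distribution, while physically *sampling* from it costs only N coin flips.

```python
import random
from itertools import product

N = 100

# Describing the distribution explicitly needs 2^N entries -- hopeless for
# N = 100 -- so build the table only for a small case to show its shape:
small = 3
table = {bits: 0.5 ** small for bits in product((0, 1), repeat=small)}
assert len(table) == 2 ** small  # already 8 entries for 3 coins

# Sampling from the N = 100 distribution needs no such table: just N flips.
outcome = [random.randint(0, 1) for _ in range(N)]
assert len(outcome) == N
```

The exponential object is the probability vector we use to *reason about* the system; nature never has to write it down, and neither does the coin flipper.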

To be more precise, we can write the state as a 2^N-length vector, and our
operations (think logic gates) on the vector are called stochastic matrices
(this is the type of matrix required so that the operation maps probability
distributions to probability distributions).

Quantum computing is no different, except that instead of using real numbers
for probabilities, complex numbers are used and they are called amplitudes
instead. Also, instead of stochastic matrices for operations, they are now
unitary matrices. The point is, the scale of the number of parameters is the
same, the only thing that's changed is that it is complex numbers instead of
real numbers.

Now an argument might be that it is not possible to create gates with a very
specific probability distribution with just a discrete and not very precise
coin flipper, but it's actually possible to show that for randomized
computing, you only need a constant factor more flips to get exponentially
close to the distributions you want. A similar result was proved pretty early
for the quantum case, and it is a foundational result. So this is not really a
problem either.
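The comment refers to results about approximating target distributions and gates with imprecise primitives (in the quantum case, this is the territory of the Solovay-Kitaev theorem). As a purely classical illustration of the flavor, and not the theorem itself, the von Neumann trick below extracts exactly fair bits from an arbitrarily biased coin at only a constant-factor cost in flips:

```python
import random

def biased_flip(p=0.3):
    """An imprecise primitive: a coin with some unknown-but-fixed bias p."""
    return random.random() < p

def fair_bit(flip):
    """von Neumann trick: flip twice; HT yields 1, TH yields 0, retry on
    HH/TT. Since P(HT) = P(TH) = p*(1-p), the output is exactly fair."""
    while True:
        a, b = flip(), flip()
        if a != b:
            return int(a)

# Expected flips per fair bit is a constant, 1/(p*(1-p)) -- not a function
# of how many fair bits you want, and with no precision requirement on p.
samples = [fair_bit(biased_flip) for _ in range(20000)]
mean = sum(samples) / len(samples)  # close to 0.5 despite the biased coin
```

The point being illustrated: an imprecise physical resource plus a clever protocol can yield a precisely controlled distribution, at modest overhead.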

------
jvanderbot
In addition to the technical arguments against this article, I say: "When a
distinguished but elderly scientist states that something is possible, he is
almost certainly right. When he states that something is impossible, he is
very probably wrong."

[https://en.wikipedia.org/wiki/Clarke%27s_three_laws](https://en.wikipedia.org/wiki/Clarke%27s_three_laws)

------
adamnemecek
This read a lot like Clifford Stoll's 1995 essay "Why the Web Won't Be
Nirvana":
[https://www.newsweek.com/clifford-stoll-why-web-wont-be-nirvana-185306](https://www.newsweek.com/clifford-stoll-why-web-wont-be-nirvana-185306)

------
calibas
Has Google said what the error rate is yet for Bristlecone?

------
mathgenius
This is a lame article, bordering on click-bait.

It is totally reasonable to predict a quantum computing winter, but all the
supporting arguments here seem to be about how "hard-headed engineers" know
best.

> So the number of continuous parameters describing the state of such a useful
> quantum computer at any given moment must be at least 2^1000, which is to
> say about 10^300. That’s a very big number indeed.

Yes this is the whole point: if it wasn't this big, we could just simulate it
on a regular computer.

> To repeat: A useful quantum computer needs to process a set of continuous
> parameters that is larger than the number of subatomic particles in the
> observable universe.

Yep.

> it’s absolutely unimaginable how to keep errors under control for the 10^300
> continuous parameters that must be processed by a useful quantum computer.

I _do_ imagine it, every day. But I'm a professional, so maybe that doesn't
count?

> How many physical qubits would be required for each logical qubit? No one
> really knows...

There are plenty of such estimates around. Obviously it depends on the noise
rate of each qubit.

> all of the assumptions that theorists make about the preparation of qubits
> into a given state, the operation of the quantum gates, the reliability of
> the measurements, and so forth, cannot be fulfilled exactly. They can only
> be approached with some limited precision. So, the real question is: What
> precision is required? ... Amazingly, not only are there no clear answers to
> these crucial questions, but they were never even discussed!

There are boatloads of papers (and conferences) that discuss exactly this.
Obviously it is a vital question to answer!

~~~
kisstheblade
I think your reply is knee-jerk and "lame". Why don't you enlighten us,
instead of taking it so personally when somebody criticizes your field.

I thought the article was excellent and made good arguments.

~~~
eigenspace
How much working knowledge of quantum computing do you have? I can see why
someone not connected to the field might think the article makes good
arguments, but anyone connected at all to the literature can see that the
article is bullshit.

Many of the criticisms leveled here are either made up or misunderstood.

