
Google, D-Wave, and the case of the factor-10^8 speedup - apsec112
http://www.scottaaronson.com/blog/?p=2555
======
calhoun137
I found this remark from the linked pdf[1] very interesting:

"...it has also been shown that for the class of problems used in initial
benchmarks, the performance of the devices also correlates well with a
classical (mean-field) approximation of quantum annealing [4], which raises
questions about whether the devices can outperform classical optimizers."

In other words, if you ran a simulation of this device on a normal computer
and gave it the same problem to solve, you would get basically the same
efficiency.

[1]
[http://www.scottaaronson.com/troyer.pdf](http://www.scottaaronson.com/troyer.pdf)

~~~
mtgx
The question is whether they can still say that when D-Wave reaches, say,
4,000 qubits. What I'm asking is: will classical optimizers and classical
computers _always_ be able to be "roughly as fast" as D-Wave for that work, or
will D-Wave _inevitably_ surpass them by orders of magnitude in future
iterations?

~~~
johncolanduoni
That is the central open question. It's what Scott means when he talks about
asymptotics, and the conclusion of the Google paper is that we have plausible
evidence that, with enough qubits and a big enough problem, it will be faster
for one very specific problem compared to a non-optimal classical algorithm
(we have algorithms that are for sure better).

This probably sounds like a somewhat useless result (quantum computer beats
B-team classical algorithm), but it is in fact interesting because D-Wave's
computers are designed to perform quantum annealing and they are being
compared to simulated annealing (the somewhat analogous classical algorithm).
However, they only found evidence of a constant speedup (i.e. one that 4,000
qubits wouldn't help with), though a large one, compared to a somewhat better
algorithm (Quantum Monte Carlo, which ironically is not a quantum algorithm),
and they still can't beat an even better classical algorithm (Selby's) at all,
even in a way that won't scale.
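For concreteness, classical simulated annealing (the "somewhat analogous"
algorithm the comparison is against) can be sketched in a few lines. This is a
generic illustration, not the paper's implementation; the Ising energy
function, cooling schedule, and parameters here are all illustrative choices:

```python
import math
import random

def ising_energy(h, J, s):
    """E(s) = sum_i h[i]*s[i] + sum_{(i,j)} J[(i,j)]*s[i]*s[j], spins s[i] in {-1,+1}."""
    return (sum(hi * si for hi, si in zip(h, s))
            + sum(c * s[i] * s[j] for (i, j), c in J.items()))

def simulated_annealing(h, J, sweeps=2000, t_start=5.0, t_end=0.05, seed=1):
    """Minimize an Ising energy via Metropolis updates under a geometric cooling schedule."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    for sweep in range(sweeps):
        # High temperature early on lets the walk cross energy barriers;
        # low temperature at the end settles it into a (hopefully global) minimum.
        t = t_start * (t_end / t_start) ** (sweep / (sweeps - 1))
        for i in range(n):
            # Effective local field on spin i from h and its couplings.
            local = h[i]
            for (a, b), c in J.items():
                if a == i:
                    local += c * s[b]
                elif b == i:
                    local += c * s[a]
            de = -2 * s[i] * local  # energy change if spin i were flipped
            # Metropolis rule: always accept downhill moves, uphill with prob e^(-dE/T).
            if de <= 0 or rng.random() < math.exp(-de / t):
                s[i] = -s[i]
    return s, ising_energy(h, J, s)
```

The relevant contrast: quantum annealing replaces these thermal fluctuations
(the `math.exp(-de / t)` acceptance step) with quantum tunneling through
barriers, which is why the tall, thin energy barriers in the benchmark
instances favor the D-Wave machine over this kind of algorithm.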

Scott's central thesis is that although there could be a turning point past
2,000 qubits where the D-Wave beats our best classical alternative, none of
the data collected so far suggests that. So it's possible that a 4,000-qubit
D-Wave machine will exhibit this trend, but there is no evidence of it (yet)
from examining a 2,000-qubit machine. Scott's central gripe with D-Wave's
approach is that they don't have even a pie-in-the-sky theoretical reason to
expect this to happen, and scaling up quantum computers without breaking the
entire process is much harder than for classical computers, so making them
_even bigger_ doesn't seem like a solution. My personal frustration with
D-Wave's marketing is that the fact that their computers are super specialized
often gets lost; these computers lack the quantum equivalent of a complete set
of digital logic gates.

~~~
jacquesm
But that's the wrong comparison. The right one would be a special purpose FPGA
or ASIC based computer built specifically for the simulated annealing problem.

~~~
johncolanduoni
We mostly care about asymptotic speedups, so that doesn't really make a
difference, and using a more specialized classical computer has a similar kind
of effect to just using a bigger one. In any case, our best algorithms running
on mediocre classical hardware can still beat the D-Wave, so we don't really
need to stack the deck that much.

------
Fede_V
Scott Aaronson is fucking awesome. Whenever there's a new paper claiming
breakthroughs about quantum computing or P=NP, I go to his blog and wait for
him to clearly explain it.

He is the very rare case of a top notch scientist with a gift for explaining
things well.

------
bhouston
It isn't stated that clearly, but Scott Aaronson did make an admission in this
article that I do not believe he has made before -- he admitted the D-Wave
computer exhibited quantum effects:

"As far as I’m concerned, this completely nails down the case for
computationally-relevant collective quantum tunneling in the D-Wave machine"

I believe many of his previous blog posts had argued strongly that there was
no proof of any quantum effects in D-Wave computers, but I could be
misremembering.

~~~
shoyer
It's been clear that the D-Wave machine is exhibiting quantum effects for
several years now. See Aaronson's blog
([http://www.scottaaronson.com/blog/?p=1400](http://www.scottaaronson.com/blog/?p=1400))
and this Nature Communications paper
([http://www.nature.com/ncomms/journal/v4/n5/abs/ncomms2920.ht...](http://www.nature.com/ncomms/journal/v4/n5/abs/ncomms2920.html)).

On the other hand, it's still not clear that this results in computationally
relevant speedups. In my mind, this is not much of a shift in the status quo.

------
coldcode
Can someone explain this to those of us who have no clue what these
calculations are good for? If this machine is faster at doing something, what
would you use that something for?

~~~
cjmoran
The YouTube channel "In a Nutshell" released a good summary on this recently:
[https://www.youtube.com/watch?v=JhHMJCUmq28](https://www.youtube.com/watch?v=JhHMJCUmq28)

~~~
archgoon
That video is about gate quantum computers; D-Wave machines are NOT gate
quantum computers; they call their machines quantum annealing machines. The
complexity class of problems that quantum annealing machines can solve
efficiently is not known, nor is it known whether that class is equivalent to
what classical machines can do.

The result shows that the D-Wave machine is asymptotically faster than the
simulated annealing algorithm (yay!), which suggests that it is executing the
quantum annealing algorithm. However, the paper also explicitly states that
this does not mean the D-Wave machine is exhibiting a 'quantum speedup'. To
show that, they would need it to outperform the best known classical
algorithm, which, as the paper acknowledges, it does not.

What the paper _does_ seem to show is that the machine in question is actually
fundamentally quantum in nature; it's just not clear yet that this type of
quantum computer is an improvement over classical ones.

------
adrianbg
D-Wave's computers work as intended and exhibit "computationally relevant
quantum effects" but we still don't know whether the strategy they are using
will ever yield practical advantages. Other groups (Google, University of
Maryland) seem likely to achieve genuine quantum speedups over best-known
classical algorithms within the next few years.

~~~
raverbashing
Yes

But that's the evolution of technology. The first cars were not a significant
advantage over horses.

Sometimes it is much more efficient from the start (like transistors) but this
is rare.

~~~
adrianbg
Not being efficient at the start is a little different from "we have no reason
to think it will ever be more efficient".

------
sytelus
While Scott Aaronson is highly respected, I think we are rushing to put down
the D-Wave guys. They seem very conservative in their claims. We should
celebrate that we have come even this far: we are actually constructing
machines that were just research fantasies a few decades ago, and we are at a
point where we can even think about comparing them with the best we have
built. I don't think anyone in their right mind thinks D-Wave or other QCs
will replace classical machines in the next 5-10 years. It's the progress that
we need to be positive about, building on each other's work.

~~~
Strilanc
> _They seem very conservative in their claims._

Hahaha!

D-Wave's reputation is one of exaggeration and putting marketing before
science. That's how Scott ended up as de-facto "chief D-Wave skeptic": he kept
complaining about their exaggerations and the exaggerations of journalists
covering them.

(The paper being discussed now does actually seem conservative in its claims,
but it's by a team at Google, not D-Wave.)

------
touchofevil
I'm wondering if this will allow for a huge speedup in 3D rendering?
Specifically, unbiased Monte Carlo ray-tracing / path-tracing renderers like
Maxwell Render. Does anyone know of any work being done in this area of
quantum computing?

~~~
efangs
Quantum annealing could "potentially" provide a speedup for any optimization
problem that can be reduced to a spin Ising form, particularly quadratic
unconstrained binary optimization (QUBO).

Notice the emphasis on potentially, though. This paper only shows that 1) for
a particular class of problems the quantum annealer has a constant speedup
over one current classical algorithm, and 2) the quantum annealer scales
better for number partitioning than a few current classical algorithms.
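As a concrete illustration of the reduction mentioned above, number
partitioning maps to Ising/QUBO form directly: choosing signs s_i in {-1, +1}
to split a list of numbers into two equal-sum halves means minimizing
(sum_i a_i*s_i)^2, which expands into pairwise Ising couplings plus a
constant. A minimal sketch (exhaustive search stands in for the annealer here;
the helper names are mine, purely illustrative):

```python
from itertools import product

def partition_to_ising(a):
    """(sum_i a_i*s_i)^2 = sum_i a_i^2 + sum_{i<j} 2*a_i*a_j*s_i*s_j,
    so number partitioning becomes an Ising model with couplings
    J[(i, j)] = 2*a_i*a_j, no local fields, and a constant offset."""
    n = len(a)
    offset = sum(x * x for x in a)
    J = {(i, j): 2 * a[i] * a[j] for i in range(n) for j in range(i + 1, n)}
    return J, offset

def ising_energy(J, offset, s):
    return offset + sum(c * s[i] * s[j] for (i, j), c in J.items())

def best_partition(a):
    """Exhaustive minimization in place of an annealer (exponential, fine for tiny n)."""
    J, offset = partition_to_ising(a)
    best = min(product((-1, 1), repeat=len(a)),
               key=lambda s: ising_energy(J, offset, s))
    return best, ising_energy(J, offset, best)

# [4, 5, 6, 7, 8] splits perfectly ({4, 5, 6} vs {7, 8}), so the minimum
# Ising energy, i.e. the squared imbalance, is 0.
```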

~~~
eveningcoffee
_for a particular class of problems the quantum annealer has a constant
speedup over one current classical algorithm_ on a single-core CPU.

I think it would at least have been meaningful if they had compared these
algorithms against known parallel solutions on both CPU and GPU (and perhaps
on FPGA too, so we could see how it would potentially compare against a
specialized ASIC solution).

~~~
efangs
People above are right about parallelization not being a useful benchmark.

On the other hand, benchmarking against other special purpose hardware (like
an FPGA, ASIC, RQL, etc.) is definitely of interest.

~~~
eveningcoffee
Why is it not? It would show how easily this problem can actually be
parallelized in practice.

~~~
efangs
Because both D-Wave and the classical algorithms will benefit linearly from
parallelization.

If they want to make a claim about the scaling ratio between D-Wave and
classical algorithms, then this linear term would cancel.
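A back-of-the-envelope version of that cancellation, with made-up timings: if
an instance takes the annealer time t_annealer on one chip and the classical
solver t_classical on one core, and both sides are handed p parallel units
with ideal linear scaling, the ratio is independent of p:

```python
def speedup_ratio(t_classical, t_annealer, p):
    """Ratio of wall-clock times when both sides scale linearly across p units
    (p CPU cores vs p annealing chips); the factor of p cancels exactly."""
    return (t_classical / p) / (t_annealer / p)

# Illustrative numbers only: a 10^8x single-unit gap stays 10^8x at any p.
```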

~~~
eveningcoffee
The question is what the actual real-life speedup would be. If it is not
easily parallelizable, then it becomes a much more interesting finding.

I am not familiar with the algorithms used, but the name QMC would suggest
that this is an embarrassingly parallel problem, so my interest might just
stem from my ignorance, i.e. I am looking for assurance that my assumption
actually holds.

But if a quantum computer were demonstrated to have a huge constant speedup on
a problem that could not be easily parallelized (i.e. not this case, I
assume), then a cluster of classical computers could not close the gap.

~~~
db48x
The amount of time taken to solve a given problem in "real life" is
irrelevant.

This is for problems where a brute-force solution on a classical machine needs
O(2^n) computational steps. This is an exponential relationship; as the
problem size (measured by n) becomes large, the number of steps required
becomes vast. If each step takes a nanosecond, and n=30, then finding a
solution will only take about a second. Double the problem size to n=60,
however, and now it will take 36 years.

A quantum algorithm for the same problem might be able to run in
subexponential time. It might still be something horrible like O(n^7) and it
would still scale better than O(2^n): at n=30 it would take 22 seconds, but at
n=60 it would only take about 45 minutes.
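The arithmetic in those two paragraphs checks out; at one step per nanosecond:

```python
NS = 1e-9  # seconds per computational step

def brute_force_seconds(n):
    return (2 ** n) * NS   # O(2^n) classical brute force

def poly_seconds(n):
    return (n ** 7) * NS   # hypothetical O(n^7) quantum algorithm

# n = 30: 2^30 steps ~ 1.1 s, 30^7 steps ~ 22 s  (brute force still wins)
# n = 60: 2^60 steps ~ 36.6 years, 60^7 steps ~ 47 minutes
```

Note the crossover: at small n the "better" algorithm is actually slower, and
only the asymptotic behavior makes it win.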

This is why computer scientists use Big-O notation; the clock speed of the
computer is irrelevant if the algorithm scales badly. You never use bubble
sort because it scales badly; almost anything else you can come up with will
be better. Likewise, if you had a real quantum computer that could run Shor's
algorithm, then you'd never factor numbers using any other method; Shor's
algorithm would always win.

~~~
eveningcoffee
_The amount of time taken to solve a given problem in "real life" is
irrelevant._

I thank you for your thorough answer.

I am not discussing the importance of the quantum speedup (which was not
demonstrated), but rather the constant speedup compared to a single-core CPU
(and we know that in real life this difference does not even exist, but we can
pretend that it does, ok?).

Google "google 100 million times faster than" and you can already see
headlines popping up (for example, this one from Ars Technica:
[http://arstechnica.com/information-technology/2015/12/google...](http://arstechnica.com/information-technology/2015/12/google-nasa-our-quantum-computer-is-100-million-times-faster-than-normal-pc/);
they do not even mention that the same problem can actually be solved faster
on a single-core CPU than on D-Wave by using a different algorithm).

So I am considering a hypothetical situation where some sort of quantum
annealing computer does have a huge constant speedup compared to a single-core
CPU in solving one very important problem.

Imagine that we live in a nation state that does not have access to quantum
annealing technology (within a reasonable time frame) but has a
state-of-the-art silicon fab.

Could we build a classical cluster of the same speed? How many CPU cores would
we need? What if we use GPUs? What if we build a problem-specific chip (an
ASIC)?

What if there is no easily parallelizable solution? Could we then even match
its problem-solving speed?

I hope it is clear that even without an asymptotic speedup, there are specific
cases where a huge constant speedup would matter.

~~~
db48x
There's no way to answer that question without going into the details of every
specific problem you might want to solve, but since they've only demonstrated
a constant speedup, then yes, you can always build a normal computer with
comparable speed. Or you can improve your software; that's happened often
enough already.

------
biot
The phrasing of "factor-10^8 speedup" threw me for a bit. Is that a negative
speedup, meaning it's significantly slower? No, it's actually "10^8 faster".

~~~
tzakrajs
(-10)^8 :P

------
lovelearning
Any book that gives an overview of quantum computing to a programmer? A sort
of "thing explainer" for quantum computing?

~~~
gohrt
[http://www.amazon.com/Quantum-Computing-since-Democritus-ebo...](http://www.amazon.com/Quantum-Computing-since-Democritus-ebook/dp/B00B4V6IZK)

also free online

~~~
chubot
I have this book and it's great, though it's not quantum computing "for a
programmer" -- I don't think such a thing really exists. It was based on a
series of quantum computing lectures he gave for CS grad students.

If you have a solid undergraduate education in computer science theory
(computability, computational complexity, etc.) and probability, you can get
pretty far through the book and take away useful things. But simply being a
programmer doesn't mean you'll get very far.

I got pretty far but got lost toward the end. He states a LOT of results and
they become hard to remember without working through them more deeply. It's
all theory -- there's absolutely nothing like "programming" in the book.

~~~
tgb
I, too, loved the book, but think it's not what the asker was looking for. In
particular, it assumes a lot of knowledge about quantum computing (it doesn't
cover, say, Shor's algorithm or quantum Fourier transforms). Its aim is to
present information about complexity theory, not "Learn to Program Quantum
Computers in 24 Days".

There was a good course on Coursera some years back that I found quite
enjoyable and was my first introduction to quantum computing. It serves as an
OK intro to quantum mechanics, too, from the abstract point of view where we
just assume we have qubits versus the physics point of view where we start by
looking at atoms. Maybe the course is still available.

------
arthurcolle
TL;DR

"Finally, on the special Chimera instances with the tall, thin energy
barriers, the authors find that the D-Wave 2X reaches the global optimum about
108 times faster than Quantum Monte Carlo running on a single-core classical
computer. But, extremely interestingly, they also find that this speedup does
not grow with problem size; instead it simply saturates at ~108. In other
words, this is a constant-factor speedup rather than an asymptotic one."

~~~
selimthegrim
10^8, not 108!

~~~
randyrand
Just for clarification (for anyone),

10^8 or 100,000,000 is still a constant factor =)

~~~
selimthegrim
Well yes but Google spending some obscene amount of money for a 108x speedup
is beyond farcical even if it's not asymptotic :)

~~~
DannyBee
Google isn't spending money for even a 10^8 speedup at this point; Google is
spending money to figure out what will be possible in the future.

