
Where do we stand on benchmarking the D-Wave 2? - ashutoshs11
https://plus.google.com/+QuantumAILab/posts/DymNo8DzAYi
======
foobarqux
From Scott Aaronson:

> If one considers only the ~10% of instances on which the D-Wave machine does
> best, then the machine does do slightly better on those instances than
> simulated annealing does. (Conversely, simulated annealing does better than
> the D-Wave machine on the ~75% of instances on which it does best.)
> Unfortunately, no one seems to know how to characterize the instances on
> which the D-Wave machine will do best: one just has to try it and see what
> happens! And of course, it’s extremely rare that two heuristic algorithms
> will succeed or fail on exactly the same set of instances: it’s much more
> likely that their performances will be correlated, but imperfectly. So it’s
> unclear, at least to me, whether this finding represents anything other than
> the “noise” that would inevitably occur even if one classical algorithm were
> pitted against another one.

He continues:

> However, I neglected to mention that even the slight “speedup” on ~10% of
> instances, only appears when one looks at the “quantiles of ratio”: in other
> words, when one plots the probability distribution of [Simulated annealing
> time / D-Wave time] over all instances, and then looks at (say) the ~10% of
> the distribution that’s best for the D-Wave machine. The slight speedup
> disappears when one looks at the “ratio of quantiles”: that is, when one
> (say) divides the amount of time that simulated annealing needs to solve its
> best 10% of instances, by the amount of time that the D-Wave machine needs
> to solve its best 10%. And Rønnow et al. give arguments in their paper that
> ratio of quantiles is probably the more relevant performance comparison than
> quantiles of ratio. (Incidentally, the slight speedup on a few instances
> also only appears for certain values of the parameter r, which controls how
> many possible settings there are for each coupling. Apparently it appears
> for r=1, but disappears for r=3 and r=7—thereby heightening one’s suspicion
> that we’re dealing with an artifact of the minimum annealing time or
> something like that, rather than a genuine speedup.)

~~~
joe_the_user
This seems like a boilerplate criticism of an earlier D-Wave.

My quick read of the article gives me the impression that it and the most
recent D-Wave render this criticism obsolete.

Shouldn't the highest-rated comment actually relate to the article at hand?
(I'm not very qualified to evaluate it, but it seems to give specific points
that address these objections.)

~~~
d4vlx
Scott's post is from January 16th and was updated on the 26th; as far as I
understand, he is using the latest results:

[http://www.scottaaronson.com/blog/?p=1643](http://www.scottaaronson.com/blog/?p=1643)

~~~
joe_the_user
I don't see anything in the parent quotes that addresses the discussion of the
article, which goes into the relative performance of the D-Wave and custom
algorithms in detail, describing situations where the D-Wave is faster:

"But importantly, if you move to problems with structure, then the hardware
does much better. See Figure 3. This example is intriguing from a physics
perspective, since it suggests co-tunneling is helping the hardware figure out
that the spins in each unit cell have to be flipped as a block to see a
lower energy state..."

That isn't a blanket endorsement, but it is a statement that there are places
where they can show D-Wave is better (vs. Aaronson's "Unfortunately, no one
seems to know how to characterize the instances on which the D-Wave machine
will do best").

Edit: Looking at Aaronson's blog, he is using some very recent results but I
don't see any indications that these results are the same as those referenced
by the parent article here by the "Google AI Team". Maybe I'm missing
something.

------
codeulike
I like how one of the challenges with quantum computing is working out whether
any quantum computing is actually happening.

~~~
Tyr42
It's more that D-Wave has gone off and abandoned the approach that everyone
else is taking with their machine, so it's not clear if you can call it a
quantum computer.

Imagine if I had a box that didn't have a cpu, or anything like a cpu, but
could still solve a particular type of computing problem. It's kinda weird to
call it a computer when it lacks the part that computes, but it still gets
results for a subset of the problems.

On top of that, the D-Wave has far larger numbers, like 509 qubits for this
one, but it can't solve problems of the size we could if we had a 509-qubit
regular quantum computer. It's not really measuring the same thing.

~~~
rdtsc
Well, it sort of lives in both worlds. It does some problems better, some not
so much.

It is a quantum computer as much as you can build a specialized FPGA or ASIC
that does SHA256 hashes really well and then think, hmm, where can I use this
awesome feature and what problems can be reduced to computing SHA256 hashes?
Is that a "computer"? Well, kinda, maybe.

> but it can't solve problems of the size we could if we had a 509 q-bit
> regular quantum computer.

Well, yeah, because it is not really a universal computer, it is
controversial, and they are being sneaky about it.

------
kevingadd
Huh. So at this point it basically is believed to be an actual quantum
computer, but it can't consistently beat optimized traditional
implementations? Confusing. Does that mean that an ASIC running an efficient
traditional solver could beat the D-Wave?

~~~
gaze
Most folks in the field don't believe it's quantum. Benchmarking is a truly
stupid way to test something's quantumness. It's a bit like a company selling
fusion reactors that puts its reactor in a car, races it, and, when it kinda
sorta beats a competitor, says "clearly it had to be fusion."
Ideally you'd use the Zeno effect to watch the thing slow down as you
"turn off" its quantum behavior, but D-Wave hasn't given us that dial. There
are also papers from D-Wave arguing that the thing is quantum because in an
8-qubit spin chain they see an avoided crossing. I don't know what to think
about this, besides that the thing they are selling has many, many more qubits
than 8, and the whole problem in quantum computation is scaling, ya dig?

The ones I find the most convincing are the ones that try to find a
"signature" of quantum computation, by setting the system up with a certain
connectivity that will cool down into certain ground states with certain
probability iff it's quantum. I like these. There was one from Boixo, but I
guess Vazirani's group at Berkeley figured out that this test wasn't good
enough to differentiate. There was another, again with Boixo on it, from
Rønnow et al., which came up with another signature, and they decided that it
wasn't quantum.

If you've been doing superconducting qubits for a while, you notice that
they're using a qubit design which even the inventor has abandoned at this
point. If you're an experimental physicist, you notice that the whole thing is
a poorly designed scientific experiment. If you're just a casual observer, and
you listen to the language they use to describe their product, you notice that
their wording is very slimy. I don't believe it's quantum.

I also think a room temperature annealer ASIC could be very, very cool. An
analog computer sort of thing. Would LOVE to see this.

~~~
kevingadd
Informative comment, thanks.

By superconducting qubits I assume you're referring to a general approach for
creating qubits that you can use for computation (I'm also assuming the
'superconducting' refers to getting some material into a superconducting state
at which point the physical/quantum properties are useful). Is that right?

And when you say the inventor has abandoned their qubit design, do you mean
the person that came up with that particular approach? Or the credited
inventor of the d-wave? Or something else?

The small amount of stuff I've read examining their claims definitely made it
seem like they were weasel-wording things, but I've never understood enough
about quantum mechanics (or quantum computing) to attempt to examine them
directly. Thanks.

I hadn't realized that annealing could be done in an analog fashion - that's
pretty cool. I can see how in that case an ASIC could actually dramatically
outperform a digital model, then?

Also, "room temperature": does the D-Wave have to be cooled to some absurdly
low temperature to work? I can see how that would be necessary if it relies on
superconductivity.

~~~
vilhelm_s
> does the D-Wave have to be cooled to some absurdly low temperature to work?

Yeah, 20 millikelvin. It's cooled with liquid helium.

([http://www.nas.nasa.gov/quantum/quantumcomp.html](http://www.nas.nasa.gov/quantum/quantumcomp.html))

------
d4vlx
My biggest problem with these results is that they are comparing the D-Wave
machine against classical simulations that are built to solve the same
problems the D-Wave is best at solving. The gold standard in the quantum
algorithms world is to show that you can solve a given problem asymptotically
faster than any possible classical approach. This has been done for some
problems, notably unstructured search, which can be done in ~sqrt(n) quantum
queries versus ~n/2 classically. Factoring is faster on a quantum computer,
but it has not been proven to be faster than all possible classical
approaches.
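
For context on the sqrt(n)-vs-n/2 claim, a quick back-of-the-envelope
comparison (the query-count formulas are the standard ones; the sample sizes
are arbitrary):

```python
import math

# Unstructured search over n items: a classical scan needs ~n/2 lookups on
# average, while Grover's algorithm needs about (pi/4) * sqrt(n) queries.
for n in (10**4, 10**6, 10**8):
    classical = n / 2
    grover = (math.pi / 4) * math.sqrt(n)
    print(f"n={n}: classical ~{classical:.0f} queries, Grover ~{grover:.0f}")
```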

Another way to put it: this post is saying that problem x can be solved y
times faster on the D-Wave machine than on classical systems designed to act
like the D-Wave machine. There are many other classical approaches that could
be used to solve x. If the problem were significant enough, a best-known
classical algorithm could be devised and implemented on an ASIC. Could the
D-Wave machine beat that approach?

Or, for less significant problems, how does the cost of the D-Wave machine
compare to paying someone to devise a special-purpose algorithm per problem
and implement it in C, Java, C#, Haskell, or another fast language? How much
does it cost to pay someone to do this on something like TopCoder?

~~~
joe_the_user
Sure, but D-Wave is pretty much a "silver-not-gold" level quantum computing
device.

They are, more or less, clear that they aren't doing real, general-purpose
quantum computing but special-purpose quantum computing.

It would be great to have a general-purpose quantum computer. D-Wave isn't
spec'd as such, sorry.

It's possible that D-Wave is a clever special-purpose classical machine that's
just being hyped as a quantum machine. But the linked article seems to give
good evidence that D-Wave is what it claims to be; a special purpose quantum
computer, which is indeed different than a general purpose quantum computer.

One thing to think about is that D-Wave may be going just a little beyond
what's classically possible, but, apparently, in using quantum effects to go
that little bit further, they're learning something possibly interesting about
building a quantum computing device.

~~~
d4vlx
I don't have any issue with the fact that it is special purpose, but I have a
hard time calling it a success before it can be shown to beat every known
classical approach at something.

I love the fact that they are trying and are spreading awareness of quantum
computing but I take issue with some of their claims. They have a long history
of making unsubstantiated and sensationalized claims. They also have a
portfolio of patents many of which were arguably discovered by someone else or
are overly broad.

------
dmlorenzetti
_You can create much tougher classical competition by writing highly optimized
code that accounts for the sparse connectivity structure of the current D-Wave
chip._

Why should optimized code "account for the sparse connectivity of the D-Wave
chip"? If the contest is to solve an optimization problem, then the "classical
competition" doesn't need to know anything about the D-Wave. The only reason a
competing code would need to model the D-Wave itself would be if it's
simulating how the D-Wave solves an optimization problem-- in which case
comparing the times taken to do the optimization is hardly meaningful.

~~~
pilom
The problem is that the "non-optimized" solutions assume that every qubit can
talk to every other one. So they are modelling what the qubits should do in a
perfect case, and modelling the perfect case is much slower than modelling
what we actually have.

So the fastest ones are the ones that say: the best answer the D-Wave is going
to give will be X, so I don't need to be any more accurate than that; I'll
just model the bare minimum necessary to have the same accuracy as the D-Wave.
It's like lossy compression vs. lossless compression: if you are willing to
accept loss, you can do much better and faster. The D-Wave is lossy (because
it has a "sparse connectivity structure"), and thus the other solutions which
are also lossy are better comparisons on a speed benchmark.
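
For what it's worth, the classical baseline being discussed, simulated
annealing on a sparse Ising model, fits in a few lines. This is a toy sketch
(the graph, couplings, and cooling schedule here are made up, and real
Chimera-aware solvers are far more optimized):

```python
import math
import random

def anneal_ising(n, edges, couplings, steps=20000, t_start=5.0, t_end=0.05, seed=1):
    """Minimize E(s) = sum over (i,j) in edges of J_ij * s_i * s_j, s_i in {-1,+1},
    by simulated annealing. Only the listed edges are examined, so a sparse
    connectivity graph (like D-Wave's) anneals quickly."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    nbrs = [[] for _ in range(n)]  # adjacency list: each flip costs O(degree)
    for (i, j), J in zip(edges, couplings):
        nbrs[i].append((j, J))
        nbrs[j].append((i, J))
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        # Energy change if spin i is flipped.
        dE = -2 * spins[i] * sum(J * spins[j] for j, J in nbrs[i])
        if dE <= 0 or rng.random() < math.exp(-dE / t):    # Metropolis rule
            spins[i] = -spins[i]
    energy = sum(J * spins[i] * spins[j] for (i, j), J in zip(edges, couplings))
    return spins, energy

# Toy usage: an 8-spin ferromagnetic chain (J = -1 on every edge); the ground
# state is all spins aligned, with energy -7.
edges = [(i, i + 1) for i in range(7)]
spins, energy = anneal_ising(8, edges, [-1.0] * 7)
print(energy)
```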

------
keithwarren
Just when I feel like I am finally getting the hang of this computer geek
thing (it took 20 years), some whippersnappers go do something fancy and make
me feel inferior. Hail to the smart people!

------
danbmil99
Quantum Computing is the new Cold Fusion.

