
D-Wave 2000Q hands-on: Steep learning curve for quantum computing - feross
https://arstechnica.com/science/2019/03/d-wave-2000q-hands-on-steep-learning-curve-for-quantum-computing/
======
sfobiab
I've met the founder of D-Wave and have known them casually over the last few
years. Few people epitomize scam artistry and scientific buffoonery quite as
much as this person.

They are now on to their 2nd or 3rd post-D-Wave company and each time they
have taken on massive funding rounds at large valuations and then have moved
on to form a new company. Call me old fashioned, but something doesn't smell
right.

I hope D-Wave, Rigetti and others can move our field forward, but this company
more than any other feels like a big shell game.

~~~
bitL
If that person specializes in very early company stages (assembling a
knowledgeable team and funding, then moving on when all the pieces are in
place), it might be a valid strategy. How many investors have any clue about
quantum computing, or are even capable of understanding anything beyond simple
examples/hype well enough to finance it? The business world's level of
technical expertise is really low even among informed investors, and
buffoonery about almost anything is rampant.

------
snazzycalynx
I love that we have what is essentially part of an intro-level physics course
as part of a software review. This sounds like a totally different paradigm of
programming, which I guess it would need to be in order to take advantage of
quantum entanglement. Neat, though I guess it does suggest that quantum
computing is going to basically be its "own separate thing" for quite a while
(if not forever).

------
fsh
I don't get what the author is trying to say. It should be obvious to any
physicist that his "classical" algorithm for the Bragg grating is complete
nonsense (as he now admits in the editor's note). Furthermore, he does not
show that the problem the annealer is tasked to solve has anything to do with
the physics of a Bragg grating. Essentially he is comparing a wrong physics
simulation with the result of "quantum annealing" some random Hamiltonian that
looks vaguely similar. I don't see how anything can be learned from that.

~~~
drcode
I think the author deserves credit for honestly rolling up their sleeves,
using this opportunity to encode their own attempt from scratch, and reporting
the results...

Requiring an actually meaningful scientific finding from a journalist who is
merely reporting on a new programming API seems a bit picky.

~~~
fsh
In this case the author should make it very clear that what he is doing with
the API doesn't make much sense. Judging from the comments, most readers seem
to believe that he is describing a legitimate use-case for the machine. In
fact, the question whether quantum annealers can be at all useful at solving
any real problem is still very much open.

~~~
908087
The Dunning-Kruger effect is rampant in the comments of "tech news" websites,
particularly when it comes to complex subjects that people want to pretend to
understand.

Objectively false information is often upvoted to the top, while the people
trying to point out why that information is verifiably incorrect are buried in
downvotes.

The fact that quantum computing is something that self-proclaimed "futurists"
have latched themselves to guarantees that it will be even worse when that
particular topic is being discussed.

------
evrydayhustling
Q for quantum experts: is quantum computing the Manhattan project of
computing, where the effort to harness new physics simply requires unfettered
capital or tolerance for few incremental results with real applications? Or is
it actually even _more_ theoretical and complex, so that we don't have a good
historical example for the effort involved?

------
zippie
The standardized way of defining a universal quantum computer is through the
DiVincenzo criteria [1].

Quantum annealing is a bridge until we have universal QC, and it's a fairly
decent one for optimization problems where the solution set is discrete. I
personally think D-Wave is a good place to start to wrap one's mind around QC
and superposition states. IBM Q is great, too.
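To make the "optimization problems with a discrete solution set" framing concrete, here is a small classical simulated-annealing sketch over a QUBO (quadratic unconstrained binary optimization), which is the problem format D-Wave's hardware targets. This is a hedged classical analogy of the process, not quantum annealing, and the matrix `Q` is an arbitrary toy instance made up for illustration.

```python
import math
import random

# Toy QUBO instance (made up for illustration): minimize x^T Q x over
# binary x. The cross term penalizes setting both bits to 1.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def energy(x):
    """Evaluate x^T Q x for a binary vector x."""
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def anneal(n=2, steps=1000, seed=0):
    """Classical simulated annealing: random bit flips, cooling schedule."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    best = x[:]
    for step in range(steps):
        temp = 1.0 - step / steps + 1e-9      # cool from ~1 toward ~0
        cand = x[:]
        cand[rng.randrange(n)] ^= 1           # flip one random bit
        delta = energy(cand) - energy(x)
        # accept downhill moves always, uphill moves with shrinking probability
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            x = cand
            if energy(x) < energy(best):
                best = x[:]
    return best, energy(best)

print(anneal())  # a minimum-energy assignment, e.g. ([1, 0], -1.0)
```

The quantum annealer replaces the thermal bit flips with a physical process, but the input (a QUBO / Ising-style objective) and output (a low-energy bit assignment) look like this.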

In many ways, I prefer IBM Q if you are getting into QC for the long haul,
because many QC concepts, such as the CNOT, Z, X, and Hadamard gates and the
interaction between quantum registers and classical registers, are explicit.
D-Wave's API hides quite a bit of what QC is all about, and you have to fit
your problem (it has to be adiabatic in nature) into their system. I find this
to be rather confusing in the long run.

[1] https://en.m.wikipedia.org/wiki/DiVincenzo%27s_criteria
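For anyone unfamiliar with the gates named above, the underlying linear algebra is small enough to write out directly. This is a toy state-vector sketch (plain NumPy, not the IBM Q API): apply a Hadamard to the first qubit and then a CNOT, producing the entangled Bell state (|00> + |11>)/sqrt(2).

```python
import numpy as np

# Single-qubit gates as 2x2 unitaries.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
X = np.array([[0, 1], [1, 0]])                 # Pauli-X (bit flip)
Z = np.array([[1, 0], [0, -1]])                # Pauli-Z (phase flip)

# Two-qubit CNOT (first qubit controls, second is the target).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start in |00>, apply H to the first qubit (H tensor I), then CNOT:
# the result is the Bell state (|00> + |11>) / sqrt(2).
state = np.array([1, 0, 0, 0], dtype=float)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state
print(np.round(state, 3))  # [0.707 0.    0.    0.707]
```

The D-Wave API never exposes anything at this level; with the gate model you build circuits out of exactly these operations.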

------
rocqua
Towards the end, the article suddenly states:

> That is where the real power of annealing lies

From context, it seems he means 'quantum annealing'. Nowhere in the article is
it explained what this is. I vaguely recall hearing something about D-wave not
building the kind of quantum computer we've heard about (i.e. the kind that
can factor numbers). Is this 'quantum annealing', and what makes it different?

Is it better, worse, or plain different from the usual quantum computer?

~~~
hannob
From what I understood, yeah, it's a different kind of quantum computer, and
it's a cause of lots of confusion. I.e. when D-Wave says "we already have an
xxx-bit quantum computer", it sounds impressive, but it really is meaningless
compared to qubits in a "real" (and so far hypothetical) quantum computer.

Also until now D-Wave has been unable to show that their QC can do anything at
all faster than a classical computer. Which really puts this into perspective:
This is all early research. Something interesting may come out of it some day.
Or not.

~~~
klyrs
> I.e. when D-Wave says "we already have an xxx-bit quantum computer", it
> sounds impressive, but it really is meaningless compared to qubits in a
> "real" (and so far hypothetical) quantum computer.

Only, gate model quantum computers require error correction. Which doesn't
sound bad except that thousands of qubits are required to implement a single
error corrected qubit. I.e. when IBM says "we already have a 5 qubit quantum
computer", it sounds impressive, but it really is meaningless compared to
qubits in the theoretical framework that promises a speedup.

If we're willing to accept unimplemented theory in favor of scalable
architecture, then adiabatic and gate model are known to be polynomially
equivalent. So far, no quantum computing effort has shown superiority over
classical computers, but you don't see this kind of skepticism over IBM and
Rigetti's breathless headlines...

~~~
ianai
Do you mean error correcting a single qubit requires thousands of separate
qubits? As a paradox?

~~~
klyrs
Not at all. We have a similar situation in classical computing, where a "bit"
of error-corrected memory is composed of 3 or more bits of physical memory.
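The simplest instance of that classical analogy is triple modular redundancy: one logical bit is stored as three physical bits and decoded by majority vote, which corrects any single bit flip. A minimal sketch:

```python
def encode(bit):
    """One logical bit -> three physical copies."""
    return [bit, bit, bit]

def decode(bits):
    """Majority vote over the three physical bits."""
    return 1 if sum(bits) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1             # a single physical-bit error
print(decode(codeword))      # prints 1: the logical bit survives the flip
```

Quantum error-correcting codes are far more involved (errors are continuous and measurement disturbs the state), which is part of why the physical-to-logical qubit overhead is so much larger than 3.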

------
saagarjha
If anyone is looking to try out a quantum circuit, I found the IBM "Q
Experience" to be pretty good:
https://quantumexperience.ng.bluemix.net/qx/editor. It simulates simple
5-qubit quantum circuits and even lets you run them on a "real" quantum
computer (within limits).

------
Halluxfboy009
From past articles on Ars, as well as some online quantum physics courses from
a few years back, there was a lot of healthy skepticism about what the D-Wave
was actually doing. I know from the articles here that Chris at least thinks
there is some merit to the computer. And in the past few years there seems to
be a growing consensus that there is some sort of entanglement at work within
the machine.

------
WMCRUN
Not to nitpick, but a steep learning curve actually implies that progress
accrues quickly after starting (learning per unit time). A flat learning curve
implies difficulty.

~~~
saidajigumi
To nitpick: almost everyone I've heard use this in casual conversation (who
isn't nitpicking) uses "steep learning curve" as in physical analogy, e.g.
bicycling to a goal, rather than describing a function plot. A "steep" path is
harder going than a "flat" or "shallow" one.

------
nraynaud
Quantum computing: eventually we’ll factor numbers bigger than 15 :p

Can we do SAT solving on those?

~~~
klyrs
Looks like DWave can factor 10-bit semiprimes pretty reliably. But you're not
wrong about gate model :p

https://www.dwavesys.com/sites/default/files/14-1002A_B_tr_Boosting_integer_factorization_via_quantum_annealing_offsets.pdf
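The trick behind annealing-based factoring is to recast it as minimization: find p and q whose product hits N, i.e. minimize (N - p*q)^2, with p and q encoded in bits. As a hedged toy sketch (N and the ranges are made up, and the tiny search space is brute-forced rather than annealed):

```python
# Factoring posed as the kind of objective an annealer can target:
# minimize (N - p * q)^2 over candidate factors. The objective reaches
# zero exactly when p * q == N.
N = 899  # a 10-bit semiprime: 29 * 31

best = min(
    ((p, q) for p in range(2, 32) for q in range(p, 32)),
    key=lambda pq: (N - pq[0] * pq[1]) ** 2,
)
print(best)  # (29, 31)
```

On real hardware the bits of p and q become qubits and the squared objective is expanded into a QUBO, which is considerably messier; this only shows the framing.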

------
Chico11Kidlet
Even though I didn't understand it, I still found the article very
interesting. I think I could understand it if I studied for a while. Sadly, I
lack the time.

