D-Wave might be the belle of the ball today, but once the crowd turns, it could ruin an entire industry for a very long time. That's why skepticism is important when it comes to their claims.
Quoting Scott Aaronson's post:
As I said above, at the time McGeoch and Wang’s paper was released to the media [...] the “highly tuned implementation” of simulated annealing that they ask for had already been written and tested, and the result was that it outperformed the D-Wave machine on all instance sizes tested. In other words, their comparison to CPLEX had already been superseded by a much more informative comparison—one that gave the “opposite” result—before it ever became public. For obvious reasons, most press reports have simply ignored this fact.
"The majority of that post is simply factually incorrect.
I used to find this stuff vaguely amusing in an irritating kind of way. Now I just find it boring and I wonder why anyone listens to a guy who’s been wrong exactly 100% of the time about everything. Update your priors, people!!
If you want to know what’s really going on, listen to the real experts. Like Hartmut. "
 has little information, and no numbers or comparisons to other techniques. All it says is "Quantum Computing is Cool! And we're working on NP-hard problems." It does say that they have a quantum computer from D-Wave, but they make no claims as to its actual power.
Adding emphasis to the main objectives from  "to study how quantum computing might advance machine learning." They're still researching things.  is a clear attempt to leverage the respectability of Google without actually addressing Aaronson's points.
"Even if D-Wave managed to build (say) a coherent 1,024-qubit machine satisfying all of its design specs, it’s not obvious it would outperform a classical computer on any problem of practical interest. This is true both because of the inherent limitations of the adiabatic algorithm, and because of specific concerns about the Ising spin graph problem. On the other hand, it’s also not obvious that such a machine wouldn’t outperform a classical computer on some practical problems. The experiment would be an interesting one! Of course, this uncertainty — combined with the more immediate uncertainties about whether D-Wave can build such a machine at all, and indeed, about whether they can even produce two-qubit entanglement — also means that any talk of “lining up customers” is comically premature."
* D-Wave built the machines
* Aaronson concedes that D-Wave has achieved entanglement with a quantum annealing system for its full 512 qubits.
* there were two big sales (Lockheed, Google)
My own prediction from 2006
There will be a quantum computer with over 100 qubits of processing capability sold either as a hardware system or whose use is made available as a commercial service by Dec 31, 2010.
128 qubit system was sold in 2010.
Entanglement during quantum annealing was demonstrated in the USC paper on the 128-qubit system that was sold.
I think the "skeptics" will bash D-Wave until, and probably even after, their systems are running circles around traditional computing.
> For years, I tirelessly repeated that D-Wave hadn’t even provided evidence that its qubits were entangled—and that, while you can have entanglement with no quantum speedup, you can’t possibly have a quantum speedup without at least the capacity to generate entanglement. Now, I’d say, D-Wave finally has cleared the evidence-for-entanglement bar—and, while they’re not the first to do so with superconducting qubits, they’re certainly the first to do so with so many superconducting qubits.
In other words, this is the first actual evidence that D-Wave even has the potential for quantum speedup.
edit: a more serious answer is in the article: entanglement or not, the algorithm naturally converges toward the solution. And they did not even check convergence speed (I think the quantum version is meant to be faster); instead they checked a statistical criterion that differs between the quantum version and the classical one.
That's exactly what Wernher von Braun did with his early rocket designs. He (intelligently and quickly) iterated, keeping the parts that worked and getting rid of the parts that didn't.
That sounds interesting. Any link?
The D-Wave is not a quantum computer (as the term is normally used), and was never meant to be one.
Well, that something else is working, and although it's a nice physics experiment, it still lacks any real-world utility.
1. Assign all states of each variable equal probability.
2. Sample the living crap out of the search space with some sort of energy function for every possible configuration, scoring the discrete values of the variables of each individual sample by Boltzmann weighting of the energy function.
3. Every so often, update the weights for selecting each variable's potential values using the accumulated scores for each variable generated during step 2.
4. Repeat steps 2 and 3 until each variable converges to a single state.
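A minimal Python sketch of the steps above — my own reconstruction, with made-up function and parameter names (for what it's worth, this resembles what is now called an estimation-of-distribution method):

```python
import math
import random

def boltzmann_search(energy, domains, n_samples=2000, n_rounds=40, temp=1.0):
    """energy: callable on a tuple of variable values.
    domains: list of lists, the possible states of each variable."""
    # Step 1: assign all states of each variable equal probability.
    weights = [[1.0] * len(d) for d in domains]
    for _ in range(n_rounds):
        scores = [[0.0] * len(d) for d in domains]
        # Step 2: sample the search space, scoring each sampled value by
        # the Boltzmann weight of the whole configuration's energy.
        for _ in range(n_samples):
            idxs = [random.choices(range(len(d)), w)[0]
                    for d, w in zip(domains, weights)]
            config = tuple(d[i] for d, i in zip(domains, idxs))
            boltz = math.exp(-energy(config) / temp)
            for var, i in enumerate(idxs):
                scores[var][i] += boltz
        # Step 3: fold the accumulated scores back into the selection weights.
        weights = [[w + s for w, s in zip(ws, ss)]
                   for ws, ss in zip(weights, scores)]
    # Step 4 is a loop until convergence; here we just return the modal state.
    return tuple(d[max(range(len(d)), key=w.__getitem__)]
                 for d, w in zip(domains, weights))
```

On a toy objective like (x-2)^2 + (y+1)^2 over small integer grids, this homes in on the basin around (2, -1) quickly, but does nothing to polish an answer beyond that — consistent with the lessons described next.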
I never published anything but I learned three things from this process.
1. It worked like gangbusters to find the space surrounding global optima of reasonably complex functions
2. It worked like crap to refine really good solutions into really great solutions.
3. It was horribly dependent on the underlying representation of the variables (i.e. if you mapped the input variables to a spin glass, it was just awful)
That and I suspect somebody already has a fancy name for exactly what I did back then...
The position of most of the scientific community at the outset regarding D-Wave quantum computers was that it was uncertain what was going on at all. Nobody knew for sure if the D-Wave computers were really using quantum entanglement when they ran or not. Obviously a computer that does computations without doing at least some of the weird things allowed by quantum mechanics wouldn't be much of a quantum computer.
It appears that the D-Wave computers could indeed be taking advantage of entanglement. However since the D-Wave computers are not very isolated from their environment, the delicate effects they attempt to harness are sometimes disrupted when the computer interacts with its environment (aka decoherence to use the Quantum Mechanics term).
Overall it looks like D-Wave is making some progress on demonstrating their computer does really harness what's allowed by quantum mechanics. This is exciting, though ironically they have not caught up to their own overstated claims of what their machine does. Perhaps with more work they can better isolate their computer from its environment and graduate from quantum annealing to reversible adiabatic quantum computing. Or maybe someone else working with some other physical system which has an intrinsically lower coupling to its environment might beat them to it. An exciting time for the field nonetheless.
Getting a speed up on a particular class of problem could have a great deal of practical importance, but building a scalable computer that fully takes advantage of everything allowed by the laws of physics is the holy grail of quantum computing, and it doesn't look like D-Wave is there quite yet. Still an exciting time for the field nonetheless.
 : http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...
If you are just looking for the theoretical basics, this YouTube series is a good place to start.
While I do find these videos approachable, they are fairly math heavy. Most of the lectures I've seen explain things at a level that almost anyone can get something out of, and you don't miss out if you skip over the stuff that is beyond you.
For most of the talks, the math involved is only algebra and vectors, although the vectors are written in bra-ket notation, which may throw you off.
The main thing that's needed to follow along is familiarity and comfort with basic linear algebra.
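For reference, the bra-ket notation mentioned above is just shorthand for column vectors, so a single-qubit state is nothing more exotic than a unit vector in C^2:

```latex
% |0> and |1> are the two basis states of a single qubit:
\[
  \lvert 0\rangle = \begin{pmatrix}1\\0\end{pmatrix}, \qquad
  \lvert 1\rangle = \begin{pmatrix}0\\1\end{pmatrix}, \qquad
  \lvert \psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle
                     = \begin{pmatrix}\alpha\\ \beta\end{pmatrix},
\]
% with complex amplitudes satisfying |alpha|^2 + |beta|^2 = 1.
```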
It's very exciting when something comes along that seems genuinely magic. It's working and I haven't the faintest idea how. I love magic!
- Different hardware. This boils down to different gates/primitive operations, ala . There are currently lots of different proposed methods of how to build the hardware, but I don't know that much about how the physical circuits in a traditional computer work, either.
- Different algorithms. See e.g. Shor's . I think there are only one or two other problems that have been proven to be more efficiently solvable by quantum means, so a lot more work needs to be done in this area.
The field is in its extreme infancy, and I think it will always be more challenging to program quantum computers by a fairly wide margin.
That article was not for the faint of geek.
Can someone explain to me what the hell the whole story is about?
D-Wave has attracted scepticism because in the many press releases they've issued over the last 14 years, they have repeatedly claimed to have made breakthroughs in quantum computing that would be perhaps 200 years before their time, if they weren't lying. Like a Chinese manufacturer saying they'll be selling an $11 phone next spring that has petabytes of ram, and can run a simulation of a human brain in real time, which the phone uses for accurate voice transcription.
D-Wave have finally, finally demonstrated a machine with a working qubit system. But the qubits are strongly coupled to the environment, and the machine runs many times slower than a classical electronic computer, even on the fake benchmarking problem that is the only thing the D-Wave machine can solve.
It is an important step on the road to usable quantum computing. It is not a general-purpose quantum computer. It can't be used for prime factorization, for instance.
When the Matrix was written, the Lisp source code could not hope to compute the position of every quantum particle in the simulated universe in real time. So entanglement was invented as a means of linking quanta that were outside of each other's relativistic cause-effect bubbles, thus reducing the number of actual positional calculations needed across the universe to a manageable amount.
However, the Perl code used to stitch it all together allowed entanglement in some cases within relativistic reach. The hope is that this will allow us to force a buffer overflow and introduce our own qubits to be processed on the Matrix substrate.
This will eventually lead to a sex party in a cave hundreds of miles below the surface. No quantum mechanics professors or students will be invited to the party
If you are able to get substantial numbers of "quantum bits" to stay entangled with one another, and hence behave in all those counterintuitive ways quantum things do, then you can (in principle) use the resulting machinery to perform some kinds of computations faster than any "conventional" computer can do them.
Making that actually happen is an enormous engineering challenge. No one's been able to do it with more than a very few bits, yet.
If they did, it would be a big deal: in particular, something called Shor's algorithm allows you (in principle) to factorize numbers efficiently on a quantum computer, and a big enough quantum computer would effectively break RSA encryption. As an indication of the progress that's been made in practical quantum computation: the first ever demonstration of Shor's algorithm in practice was in 2001 when some researchers at IBM managed to use it to find that 15 = 3x5; there was a major breakthrough in 2011, when the algorithm was used to find that 21 = 3x7.
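To make concrete how small those demonstrations were: the quantum part of Shor's algorithm only finds the period r of a^x mod N; the factors then fall out classically via gcd. Here is a sketch where the period is found by brute force (the exponentially hard step the quantum computer is supposed to replace):

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). Brute-forced here; this is
    the step the quantum part of Shor's algorithm speeds up."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical wrapper of Shor's algorithm for a chosen base a.
    Returns a factor pair of n, or None when a happens to be unlucky."""
    g = gcd(a, n)
    if g != 1:
        return tuple(sorted((g, n // g)))  # lucky: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None                        # odd period: retry with a new a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None                        # trivial square root: retry
    p = gcd(y - 1, n)
    return tuple(sorted((p, n // p)))

# The two record-setting demonstrations mentioned above:
#   shor_factor(15, 7) -> (3, 5)
#   shor_factor(21, 2) -> (3, 7)
```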
Some other varieties of public-key encryption are not, so far as anyone currently knows, broken once we have quantum computers. But exactly what quantum computers are capable of is a very open question.
There's a company called D-Wave that, for years now, has been touting what they claim is a quantum computer of an unconventional design, very different from what most quantum computing researchers have been trying to do (or trying to analyse the capabilities of). They have claimed that their machine works with hundreds of (qu)bits, whereas no one else is using more than, say, ten. They have attracted a lot of media attention and a lot of money.
(Their machine allegedly does something called "adiabatic quantum computing", which somewhat resembles the non-quantum optimization process called simulated annealing. It may or may not actually be more powerful than simulated annealing.)
Until very recently, D-Wave (despite their great media success) had provided no evidence at all that their machine actually does anything "genuinely quantum", or that it is able to do anything that can't be done just as well with conventional classical computers costing much, much less than their machine.
Scott Aaronson (a young but already eminent researcher in the theory of quantum computation) has long been a leading critic of D-Wave, countering their hype (and that of those in the media who like their story) with patient skepticism and careful analysis.
A couple of years ago, D-Wave (for the first time) offered some actual evidence for actual quantum effects having some actual contribution to the behaviour of their machine. Aaronson's post about this -- http://www.scottaaronson.com/blog/?p=639 -- said (among other things) "I hereby announce my retirement as Chief D-Wave Skeptic".
WHAT'S GOING ON NOW
In the last few days there have been breathless reports in the media about how D-Wave's machine has been found to be thousands of times faster (at solving a single particular problem, the one it was designed to solve) than conventional computers. These reports have been discussed here on HN, too.
So Aaronson is back to debunking D-Wave hype. First, though, the good news: it does appear that this latest work gives some evidence that D-Wave's machine is genuinely doing something quantum. Specifically, some researchers have taken the same kind of problems that D-Wave's machine solves, and compared the performance of D-Wave's machine with (1) an algorithm called "quantum Monte Carlo", which is approximately a simulation (on conventional classical computers) of the particular quantum thing it's alleged to be doing and (2) a conventional computer doing ordinary simulated annealing. They found that the performance characteristics -- which problems are easier to solve and which harder, and by how much -- match up well between D-Wave's machine and the simulation of the quantum process it's meant to be an implementation of, whereas classical simulated annealing doesn't match at all well. So it does seem pretty likely that D-Wave's machine is doing roughly what D-Wave say it is, and that this truly is a quantum effect. Yay!
The bad news, part 1: This particular quantum phenomenon turns out to be one that can be efficiently and accurately simulated using ordinary classical computers. In other words, in so far as D-Wave's machine is really doing that, it offers no prospect of a more-than-constant-factor speedup relative to conventional, "non-quantum" digital computers. (The quotation marks are because actually semiconductors, as used in all integrated circuits, are fundamentally quantum devices. But they don't exploit quantum coherence in the sort of way quantum computers do.)
The bad news, part 2: At the same time as one researcher was comparing D-Wave's machine against a bunch of classical optimization algorithms and finding that D-Wave's machine performs much better, another researcher was comparing it against a different classical optimization algorithm, namely (you guessed it) simulated annealing -- and finding that simulated annealing actually solves the problems just as well as D-Wave's machine, but much faster and on cheaper hardware.
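For the curious, the classical simulated annealing being compared against fits in a few lines when applied to the same kind of Ising objective the D-Wave machine minimizes. This is a toy sketch — the cooling schedule and parameters are arbitrary choices of mine, not those used in the benchmarks discussed:

```python
import math
import random

def ising_energy(spins, J, h):
    """E(s) = -sum_(i,j) J_ij s_i s_j - sum_i h_i s_i, the objective both
    the D-Wave machine and classical simulated annealing minimize."""
    e = -sum(hi * s for hi, s in zip(h, spins))
    for (i, j), jij in J.items():
        e -= jij * spins[i] * spins[j]
    return e

def simulated_annealing(J, h, n, sweeps=1000, t_hot=5.0, t_cold=0.05):
    spins = [random.choice((-1, 1)) for _ in range(n)]
    energy = ising_energy(spins, J, h)
    for sweep in range(sweeps):
        # geometric cooling schedule (a common but arbitrary choice)
        t = t_hot * (t_cold / t_hot) ** (sweep / (sweeps - 1))
        for i in range(n):
            spins[i] = -spins[i]              # propose one spin flip
            new_energy = ising_energy(spins, J, h)
            de = new_energy - energy
            if de <= 0 or random.random() < math.exp(-de / t):
                energy = new_energy           # accept the flip
            else:
                spins[i] = -spins[i]          # reject: undo it
    return spins, energy
```

On an easy instance like a ferromagnetic chain (J_ij = +1 between neighbours, no fields) this reliably finds the fully aligned ground state; the interesting comparisons in the papers are of course on much nastier instances.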
(For the avoidance of doubt, this is my summary of what Aaronson says; I think he is almost certainly right because he demonstrably knows his stuff, but I'm in no position to give any endorsement beyond that.)
D-Wave do seem to have a genuine quantum device. However, it doesn't seem to be a quantum computer in the sense of something that exploits quantum effects to do computation faster than a classical device can do by more than a constant factor, and the recent hype about their machine is very misleading.
[EDITED to fix a goof where I missed out half a sentence.]
Geordie's reply to Scott's blog: "The majority of that post is simply factually incorrect.
As one example, Troyer hasn’t even had access yet to the system Cathy benchmarked (the Vesuvius – based system). (!) Yes Rainier could be beat by dedicated solvers — it was really slow! Vesuvius can’t (at least for certain types of problems). Another is he thinks we only benchmarked against cplex (not true) and he thinks cplex is just an exact solver (not true). These types of gross misunderstanding permeate the whole thing.
If you want to know what’s really going on, listen to the real experts. Like Hartmut.
Please include this as a blog post on your blog, so that it does not get lost.
Thanks again for the explanation.
Furthermore, it doesn't look like the device solves those problems any faster than well-optimized code running on normal computers. Which cost about 10k times less.
They're essentially setting up a number of "qubits" (quantum bits, the quantum analogue of binary digits) to naturally converge on the lowest energy state. All the qubits need to be entangled (fundamentally linked) in order for the system to work. This is a necessary but insufficient condition for quantum efficiency gains, which is the point Mr. Aaronson is making. The computer is not, according to recent evidence, providing computational gains. He then goes into the details, which you might or might not be interested in.
If you have other questions, please ask. I'm not an expert, but I understand his argument.
Why not "stoquastic Hamilsquonians?"
(there is a conjecture that anything remotely able to do an addition will be used for emulating a Nintendo console)
Quantum annealing is essentially a quantum algorithm for searching in O(n^0.5) (as opposed to classical computing's O(n) )
D-Wave is attempting to build hardware that can perform quantum annealing
The news is that D-wave managed to perform quantum annealing
The author is sceptical that this is as impressive as it sounds because there is no evidence D-Wave actually performed quantum annealing with an O(n^0.5) algorithm, and in fact it appears that it was slower than simulated quantum annealing (that is, a classical machine simulating the quantum algorithm, which of course can be no faster than O(n)).
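For what it's worth, the O(n^0.5) figure comes from Grover-style quantum search. A toy statevector simulation in plain Python (no quantum hardware involved, just tracking amplitudes) shows the marked item's probability peaking after about (pi/4)*sqrt(N) iterations:

```python
import math

def grover_marked_probability(n_states, marked, iterations=None):
    """Track the (real) amplitude vector of Grover search directly.
    After ~ (pi/4) * sqrt(N) iterations the marked amplitude peaks,
    which is where the O(n^0.5) query count comes from."""
    if iterations is None:
        iterations = math.floor(math.pi / 4 * math.sqrt(n_states))
    amp = [1 / math.sqrt(n_states)] * n_states  # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]              # oracle: flip marked sign
        mean = sum(amp) / n_states
        amp = [2 * mean - a for a in amp]       # inversion about the mean
    return amp[marked] ** 2                     # probability if measured now
```

With 64 items, 6 Grover iterations push the marked item's probability above 0.99, versus an expected ~32 probes for classical unstructured search.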
Some of the comments in this thread talked about it. (https://news.ycombinator.com/item?id=5697619)
I learned a bit, so it's not too bad. :-)
(Caveat: http://www.cs.virginia.edu/~robins/The_Limits_of_Quantum_Com... )
So is the recent D-Wave stuff just a combination of good PR and lazy journos, or are D-Wave being sleazy?
I guess it doesn't help that most people have no idea about quantum anything (let alone computing); or about massively parallel or P=NP etc.
Q: What's the difference between a computer salesman and a used car salesman?
A: The used car salesman knows when he's lying.
"It’s absolutely possible that adding active error correction might help at some point, maybe even in the next generation. If that is the case, we’ll certainly try anything anyone can think of to make the processors work better! In the specific case of exhibiting scaling differences over conventional algorithms, I’d bet we don’t need error correction (at least at the current snapshot) but at the next level (say 2000+ qubits) maybe we might. If we do, no problem — we’ll find a way!"
This is a relatively ill-understood discipline and a very fragile machine. 'Take it apart and look inside' won't tell you much about what happens when you turn it on. In that regard it might as well have been built by aliens. (Okay, it's probably not quite that bad.)
It's a certain kind of binary optimization problem that comes up in models of magnetic media ("Ising model"). In the original magnetic context, the two states are N and S. Each magnetic element within a 2D array of states jiggles around locally trying to align with its neighbors, and by doing so, each state-flip influences a global energy function.
The same problem comes up in image processing. The probabilistic equations describing segmentation of on-object versus off-object regions are the same mathematically as the magnetic energy function of the Ising model.
This particular problem is important in its niche, but it's really not that general. It's possible that even something as trivial (analytically) as going from 2 to 3 states will not generalize well to the D-wave hardware.
And it's possible that, by the time you transform a given problem (say, graph matching) to encode it in this model, you end up with either (a) something with more variables than the original problem, (b) something with exotic parameter settings that the D-wave hardware cannot handle, or (c) a model having an energy surface that is not well-suited to the particular D-Wave computational mechanism (annealing).
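To make point (a) concrete: max-cut is one of the lucky problems that maps onto the Ising model with no extra variables. A brute-force sketch (illustrative only; a real annealer would only approximate the minimum):

```python
from itertools import product

def maxcut_to_ising(edges):
    """Max-cut maps directly onto an Ising instance: each edge (i, j)
    contributes (1 - s_i * s_j) / 2 to the cut, so maximizing the cut
    means minimizing sum s_i * s_j, i.e. J_ij = -1 with no fields."""
    return {e: -1.0 for e in edges}

def ising_ground_state(J, n):
    """Brute force the minimum-energy spin assignment (the thing an
    annealer, quantum or classical, only approximates)."""
    def energy(s):
        return -sum(jij * s[i] * s[j] for (i, j), jij in J.items())
    return min(product((-1, 1), repeat=n), key=energy)

# A 5-cycle is an odd cycle, so the best cut uses only 4 of its 5 edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
spins = ising_ground_state(maxcut_to_ising(edges), 5)
cut_size = sum(spins[i] != spins[j] for i, j in edges)  # 4
```

Many other problems are not this lucky, which is exactly the concern above: the encoding itself can blow up the variable count or produce couplings the hardware can't represent.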
In some applications, the use of linear programming relaxations (I have not read the detailed paper, but I assume this is the CPLEX result discussed in the post) is much slower than annealing, and in others, LP relaxations are more competitive. Sometimes the LP relaxations give much better results, but for the Ising problem they tend to be much, much slower.
The D-Wave computer solves a problem. That problem defines the computer. It's a general problem (I don't know if it's Turing complete), thus it's a general computer.
The problem is that a classical computer can solve the problem faster. In other words, a classical computer can emulate the D-Wave faster than the original hardware runs.
Typing it, I knew I ran the risk of the answer being circular, but I still had to ask.
1) D-Wave devices have finally been proven to meet a fundamental requirement to do quantum computing. This is a significant scientific advancement.
2) Despite this, they don't seem to answer any computational questions more quickly than commodity electronic computers.
Again, I'm not an expert. The author's reasoning does seem sound. Still, I'm not qualified to assess either of those claims directly.
Yes, and it's a necessary but not sufficient requirement. They don't have a quantum computer by the usual meaning of that term, but they have a different kind of computer, one that may or may not do something better than a normal computer.