Large-scale quantum chip validated (usc.edu)
87 points by scotty79 1489 days ago | 36 comments

This is "validated" in a pretty weak sense.

I think the paper is (perhaps a slightly different version of) this one: http://arxiv.org/abs/1212.1739 in which the researchers found evidence that favours the hypothesis "the D-Wave device is doing quantum annealing" over the hypothesis "the D-Wave device is doing classical simulated annealing".

That's very interesting scientifically (though it's not clear to me how far they've ruled out other basically-classical processes) but it's important to notice what it isn't.

It isn't evidence that D-Wave's device can perform the operations usually denoted by the phrase "quantum computing". (So far as I know, no one thinks it can.) So, e.g., there is no known way to use it to break RSA encryption, no matter how well it does the things it does.

It isn't evidence that there is any problem D-Wave's device can actually solve faster than a classical computer.

It isn't evidence that there is any useful problem D-Wave's device can actually solve faster than a classical computer.

See http://www.archduke.org/stuff/d-wave-comment-on-comparison-w... for some comparisons between the published performance figures for D-Wave's device and simple software running on (one core of) an ordinary laptop. The laptop comes up faster every time, even solving the exact problem D-Wave's device is designed to solve.

That doesn't rule out the possibility that there may be other instances of that problem that D-Wave's device solves much faster than anything you can do on a laptop (but no one seems to have found any) nor the possibility that some future version of D-Wave's device may be much better because it scales better (though Alex Selby's figures aren't particularly encouraging on that score). But claims that D-Wave, now, have a useful quantum computer don't look very plausible.

> I think the paper is (perhaps a slightly different version of) this one: http://arxiv.org/abs/1212.1739 in which the researchers found evidence that favours the hypothesis "the D-Wave device is doing quantum annealing" over the hypothesis "the D-Wave device is doing classical simulated annealing".

> (though it's not clear to me how far they've ruled out other basically-classical processes)

According to my colleagues at IBM Research, not far:

> A pair of recent articles concluded that the D-Wave One machine actually operates in the quantum regime, rather than performing some classical evolution. Here we give a classical model that leads to the same behaviors used in those works to infer quantum effects. Thus, the evidence presented does not demonstrate the presence of quantum effects.


The USC group has a response to Smolin and Smith which explains how some quantitative features are still best explained by a simulated quantum annealer: http://arxiv.org/abs/1305.5837. So there's still some potential for evidence of quantum effects (though "maybe doing something nonclassical" is a far cry from all the marketing talk, especially since annealing with stoquastic Hamiltonians on a fixed topology is already a far cry from any known-to-be-useful quantum computing model).

Ooh, thanks. I think John and Graeme have been working on a second version, presumably in response to this. The battle continues...

The D-Wave is a quantum annealer, not a multi-purpose quantum computer.

There's something I don't get with this debate around "is this quantum or classical": I thought quantum computing meant breaking NP complexity. So, to determine whether it "is" quantum computing, one would suppose that any big dataset would easily show the difference in computing time...

Now, if I understood correctly, the problem is that the algorithms being compared (i.e. annealing) are statistical in nature, so we're not actually comparing "fully" NP-complete algorithms, and so the expected difference is not as big as between an O(n) and an O(x^n) algorithm.

Could someone here confirm if this is correct ?

As I understand it, the problem is comparing a relatively advanced technology (classical computing) to a relatively immature one (quantum computing). Classical computing is fast enough to simulate quantum annealing at the speed the quantum chip itself can process (because the technology is so new). This has led people to doubt whether the chip is actually doing the quantum process or simply something equivalent to simulating it. Or at least that is my grasp of the situation.
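For concreteness, here's a minimal sketch of the classical side of that comparison: single-spin-flip simulated annealing on a toy Ising chain. The instance, cooling schedule, and parameter values are all made up for illustration; it's nothing like a serious solver, just the shape of the algorithm:

```python
import math
import random

def simulated_annealing(J, h, steps=20000, t_start=5.0, t_end=0.01, seed=0):
    """Classical simulated annealing on the Ising energy
    E(s) = sum_{i<j} J[i][j]*s_i*s_j + sum_i h[i]*s_i, with s_i in {-1,+1}."""
    rng = random.Random(seed)
    n = len(h)
    s = [rng.choice((-1, 1)) for _ in range(n)]

    def energy(spins):
        e = sum(h[i] * spins[i] for i in range(n))
        e += sum(J[i][j] * spins[i] * spins[j]
                 for i in range(n) for j in range(i + 1, n))
        return e

    e = energy(s)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        s[i] = -s[i]                                       # propose a spin flip
        e_new = energy(s)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                                      # accept the flip
        else:
            s[i] = -s[i]                                   # reject: flip back
    return s, e

# Toy ferromagnetic chain: the ground states are all-up and all-down.
n = 6
J = [[0.0] * n for _ in range(n)]
for i in range(n - 1):
    J[i][i + 1] = -1.0        # negative coupling favours aligned neighbours
h = [0.0] * n
spins, best_e = simulated_annealing(J, h)
print(spins, best_e)
```

The debate is essentially whether the chip explores this energy landscape via thermal fluctuations (as above) or via quantum tunnelling — and whether that distinction ever buys you a speed advantage.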

Quantum computing means solving problems in BQP (bounded-error quantum polynomial time), not breaking NP. Integer factorization is in NP and also in BQP; but some problems can just as well be in NP and not in BQP. And NP-complete is not a subset of BQP, as far as anyone knows.

Strictly, we don't actually know BQP is more powerful than P, right?

I've been thinking this for a while and have really come to believe it recently, but I'd be amazed if the NSA didn't have quantum computing down, either at scale or about to get there. Historically they've been, and similar organizations are perceived to be, 5-10 years ahead of public technology, so I'm going to go ahead and assume that all PKE is broken as far as the USG is concerned.

I've heard from someone who'd know that a big driver of Google's purchase of the D-Wave was that, "if there's going to be a crypto breakthrough, [Google] would like to know about it early."

You know, it's not impossible, but as someone who works in a field peripheral to quantum computing I would be surprised. There are two types of problems in science, the attrition kind where if you throw enough people and enough money at it you'll get it to work and the breakthrough kind where nothing will happen until some a-ha moment. I suspect quantum computing is more of the latter right now; there are just too many outstanding issues for a large-scale device. But who knows, I may be wrong.

Regarding D-Wave specifically, there's plenty of commentary around on what they're doing...

I'm not particularly well read on quantum computing, but it seems like we have reduced it to an engineering challenge at this point. We have already constructed and tested quantum computers with several qubits, and I am aware of no theoretical limit to how many qubits we can have, so the problem is simply engineering a device that can operate on all of them successfully.

This is in contrast to most of the second type of breakthrough, where you typically have a theoretical revelation that opens previously closed doors.

> There are two types of problems in science, the attrition kind where if you throw enough people and enough money at it you'll get it to work and the breakthrough kind where nothing will happen until some a-ha moment.

That's interesting. Which camp would you say flight fell into?

Considering that it only took two guys to research, design, and build it, I would say the latter, with a lot of grunt work.

Google got in on this early:

> On February 13, 2007, D-Wave demonstrated the Orion system, running three different applications at the Computer History Museum in Mountain View, California. This marked the first public demonstration of, supposedly, a quantum computer and associated service.

> On Tuesday, December 8, 2009 at the Neural Information Processing Systems (NIPS) conference, a Google research team led by Hartmut Neven used D-Wave's processor to train a binary image classifier.


We can only hope quantum computers become commercially available as soon as possible, and that many companies use them to encrypt data with perfect forward secrecy; that would at least provide some kind of protection against too many government abuses.

Because we're not going to be able to buy these ourselves in the next few decades, while the governments will be able to get as many as they want. So our only "hope" is that many companies can get them early, too, to protect their services against any type of "attacker".

Since quantum cryptography requires a special physical connection between the two parties, it wouldn't be particularly effective at securing the all-important SSL traffic. Luckily, I highly doubt practical use of Shor's algorithm will be publicly possible in 5-10 years, but even if it is, it's not necessary to use quantum cryptography to defend against quantum computers:


To emphasize what comex said: quantum cryptography is not an information theoretic defense against RSA-breaking quantum computers running Shor's algorithm. Not at all. It requires completely replacing the physical hardware connection between the two communicating parties.

The same argument can be used for flying saucers, because hell maybe in the next 100 years we will have flying saucers.

Anyway, my field is likewise peripheral to quantum computing (computational E&M), but having worked for an HPC OEM, I know the NSA buys lots of equipment that would be used for conventional password cracking. We would expect them to stop buying that stuff once they broke the speed of light and got a quantum computer.

What they most likely have are novel ways to find collisions in different algorithms, such as the MD5 collision scheme used by the two recent US/Israeli viruses.

Does this mean that I need to throw away my 2048-bit RSA key? Is 4096 bit at all safer than 2048 from quantum computing?


No speed up when compared to a classical machine (for much cheaper) emulating the exact same procedure:


That depends...

The D-Wave processor probably isn't useful for cracking it. However, if you send RSA encrypted data, copies can be made of the encrypted data and cracked at a later date when better facilities or algorithms become available. If you wish to send a message that would make the NSA want to hunt you down and kill you if they decrypt it N years from now (and you plan to live >N years), you should probably be more careful. I don't think we have a good idea of what N is. It's probably pretty big, but it might not be.

I'm willing to bet that N is a single digit.

Anyone know how they program these? Something like QCL? http://en.wikipedia.org/wiki/Quantum_programming

It doesn't run "programs" like a Turing or von Neumann machine does; it's not even a universal quantum computer. It solves certain specific instances of an optimization problem which, in the general case, is NP-hard.

Just to clarify, it's like a quantum ASIC?

The machine solves the QUBO problem: you program in the Q_ij, and a set of x's is returned as the answer.
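Since QUBO is just "minimise x^T Q x over bit vectors", a brute-force reference solver makes that input/output contract concrete. This is exhaustive search — obviously nothing like how the hardware finds the answer, and only feasible for tiny n — and the instance is made up for illustration:

```python
from itertools import product

def solve_qubo(Q):
    """Exhaustively minimise x^T Q x over x in {0,1}^n.
    The hardware is asked the same question but anneals toward the
    answer instead of enumerating; brute force only works for tiny n."""
    n = len(Q)
    best_x, best_val = None, float("inf")
    for x in product((0, 1), repeat=n):
        val = sum(Q[i][j] * x[i] * x[j]
                  for i in range(n) for j in range(n))
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Made-up 3-variable instance (upper triangular): diagonal terms reward
# setting a bit, off-diagonal terms penalise setting adjacent pairs.
Q = [[-1, 2, 0],
     [0, -1, 2],
     [0, 0, -1]]
x, val = solve_qubo(Q)
print(x, val)   # minimiser and its objective value
```

Substituting x_i = (1 - s_i)/2 turns any QUBO into the equivalent Ising form, which is why the two names get used interchangeably in this thread.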


(Also equivalently and more commonly known as the Ising Model). The D-wave chip is just a very fast hardware-based Ising model solver. You can use it to both find the minimum of a problem through cooling, or by controlling the temperature of the system you can also do sampling. And since so many important problems can be reduced to an Ising model, this is interesting.

What I don't fully understand is why is it so important to show that the device takes advantage of Quantum effects functionally? A classical super-efficient Ising Model solver is interesting enough.

Also, the video in the article is really quite terrible. I only half understood it because I know what they were trying to say, but if I didn't I think I'd be very confused at best, and amusingly misinformed at worst.

Just to correct your first statement: in their recent article, they ruled out the idea that D-Wave solves problems by cooling (otherwise known as simulated or classical annealing). Quantum annealing, a special case of adiabatic quantum computing, is a completely different approach to solving Ising problems.

Unconstrained optimization is an NP-hard problem, and quantum approaches won't change that. However, D-Wave hopes to be faster at solving these optimizations than classical solvers, something they haven't demonstrated yet. Whether the D-Wave is actually a quantum machine is interesting to computer scientists and physicists who care about the accuracy of commercial hype, and also to D-Wave's future customers.

A prior paper demonstrated that a carefully crafted computer program was as fast as this machine, and I don't think its authors believed the quantum system would get asymptotically faster.

Recommend reading: http://www.amazon.com/Quantum-Computing-since-Democritus-Aar...

Lots of interesting stuff about quantum computing and quantum algorithms in there. Some interesting tidbits:

    - var x; var y = x
    - print(x) <--- impossible: quantum information can't be duplicated; x no longer exists

    - var y
    - var x = f(y) <--- the value of y is now changed; you must undo f (apply f') to return y to its original state
I'm sure some QPLs will build this sort of stuff in so you don't have to worry about it, but it's definitely interesting reading.
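The second tidbit — having to undo f to get y back — has a classical analogue in reversible computing, where results are written into a target register and later "uncomputed". A toy sketch; the function g and the XOR trick are purely illustrative:

```python
def g(x):
    """Some function we want to evaluate (purely illustrative)."""
    return (3 * x + 1) & 0xFF

def compute(x, y):
    """Reversibly write g(x) into y via XOR, a classical stand-in for how
    a quantum circuit computes into a target register."""
    return y ^ g(x)

def uncompute(x, y):
    """XOR is self-inverse, so repeating the same step restores y."""
    return y ^ g(x)

y = 0
y = compute(5, y)    # y now holds g(5)
y = uncompute(5, y)  # y is back to 0; no trace of g(5) remains in y
print(y)
```

The XOR step is its own inverse here; in general a quantum program must run the inverse circuit f' to clean up its work registers, which is exactly the bookkeeping the book describes.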

(this is probably more directly relevant to programming though, and free: http://sneezy.cs.nott.ac.uk/qml/compiler/jjg-thesis.pdf)

As noted in the other thread on this story[1], this claim has already been debunked by other physicists[2].

[1] https://news.ycombinator.com/item?id=5957232 - comment https://news.ycombinator.com/item?id=5957686

[2] http://arxiv.org/abs/1305.4904

What's the coolest use-case for a quantum computer of the future?

Grover's algorithm allows you to search over an unordered data set of size n in O(n^(1/2)) time and O(log(n)) space.
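Grover's algorithm is simple enough to simulate classically on a statevector (at exponential classical cost, which is the whole point). A small sketch with a made-up instance of 64 items and one marked index:

```python
import math

def grover_probs(n_items, marked, iterations=None):
    """Statevector simulation of Grover search for one marked item.
    Classically each step costs O(n_items) work; the quantum point is
    that the device needs only ~sqrt(n_items) oracle calls in total."""
    if iterations is None:
        iterations = int(math.pi / 4 * math.sqrt(n_items))
    amp = [1 / math.sqrt(n_items)] * n_items      # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]                # oracle: phase-flip marked
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]         # diffusion: reflect about mean
    return [a * a for a in amp]                   # measurement probabilities

probs = grover_probs(64, marked=11)
print(round(probs[11], 3))   # marked item dominates after ~6 iterations
```

Each oracle-plus-diffusion round rotates the state a little further toward the marked item, which is where the O(n^(1/2)) query count comes from.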

I think if this really takes off we'll have a "quantum card" like a video card for a PC

And for what use? (sincerely interested).

Same use as a graphics card... to show off and play games.
