I feel like I'm losing my mind. Why are so many resources being poured into quantum computing? Every use case I've ever seen cited is hand-wavy stuff like "finance", "ecology", "chemistry". Now throw in "Quantum AI" too, whatever that means.
This chip still has an error rate of 0.15% for two-qubit gates, and the average connectivity is only ~3.5 qubits. Meaning, if you want to apply a CX to two qubits that are 10 links apart, you're looking at roughly a 1.5% error rate. For this most basic of operations. This is like if every time your computer did a NOT gate there was a 1.5% chance it didn't flip.
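Rough sanity check on that number (a back-of-the-envelope only, assuming each of the 10 links costs one extra two-qubit gate at 0.15% and errors are independent; real routing with SWAPs would be worse):

    # Compounded error for a CX routed across n links, assuming one
    # two-qubit gate per link at error rate p and independent errors.
    p = 0.0015   # 0.15% two-qubit gate error
    n = 10       # links between the two qubits
    success = (1 - p) ** n
    print(f"compounded error over {n} links: {1 - success:.2%}")  # ~1.49%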
No useful applications, even theoretically. Hardware that is still decades away from anything remotely capable. Am I insane or is the world insane?
You're forgetting that the end result is a hybrid classical-quantum computer.
And certain applications like ML and quantum chemistry can easily tolerate a 1% error rate.
The simulated annealing is actual quantum chemistry, as are the direct quantum simulations on these chips. And as with any basic research, there's just no way to predict what might come out of it. New solid-state batteries are a good guess; interesting materials, fuel cells?
Potentially improvements to reactor designs too.
RCS is used as a benchmark because it's slower on classical hardware by so many orders of magnitude. But it's a useful thing in itself, literally made for chip testing, opening the gates (SoC) to higher yields.
Then ultimately, if it gets big enough, it will start cracking classical public-key encryption, forcing everyone to at least vastly upsize their keys, if not move to quantum-resistant encryption.
But that's 1% per instruction, and it compounds exponentially. Can ML and quantum chemistry take an error rate north of 99.9%? Because with 1% per gate and circuits 1000 gates deep, that's what they'll get: the chance of a clean run is (0.99)^1000, roughly 0.004%.
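Toy model of the compounding, assuming a flat 1% per gate and independent errors:

    # Probability a circuit of depth d sees zero gate errors,
    # assuming one gate per layer at error rate p, errors independent.
    def clean_run_probability(p: float, d: int) -> float:
        return (1 - p) ** d

    for depth in (100, 500, 1000):
        print(depth, f"{clean_run_probability(0.01, depth):.4%}")
    # 100  -> ~36.60%
    # 500  -> ~0.66%
    # 1000 -> ~0.0043%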
I can't speak to chemistry, but I've studied those quantum ML algorithms. I even designed one, a variant of the QSVM that I tested on IBM's computers back when they were state of the art. They're all useless. There's just no way they'll compete with clusters of GPUs this century; I'd bet my hands on it.
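For context on what a QSVM actually does: the quantum part just evaluates a kernel k(x, y) = |<phi(x)|phi(y)>|^2, and the kernel matrix then goes into an ordinary classical SVM. A toy sketch (a hypothetical single-qubit angle encoding, simulated classically with numpy; not the variant above, and nothing like a real device run):

    import numpy as np

    def feature_state(x: float) -> np.ndarray:
        # Hypothetical angle-encoding feature map: one qubit rotated by angle x.
        return np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)

    def quantum_kernel(x: float, y: float) -> float:
        # QSVM-style kernel: squared overlap |<phi(x)|phi(y)>|^2.
        return abs(np.vdot(feature_state(x), feature_state(y))) ** 2

    # The kernel matrix is what gets handed to a classical SVM solver,
    # which is why the end result is a hybrid classical-quantum pipeline.
    X = [0.1, 1.2, 2.5]
    K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
    print(K)

On a real device each kernel entry has to be estimated from many repeated shots, which is where the points below come in.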
First, you'd need good hardware that doesn't just output noise on circuits long enough to do something interesting. Second, you'd need machines fast enough to run many experiments back to back, so you can do actual training (alas, speed is antithetical to adiabatic operation). Third, you'd need to prove your quantum AI actually has an edge over classical models to justify cooling that much hardware down to 0.01 K for months at a time.
I don't know. What I can see is that few people are paying attention. About four variants of this news have appeared in the last hour, and yours is the only comment on any of them.