
IBM Q system in development with working 50 qubit processor - babak_ap
http://www-03.ibm.com/press/us/en/pressrelease/53374.wss
======
darawk
The article appears to be saying that they are offering an _actual_ quantum
computer as a service, but as I recall, their previous offering was a
_simulation_ of a quantum computing environment. Yet, this same article refers
to that thing as if it were a real QC. This makes me skeptical that the thing
being offered here is a real quantum computer... Anyone here have any insight
into whether this is legitimate?

~~~
vtomole
Their previous offerings have been 2 physical quantum computers that can be
accessed over the cloud. One with 5 qubits, and another with 16 qubits[0].
These computers have had thousands of users who have run millions of quantum
algorithms [1].

[0]:
[https://www-03.ibm.com/press/us/en/pressrelease/52403.wss](https://www-03.ibm.com/press/us/en/pressrelease/52403.wss)

[1]: [https://www.engadget.com/2017/11/10/ibm-50-qubit-quantum-com...](https://www.engadget.com/2017/11/10/ibm-50-qubit-quantum-computer/)

~~~
Xeoncross
Wait, I didn't think we had actual quantum computers yet.
[https://en.wikipedia.org/wiki/Quantum_computing](https://en.wikipedia.org/wiki/Quantum_computing)

~~~
vtomole
We've had quantum computers since the 90's. We just haven't had "useful"
quantum computers yet [0].

An implementation of quantum computers that is currently flourishing is one
that is built out of superconducting circuits. This makes it easier to scale
the quantum computer because you can leverage existing nanoelectronic
fabrication techniques [1].

[0]:
[https://en.wikipedia.org/wiki/Timeline_of_quantum_computing](https://en.wikipedia.org/wiki/Timeline_of_quantum_computing)

[1]:
[https://en.wikipedia.org/wiki/Superconducting_quantum_comput...](https://en.wikipedia.org/wiki/Superconducting_quantum_computing)

~~~
Xeoncross
> "We've had quantum computers since the 90's".

It wasn't until 2005 that we had the first (probable) qubyte, created at the
University of Innsbruck in Austria.

~~~
kevin_thibedeau
You don't need bytes to do computation.

~~~
colejohnson66
Wouldn't a qubyte be 8 qubits?

~~~
mtgx
The term doesn't make sense because you're not going to use groups of 8 qubits
to represent characters on a screen.

We may actually need a new word for 2 qubits (dual-qubits? dubits?) because it
seems 2 qubits are enough to break 1-bit of encryption, and I think I've read
it's enough to simulate 1 atom, too.

~~~
colejohnson66
I thought x qubits were equal to 2^x classical bits? So wouldn’t 2
“dubits” actually be _four_ times (not twice) better?
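
The arithmetic behind that claim is easy to check: describing an n-qubit state classically takes 2^n complex amplitudes. A quick sketch of the counting only (whether that translates into being "2^x times better" depends entirely on the algorithm):

```python
# An n-qubit state is described by 2**n complex amplitudes. This is the
# sense in which "x qubits correspond to 2^x classical values".
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

print(amplitudes(1))   # 2
print(amplitudes(2))   # 4 -- so 2 qubits span 4 basis states, not 2
print(amplitudes(50))  # 1125899906842624, roughly 10^15
```

So in the state-space-counting sense, yes: 2 qubits cover four times as many basis states as 1 qubit.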

------
JepZ
Presenting the pictures of a quantum computer with Adobe Flash is an
impressive combination of technologies. I hope they use a different
technology stack to develop the OS of that machine ;-)

~~~
IIAOPSW
Talking about the website's stack instead of the content is the HN equivalent
of bike-shedding. It sounds like a contribution, but it isn't.

[https://en.wiktionary.org/wiki/bikeshedding](https://en.wiktionary.org/wiki/bikeshedding)

~~~
mwilliaams
Jeez man let him make a joke

------
nategri
Quantum computer news! Everyone is excited and confused.

------
erikj
Is this a true quantum computer? The controversy around D-Wave was confusing.

~~~
wmnwmn
They're definitely referring to a real ("universal") quantum computer, not the
quantum annealing devices of D-Wave. Their computer will be able to run Shor's
algorithm, if it is as advertised. If they have 20 qubits in two months, as
they say they will, that seems like great progress compared to where I thought
the field was. And if they have 50 within a couple of years after that, it
would be huge. One caveat is they seem to be making a distinction between a
"universal" quantum computer and a "universal fault-tolerant" one.
Fault-tolerance requires a lot more qubits, so it's not clear to me how
valuable even 50 will be, if not error-corrected. I must say I will be kind of
amazed if quantum computers prove to be scalable the way classical ones have
been. I'm inclined to believe that the "Church-Turing" thesis will ultimately
prevail once the cost of construction of the machines is factored in (~linear
growth for the classical machines, greater than polynomial for the quantum).
But I've been wrong before.

~~~
andrewla
You say " _If_ they have 20 qubits in two months" and " _if_ they have 50
within a couple of years". Doesn't the article say that they have 50?

I'm unclear on this, because I have yet to see a published result talking
about running Shor's algorithm on a quantum computer beyond the NMR-based
physical simulation and the adiabatic one (which is just annealing). What
exactly is IBM claiming to have done here?

~~~
daxorid
They're simulating 50; they're implementing 20.

------
zamalek
What are they doing about error correction? Not that it makes this any less
impressive (everything starts _somewhere_), but are they simply ignoring the
problem?

~~~
vtomole
It's a bit too early for quantum computers to do fault-tolerant (error
corrected) computation. This is because you need more than one physical qubit
to make a logical (error corrected) qubit. You need 7 physical qubits to
encode a logical qubit if you use the Steane code [0]. So with 50 qubits, they
could theoretically make 7 error corrected qubits.

[0]:
[https://en.wikipedia.org/wiki/Steane_code](https://en.wikipedia.org/wiki/Steane_code)
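
The overhead arithmetic here is just integer division. A sketch assuming the [[7,1,3]] Steane code mentioned above:

```python
# Steane code: 7 physical qubits encode 1 logical (error-corrected) qubit.
STEANE_OVERHEAD = 7

def logical_qubits(physical_qubits: int) -> int:
    return physical_qubits // STEANE_OVERHEAD

print(logical_qubits(50))  # 7 logical qubits, with 1 physical qubit left over
```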

~~~
greeneggs
In fact, the smallest quantum error-correcting code using qubits uses only
five qubits. [1] So you could fit 10 of these codewords into 50 qubits.

But that's not right. At least two more qubits are needed for _fault-tolerant_
error correction. So that means you could fit nine codewords into 50 qubits
(since 9x5+2=47 < 50).

But that's not right. There exist more efficient codes that put multiple
encoded qubits into a single code block. [2] For example, three qubits can be
encoded into eight, still with distance three. [3] Six of these could fit into
50 qubits (since 6x8+2=50), giving 18 encoded qubits.

But that's not right. The IBM systems are superconducting qubits, with very
constrained interactions. Not every qubit can talk directly to every other
qubit. So you'd probably want every code block to have its own extra qubits
dedicated to error correction. If you need 8+2 qubits per code block, then you
could fit five code blocks, for 15 encoded qubits, into 50.

Obviously this is really complicated.

[1]
[https://en.wikipedia.org/wiki/Stabilizer_code#Example_of_a_s...](https://en.wikipedia.org/wiki/Stabilizer_code#Example_of_a_stabilizer_code)

[2]
[http://www.codetables.de/TableIII.php](http://www.codetables.de/TableIII.php)

[3]
[http://www.codetables.de/QECC.php?q=4&n=8&k=3](http://www.codetables.de/QECC.php?q=4&n=8&k=3)
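
For what it's worth, the successive counts above can be tallied mechanically. A sketch: the [[5,1,3]] and [[8,3,3]] parameters are the cited codes, and the 2-ancilla figure is this comment's own assumption, not a hard rule:

```python
PHYSICAL = 50
ANCILLA = 2  # extra qubits assumed above for fault-tolerant error correction

# Naive [[5,1,3]] packing, ignoring fault tolerance:
print(PHYSICAL // 5)                  # 10 codewords

# [[5,1,3]] with 2 shared ancilla qubits: 9*5 + 2 = 47 <= 50
print((PHYSICAL - ANCILLA) // 5)      # 9 codewords

# [[8,3,3]] blocks (3 logical qubits each) with 2 shared ancillas: 6*8 + 2 = 50
blocks = (PHYSICAL - ANCILLA) // 8
print(blocks * 3)                     # 18 encoded qubits

# [[8,3,3]] with per-block ancillas (8 + 2 qubits per block):
blocks = PHYSICAL // (8 + ANCILLA)
print(blocks * 3)                     # 15 encoded qubits
```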

~~~
jackfoxy
I recently had a conversation with a Microsoft quantum researcher, and this is
a close approximation to his answer. I just wanted a number. It's complicated.

~~~
quickthrower2
Sounds like 15 is that number

~~~
cerved
It's also not 15 at the same time.

------
thibran
Does the appearance of quantum computers mean that the effectiveness of
programming languages becomes secondary and that ease-of-use/expressiveness
will be the most valued trait?

~~~
Zaak
No. Quantum computers are very specialized devices. They allow massive
speedups for certain applications, and no benefit at all for general-purpose
computing.

I expect to see a flourishing of quantum programming languages with which to
write code for quantum computers, but that's a different subject.

------
mtgx
Good job, IBM. You may actually be ahead of other tech companies for once,
although Google seems to be breathing down your neck in this area. Others seem
to be at least a generation or two behind.

One other thing to note is that until recently it was believed that a 50-qubit
quantum computer would achieve "quantum supremacy". However, IBM itself has
shown that we can simulate 56 qubits on a classical supercomputer.

[https://www.ibm.com/blogs/research/2017/10/quantum-computing...](https://www.ibm.com/blogs/research/2017/10/quantum-computing-barrier/)

Now let's see who gets to the 100-qubit quantum computer first and achieves
quantum supremacy.

Going by recent developments, quantum computers seem to be following a
"Moore's Law" of sorts, where their number of qubits pretty much doubles every
two years. We need a few more generations to be sure of this, but it does look
like this is the rate at which they are going to evolve.

D-Wave, which isn't a universal quantum computer, has in fact been evolving at
2-4x every 2 years (closer to 2x for the last few generations).

This is exciting because, unlike classical computers, quantum computers
increase their performance by much more than 2x when their number of qubits
doubles every 2 years.
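
A back-of-the-envelope calculation shows why ~50 qubits was long considered the supremacy threshold: brute-force classical simulation stores 2^n complex amplitudes, which outgrows any real machine's memory around that point. A sketch assuming 16 bytes per amplitude (IBM's 56-qubit result used cleverer methods that avoid storing the full vector):

```python
# Memory needed to hold a full n-qubit state vector,
# assuming 16 bytes per complex amplitude (two float64s).
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50, 56):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n} qubits: {gib:,.0f} GiB")
```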

~~~
Cyph0n
> You may actually be ahead of other tech companies for once

IBM Research is and always has been an industrial research powerhouse. I doubt
you could name a company that has contributed so much to so many different
fields over the past few decades (Bell Labs is the universal exception).

From [1]:

"IBM Research's numerous contributions to physical and computer sciences
include the Scanning Tunneling Microscope and high temperature
superconductivity, both of which were awarded the Nobel Prize. IBM Research
was behind the inventions of the SABRE travel reservation system, the
technology of laser eye surgery, magnetic storage, the relational database,
UPC barcodes and Watson, the question-answering computing system that won a
match against human champions on the Jeopardy! television quiz show."

[1]
[https://en.wikipedia.org/wiki/IBM_Research](https://en.wikipedia.org/wiki/IBM_Research)

~~~
derwiki
> won a match against human champions on the Jeopardy! television quiz show.

I'm _much_ more impressed by Google's AI Go champion. And besides Watson, what
has IBM Research done in the past 20 years? Personally I would have thought
they'd be more involved in self-driving cars.

(Disclaimer: I interned for IBM Research 10 years ago)

~~~
Cyph0n
IBM isn't a software-only company: they do research in advanced materials and
semiconductor physics, low-level computer architecture, integrated circuit
testing, and so on.

When you look at the bigger picture, IBM Research has been doing amazing
things.

~~~
derwiki
What stands out to you in the last 20 years?

~~~
Cyph0n
IBM has been awarded more patents per year than any other company for the last
24 years[1]. In 2016 alone, IBM published ~22 patents per day, and ended up
being ~2500 patents ahead of Samsung (#2).

[1]:
[http://www-03.ibm.com/press/us/en/pressrelease/51353.wss](http://www-03.ibm.com/press/us/en/pressrelease/51353.wss)

~~~
foreigner
I was responsible for a couple of those patents and I can tell you that's all
nonsense. IBM employees are encouraged to patent _anything_, regardless of how
useless or silly it may be.

~~~
Cyph0n
Isn't that kind of the best strategy? You -- as a corporation -- want to make
investments for the future. Each patent is a possible income stream down the
road if the technology behind it somehow becomes "big". So it makes sense to
patent everything you can, just for the chance that one of them goes "big".

Companies can't just spend money on research expecting no return, at least not
for a long time. That's what academia is built to do.

~~~
foreigner
No, I mean I saw some patents on downright silly stuff -- things completely
unrelated to IBM's business or even tech. IBM just wants to wave that number
around.

------
tnash
At what point do we no longer trust public key cryptography (RSA)? Where's the
break point?

~~~
mtgx
> _It is estimated that 2048-bit RSA keys could be broken on a quantum
> computer comprising 4,000 qubits and 100 million gates. Experts speculate
> that quantum computers of this size may be available within the next 20-30
> years._

[https://www.entrust.com/wp-content/uploads/2013/05/WP_Quantu...](https://www.entrust.com/wp-content/uploads/2013/05/WP_QuantumCrypto_Jan09.pdf)

The paper is from 2009, so ~2030 to break 2048-bit RSA seems about right. If
they can double the number of qubits every two years, then we should have:

100-qubit by 2020

200-qubit by 2022

400-qubit by 2024

800-qubit by 2026

1600-qubit by 2028

3200-qubit by 2030

6400-qubit by 2032

It's also possible the rate of progress will be slightly higher than 2x every
2 years, so doing it a few years sooner than that is not out of the question.
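
That projection is just repeated doubling. A sketch assuming the 2x-every-2-years rate and the 4,000-qubit RSA-2048 figure quoted above:

```python
# Project qubit counts under the "double every two years" assumption,
# starting from 100 qubits in 2020, until the 4,000-qubit RSA-2048
# threshold quoted above is crossed.
TARGET = 4000
year, qubits = 2020, 100
while qubits < TARGET:
    print(f"{qubits}-qubit by {year}")
    year, qubits = year + 2, qubits * 2
print(f"{qubits}-qubit by {year}  (first generation past {TARGET})")
```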

Also, you have to consider that once you get a quantum computer that can break
2048-RSA, you'll be able to break all the encrypted communications you've
stored in the past few years, too. So you can't "switch-on" the quantum-
resistant crypto in 2031 and think you're all good. You have to do it as soon
as possible, especially after practical quantum computers that are capable of
scaling in a scheduled way start appearing (which seems to have happened).

Plus, even if Google is super-quick to adopt quantum-resistant crypto, that
doesn't mean the rest of the internet will be. It could take a few more years
for that to happen.

~~~
marcosdumay
From the Wikipedia timeline[1], it seems to be growing linearly.

[1]
[https://en.wikipedia.org/wiki/Timeline_of_quantum_computing](https://en.wikipedia.org/wiki/Timeline_of_quantum_computing)

~~~
nategri
To quote my thesis advisor's terrible joke: "Everything's linear to first
order"

~~~
marcosdumay
Not too long ago I summarized every datapoint on that page. It is almost
perfectly linear (there's 1 qubit lower somewhere, 1 qubit higher elsewhere),
and the noise is way lower than anything I was expecting to see.

------
jest7325
How many qubits does it take to crack an AES-192 key?

~~~
jcranmer
For AES, the quantum advantage is merely Grover's algorithm, which allows you
to invert a function in O(sqrt(N)) time instead of O(N) time. Basically, the
algorithm would look like "compute AES-192 on a uniform random quantum state,
then do Grover's algorithm for 2^96 timesteps." This needs as many qubits as
it takes to compute AES-192 (which, since the algorithm isn't unitary, is
strictly greater than 192 qubits).

More important is the fact that it's a very long-running application: it
requires you to keep over 192 qubits coherent for a very long time, which is
probably an order of magnitude or more in error correction requirements.
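
The 2^96 count is just the square root of the AES-192 keyspace. A sketch of the arithmetic only, not of Grover's algorithm itself:

```python
from math import isqrt

# Grover search over an N-element keyspace needs ~sqrt(N) oracle calls.
AES_192_KEYSPACE = 2 ** 192
grover_steps = isqrt(AES_192_KEYSPACE)

print(grover_steps == 2 ** 96)  # True: still astronomically many steps
# For comparison, classical brute force needs ~2**192 evaluations.
```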

------
rbanffy
I look forward to supercool (no pun intended) qSeries mainframes and whatever
wild OSs they'll run.

Right now, it looks like a batch-processing kind of thing, with a single
problem using the machine at any given time and long setup/teardown times.

~~~
Zaak
I expect quantum computers will always run in batch mode. Storing quantum
state in classical memory is impossible, so task switching would be very
problematic.

~~~
rbanffy
We can try to partition the machine so your process can use some qubits and
mine uses others. I suppose a classical computer would be running the OS, at
least at first.

~~~
Zaak
Yeah, if you have a 5000-qubit computer and two people want to run 2000-qubit
jobs, I can see them being able to run simultaneously.

There will always need to be a classical computer running the quantum
computer.

~~~
rbanffy
> There will always need to be a classical computer running the quantum
> computer.

Not sure about the need, but it sure is convenient. Quantum computers are not
always better for all kinds of problems and being able to route different jobs
to different parts of the system should be an advantage. All this looks a lot
like a digitally controlled analog computer or something that can program
FPGAs on-the-fly.

~~~
Zaak
A digitally controlled analog computer is a good analogy.

One big limitation of quantum computers is that they can't erase information
without losing their quantumness. So simple things like if-then statements
become impossible (you'd have to execute both branches every time).
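
That "both branches" point can be made concrete with a tiny state-vector sketch (NumPy, using the standard controlled-X convention): with the condition qubit in superposition, a controlled gate applies the "then" branch only to part of the state, and both outcomes coexist afterwards.

```python
import numpy as np

# "if cond: flip target" expressed as a controlled-X gate. With the condition
# qubit in superposition, the circuit runs BOTH branches at once; it never
# collapses the condition to pick one.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CX = np.array([[1, 0, 0, 0],   # basis order: |cond,target> = 00, 01, 10, 11
               [0, 1, 0, 0],
               [0, 0, 0, 1],
               [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                             # start in |cond=0, target=0>
state = np.kron(H, np.eye(2)) @ state      # put the condition in superposition
state = CX @ state                         # "if cond: flip target"

# Both branch outcomes coexist: amplitude on |00> (the "else" branch)
# and on |11> (the "then" branch).
print(np.round(state, 3))  # [0.707 0.    0.    0.707]
```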

------
eric-hu
If quantum computing keeps improving as it has been, how does this affect
Bitcoin and Ethereum as we now know them? Could the blockchain be compromised
by a malicious actor who has much less than the majority?

~~~
bdamm
My prediction: at some point Bitcoin will go through an upgrade process to
transfer user wallets to a key pair based on a quantum resistant public key
algorithm. Users will be given a time frame to transmit their coins to the new
wallets, and after that time is expired, EC-based wallets will not be
considered valid transaction sources any longer. Classic cash conversion in
the digital age.

~~~
quickthrower2
That would be a fork, which is interesting. The recent BTC fork shows that
this can be politically successful so there is hope.

------
forapurpose
IBM also plans to offer a 20-qubit hosted service this year.

[https://techcrunch.com/2017/11/10/ibm-passes-major-milestone...](https://techcrunch.com/2017/11/10/ibm-passes-major-milestone-with-20-and-50-qubit-quantum-computers-as-a-service/)

------
jpalomaki
Are there some practical use cases for a 50-qubit processor, or is it more of
a research thing?

------
powertower
I've been following quantum computing since D-Wave made its press release some
years back. Now I'm a complete skeptic.

The huge red flag I can't get over is this: if it is so, why can no one
validate it after all this time?

Why is there the proverbial "it works but not in the way you think it works"
(i.e., quantum annealing) or "it works but we can use non-QM systems to
simulate it faster, better, cheaper by a factor of a trillion"?

If QM computing were truly feasible (assuming that QM does have an underlying
phenomenon that is physically real), why are the results after all this time
so fuzzy?

~~~
Certhas
Because you have to build the quantum computer in a world that is
overwhelmingly classical.

There are no qualifiers along the lines of "underlying phenomena". It's simply
difficult to get a stable enough interface between the classical and the
quantum, so you can control it, while at the same time isolating it enough
that it doesn't decohere to classicality.

Who knows, maybe reliable scalable quantum computation truly isn't feasible
for some reason, but if you study the physics, the fact that this is so hard
is not really a surprise.

~~~
powertower
But they have already solved the engineering problems (at least 10 years ago).

They already have "qubits".

The interface issues look to be 98% solved.

And the temperature cooling, the EM shielding, and everything else (that is
outside the circuitry design and the physical chipset), a person with a budget
of 80,000 USD can recreate in his garage.

It's the results I can't understand.

Why can't X qubits, in the time they stay coherent, produce results that agree
with the mathematical analysis of the setup? Why is it always off by a factor
so large that it's not even productive for any task?

My understanding of this is not complete, which is why I ask. Is the interface
issue only 2% solved (and not 98%), etc.?

------
petrikapu
Do you think this can f __k up RSA and render our industry obsolete?

~~~
submeta
The paper in the thread below claims that in ten years it might be the case:

[https://news.ycombinator.com/item?id=15674970](https://news.ycombinator.com/item?id=15674970)

------
Xeoncross
Quantum computing reminds me of time travel for all the same reasons. This is
not a quick win.

How would you even validate you built a quantum computer?

~~~
comicjk
Quantum computers can be simulated classically, in a timeframe exponentially
related to the number of qubits. So as long as the quantum computer is not too
big you can compare the results to an exact classical simulation and see if it
works. The current record for classical simulation is 56 qubits
([https://arxiv.org/abs/1710.05867](https://arxiv.org/abs/1710.05867)). This
is an active area of research, since we want to be able to check the early-
generation quantum computers carefully.
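
As a toy version of that check: below is a minimal state-vector simulation of a 2-qubit Bell circuit in NumPy. A sketch only; real validation compares measured statistics from hardware against amplitudes computed this way, and the vector's 2^n size is exactly why this stops scaling.

```python
import numpy as np

# Minimal state-vector simulation: an n-qubit state is a vector of 2**n
# complex amplitudes, so the cost grows exponentially with n.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],                 # basis order: |00>,|01>,|10>,|11>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, np.eye(2)) @ state          # H on the first qubit
state = CNOT @ state                           # entangle: Bell state

probs = np.abs(state) ** 2
print(probs)  # ~[0.5, 0, 0, 0.5] -- measured hardware frequencies should match
```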

