
Q – Initiative to build commercially available universal quantum computers - davidyapdy
https://www.research.ibm.com/ibm-q/
======
bvod
I tried testing the Bernstein-Vazirani algorithm on a 5-qubit quantum computer
that IBM let us use for an afternoon for one of my university classes last
semester. It was unable to recover the hidden string, even though simulating
the circuit classically recovered it. Anyone can launch a "quantum computer as
a service" platform, but the service won't be valuable if the computer doesn't
work. And so far no quantum computer has solved a problem faster than we can
do classically.
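The classical check described above can be sketched as a tiny statevector simulation. This is an illustrative Python sketch (not IBM's SDK): Bernstein-Vazirani in its phase-oracle form is H^n, oracle, H^n, after which measurement returns the hidden string with certainty.

```python
import numpy as np

def bernstein_vazirani(s_bits):
    """Classical statevector simulation of Bernstein-Vazirani
    (phase-oracle form). Measurement should return the hidden string."""
    n = len(s_bits)
    dim = 2 ** n
    s_int = int("".join(map(str, s_bits)), 2)
    # H on |0...0> gives the uniform superposition.
    state = np.full(dim, 1 / np.sqrt(dim), dtype=complex)
    # Phase oracle: |x> -> (-1)^(s.x mod 2) |x>
    for x in range(dim):
        if bin(x & s_int).count("1") % 2:
            state[x] *= -1
    # Second layer of Hadamards (in-place Walsh-Hadamard transform).
    for q in range(n):
        step = 2 ** q
        for x in range(dim):
            if not x & step:
                a, b = state[x], state[x + step]
                state[x] = (a + b) / np.sqrt(2)
                state[x + step] = (a - b) / np.sqrt(2)
    # All amplitude ends up on |s>, so the "measurement" is deterministic.
    out = int(np.argmax(np.abs(state)))
    return [int(b) for b in format(out, "0{}b".format(n))]
```

On an ideal (or simulated) device the recovered string always equals the hidden one; the commenter's point is that the real 5-qubit hardware was too noisy to reproduce this.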

~~~
s_kilk
> but the service won't be valuable if the computer doesn't work.

So the state-of-the-art in quantum computing is still basically "can tell you
the 200th prime number, sometimes"?

Not very useful.

~~~
finid
But this is new territory, and researchers shouldn't give up just because it
doesn't work very well now.

I recall that the iPhone/iPad were preceded by attempts at tablet computing
that were very crude in comparison.

Give it a couple of years and see where it leads.

~~~
ekianjo
> I recall that the iPhone/iPad were preceded by attempts at tablet computing
> that were very crude in comparison.

That's a very poor analogy. Here we are talking about stuff that is not even
functional, technology-wise.

~~~
finid
A poor analogy, maybe, but at least you get the picture.

They are talking about 16- and 17-qubit processors today, but are looking at
50-qubit ones in a few years.

~~~
AlexCoventry
The key question is whether QC algorithms are scaling as predicted by QC
theory, or at least better than classical algorithms for the same task. If the
error correction is causing them to scale poorly, the design is probably
infeasible for large-qubit calculations.

I expect it is scaling poorly, or IBM would have reported otherwise when
moving from 5- to 16-qubit machines.

------
cohomologo
The point where it gets interesting for realistic physics and chemistry
applications is around 100 (error-corrected) logical qubits and 10^8 coherent
operations, see for example
[https://arxiv.org/abs/1510.03859](https://arxiv.org/abs/1510.03859).

The error correction adds another factor of at least 100 or so in both qubits
and gates needed (but possibly much bigger than 100, depending on qubit
quality), see for example
[https://arxiv.org/abs/1312.2316](https://arxiv.org/abs/1312.2316).

Other fields of application - factoring large integers, for example - take
many, many more qubits to be interesting.

While it's good to get people excited about the potential of quantum
computing, it seems a bit disingenuous to suggest that a 17-qubit quantum
processor is commercially interesting. I especially like how they juxtapose it
with the publicly available 16-qubit quantum processor to make it seem like
one extra qubit makes it worth paying money...
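The back-of-envelope from the numbers cited above is worth spelling out; the thresholds are the linked papers' estimates and the flat 100x overhead is the comment's stated lower bound, not measured data:

```python
# Rough resource arithmetic for a useful error-corrected machine.
logical_qubits = 100      # chemistry gets interesting around here
logical_ops = 10 ** 8     # coherent operations needed
ec_overhead = 100         # error-correction factor, "at least 100"

physical_qubits = logical_qubits * ec_overhead
physical_gates = logical_ops * ec_overhead

print(physical_qubits)    # 10000 physical qubits
print(physical_gates)     # 10000000000 physical gate operations
```

So even at the optimistic end, the gap between a 17-qubit device and a commercially useful one is roughly three orders of magnitude in qubit count alone.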

~~~
stil
Don't forget cryptography.

~~~
Arelius
That's covered by "factoring large integers", I do believe.

~~~
VMG
There is more to crypto than that, right? The discrete logarithm problem is
independent of integer factorization.

------
ssivark
1\. Do they claim that the backend is actually a quantum computer? Or are they
just providing a quantum-computing-like interface, backed by a classical
computer emulating a quantum computer?

2\. Looking at their Terms of Service (excerpt below), it's unclear whether you
share rights to any model that you try in their playground.

<i>"IBM does not want to receive confidential or proprietary information from
you through our Web site. Please note that any information or material sent to
IBM will be deemed NOT to be confidential. By sending IBM any information or
material, you grant IBM an unrestricted, irrevocable license to copy,
reproduce, publish, upload, post, transmit, distribute, publicly display,
perform, modify, create derivative works from, and otherwise freely use, those
materials or information. You also agree that IBM is free to use any ideas,
concepts, know-how, or techniques that you send us for any purpose. However,
we will not release your name or otherwise publicize the fact that you
submitted materials or other information to us unless: [...]"</i>

~~~
semi-extrinsic
How is it unclear? It seems pretty explicit from the passage you quote that
they can take anything you upload and do whatever they want with it.

------
cirgue
I know next to nothing about quantum computing, but something about this seems
extremely fishy. Had they developed a working quantum processor, would they
not have publicized its capabilities far and wide before releasing an
enterprise API? Failing at that, would there not be a benchmarks page that
demonstrates the capabilities of this system? Could someone better acquainted
with this technology weigh in?

~~~
moomin
Indeed, someone spends what AFAICT is a very small amount of his time
debunking every last claim that any given quantum computer does anything
straightforward simulated annealing can't do.

~~~
ZanyProgrammer
Scott Aaronson?

~~~
moomin
Sounds right.

------
cwyers
> While technologies like AI can find patterns buried in vast amounts of
> existing data, quantum computers will deliver solutions to important
> problems where patterns cannot be found and the number of possibilities that
> you need to explore to get to the answer are too enormous ever to be
> processed by classical computers.

Oh what a load.

~~~
sbue
There are some pretty cool problems in quantum systems like chemistry that
even a classical supercomputer would not be able to solve. This could have a
very positive impact on fields like medicine.

~~~
cwyers
I am not saying what quantum computing is or isn't doing. Contrasting quantum
computing and AI is like comparing... digging a hole and shovels. "While
digging a hole is useful, these shovels are even better than hole-digging!"
It's nonsense sauce. If quantum computing can be useful at AI tasks, then
quantum computing won't replace AI, we'll be doing AI on quantum computers.
This is just marketing mumbo jumbo to convince some C-suite type that doesn't
know a thing about data science to say, "Why are we still using AI and not
this quantum thing IBM has? Quantum is better than AI!"

~~~
ouid
It's a little funny that you're defending the integrity of calling the status
quo "AI", don't you think?

~~~
cwyers
Yes, yes, real AI died in the AI Winter because the settlers didn't have
enough parentheses to last until spring, it was all very tragic, and now we're
all just stirring the pile of linear algebra until the results look good. [1]
But that doesn't make IBM's marketing copy any better. And IBM loves making
grandiose claims that don't work out in practice. [2]

[1] [https://xkcd.com/1838/](https://xkcd.com/1838/) [2]
[https://www.healthnewsreview.org/2017/02/md-anderson-
cancer-...](https://www.healthnewsreview.org/2017/02/md-anderson-cancer-
centers-ibm-watson-project-fails-journalism-related/)

------
emersonrsantos
Their product marketing surely is in a quantum state.

------
jwilk
What interesting computation can you do on a 16 qubit processor?

~~~
c0rruptbytes
you could factor an 8 bit number really fast heh

~~~
cohomologo
I think it's more like a 6 bit number, and probably not very fast at all...
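As a rough sanity check on that estimate: Beauregard's well-known construction (quant-ph/0205095, an outside reference, not something from this thread) runs Shor's algorithm on an n-bit number with about 2n + 3 qubits. Inverting that:

```python
def max_factorable_bits(qubits):
    """Largest n with 2n + 3 <= qubits, assuming Beauregard's
    2n+3-qubit circuit for Shor's algorithm."""
    return (qubits - 3) // 2

print(max_factorable_bits(16))  # 6, matching the parent's estimate
```

Other circuit constructions trade qubits for gate depth, so the exact number varies, but "around 6 bits" is the right ballpark for 16 qubits.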

------
Animats
That's been available for months. Has anyone used it?

~~~
greeneggs
A five-qubit processor has been available for a bit over a year now [1]. Last
week, they announced that they'll make a 16-qubit processor freely available,
but for now it is invite-only. They've also announced that they'll sell access
to a 17-qubit device.

[1] [https://arstechnica.com/science/2016/05/how-ibms-new-five-
qu...](https://arstechnica.com/science/2016/05/how-ibms-new-five-qubit-
universal-quantum-computer-works/)

~~~
cubano
I was listening to a podcast yesterday where Google is basically "guaranteeing
a breakthrough" in QC by the end of the year.

Something about the 49-qubit threshold or some such proclamation, so maybe
that's something that will finally move the needle.

~~~
gadders
Was it this Podcast?
[http://www.bbc.co.uk/programmes/p052800h](http://www.bbc.co.uk/programmes/p052800h)

"IBM is giving users worldwide the chance to use a quantum computer; Google is
promising "quantum supremacy" by the end of the year; Microsoft's Station Q is
working on the hardware and operating system for a machine that will outpace
any conventional computer. Roland Pease meets some of the experts, and
explores the technology behind the next information revolution."

~~~
cubano
Yep...that was the one...further on in the broadcast the researcher from
Google explains that for "Quantum Supremacy", that "49-qubits" are enough to
"leave the fastest classical computer in the dust" or some such hype.

------
dmix
The explanation videos are very fluffy. Anyone find any technical overview
articles?

------
stedman
> All of this sophisticated engineering makes [the 17 qubit processor] at
> least twice as powerful as the [16 qubit processor].

Isn't an n+1 qubit processor _always_ twice as fast as an n qubit processor?
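For what it's worth, what literally doubles with each added qubit is the dimension of the state space (the number of complex amplitudes), not the speed on any particular task; a quick check:

```python
def state_space_dim(n_qubits):
    """Number of complex amplitudes in an n-qubit state: 2**n."""
    return 2 ** n_qubits

# One extra qubit doubles the amplitude count, not the clock speed.
print(state_space_dim(16))  # 65536
print(state_space_dim(17))  # 131072
```

So "twice as powerful" is true in this bookkeeping sense for any n to n+1 step, which is presumably why the marketing line rings hollow.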

------
pjmlp
Looking at the SDK, the programming language of the future is Python.

------
dkarapetyan
16 qubits? How hard is it to simulate those 16 qubits with a regular computer?
I get this is marketing but people would be better off just running a
simulator at this point.

~~~
cohomologo
It's pretty easy. If your computer has enough RAM to store a length-2^16
complex vector (which is 2^20 bytes, or 1MB), then you can open up an IPython
notebook and write code to apply quantum gates to it with no problem.

The problems start to set in if your RAM can't hold the wavefunction in memory
(so around 28 qubits, which takes 2^32 bytes = 4GB of RAM.)

With specialized code and supercomputers you can get a little farther, but you
will be fighting exponential growth, so not too much. The practical limit for
classical computers is in the 40-50 qubit range.
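The arithmetic above checks out directly; a minimal NumPy sketch (illustrative, not IBM's simulator), assuming complex128 amplitudes at 16 bytes each:

```python
import numpy as np

def statevector_bytes(n_qubits):
    """RAM for a full n-qubit statevector at complex128 (16 bytes/amp)."""
    return (2 ** n_qubits) * 16

print(statevector_bytes(16))  # 1048576 bytes = 1 MiB
print(statevector_bytes(28))  # 4294967296 bytes = 4 GiB

# Applying a gate is just a reshaped matrix multiply, e.g. a Hadamard
# on the highest-order qubit of a 16-qubit register:
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.zeros(2 ** 16, dtype=np.complex128)
state[0] = 1.0
state = (H @ state.reshape(2, -1)).reshape(-1)
```

Each additional qubit doubles both the memory and the work per gate, which is why even supercomputers top out somewhere in the 40-50 qubit range.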

~~~
e12e
> If your computer has enough ram to store a size 2^16 length complex vector
> (which is 2^20 bytes, or 1MB)

One megabyte was enough even for quantum computing! Wow. Bill Gates, what a
visionary ;-)

------
addcn
IBM's M.O. seems to be to jump onto new technologies with name recognition
and make BOLD claims they can't back up. See Watson. See Q. See IBM
Blockchain.

This only serves to build distrust with developers, but I bet their investors
love it, if only because they don't understand it.

------
AlexCoventry
It seems that IBM is really struggling to stay connected to reality, let alone
run a profitable business.

~~~
jacquesm
Are you suggesting IBM is not profitable?

~~~
AlexCoventry
They seem to be running on inertia at this point. I haven't seen any plausible
new enterprises from them. It's possible I'm looking in the wrong places,
though.

~~~
jacquesm
Fully agreed, they've been running on inertia for the last two decades, but
that has nothing to do with profitability. They're still printing money every
quarter and are paying out substantial dividends.

Obviously that can't go on forever and they're desperately searching for a
path to a viable future but for the moment they are definitely in the black
and will - as far as I can see - stay there for quite a while to come.

------
davidgerard
This is horribly reminiscent of the marketing programme for IBM Blockchain.

------
dramm
Amazing this has not been renamed to Watson.

~~~
filereaper
I'm glad it isn't, it really waters down both brands...

