
IBM will soon launch a 53-qubit quantum computer - Anon84
https://techcrunch.com/2019/09/18/ibm-will-soon-launch-a-53-qubit-quantum-computer/
======
reikonomusha
Alongside the development of quantum hardware, which has ever-growing qubit
numbers and ever-increasing fidelities and coherence times, there’s also
software that’s getting more and more sophisticated. Most quantum computer
providers have their own APIs for programming their machines [1,2,3]. Python
seems to be the popular choice for these APIs.

But one piece of software I’ve been particularly interested in, and indeed
contribute to, is Rigetti’s open source optimizing compiler called quilc
[4,4’], which is a sort of GCC/Clang for quantum computers. It’s the only
open-source compiler of its kind. They also have a high-performance simulator
[5], which is used sort of like a debugger. (Perhaps surprisingly, both are
written in Common Lisp.)

What is very useful about the compiler is that it is retargetable, and can
compile for even IBM’s architectures of tomorrow. You just give it an “isa” at
the command line describing how many qubits it has, how they’re connected, and
which gates they support, and quilc will happily compile and optimize your
program for that architecture.
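
For a concrete sense of that workflow, here's a minimal pyQuil sketch
(illustrative only, assuming pyQuil 2.x with local quilc and qvm servers
started via "quilc -S" and "qvm -S"; "9q-square-qvm" is one of the built-in
simulated lattices):

    from pyquil import Program, get_qc
    from pyquil.gates import CNOT, H

    # A tiny Bell-state program written in Quil.
    p = Program(H(0), CNOT(0, 1))

    # quilc retargets the program to this lattice's topology and
    # native gate set; another chip description works the same way.
    qc = get_qc("9q-square-qvm")
    native = qc.compiler.quil_to_native_quil(p)
    print(native)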

[1] Rigetti’s pyQuil
[https://github.com/rigetti/pyquil](https://github.com/rigetti/pyquil)

[2] IBM’s Qiskit
[https://github.com/Qiskit/qiskit](https://github.com/Qiskit/qiskit)

[3] Google’s Cirq
[https://github.com/quantumlib/Cirq](https://github.com/quantumlib/Cirq)

[4] quilc [https://github.com/rigetti/quilc](https://github.com/rigetti/quilc)

[4’] quilc at FOSDEM
[https://youtu.be/qwfMzzKJDXI](https://youtu.be/qwfMzzKJDXI)

[5] quantum virtual machine
[https://github.com/rigetti/qvm](https://github.com/rigetti/qvm)

~~~
imglorp
Common Lisp: The Connection Machine (Thinking Machines Corp) used a variant
called *Lisp. I wonder if that was an influence?

[http://www.softwarepreservation.net/projects/LISP/starlisp/s...](http://www.softwarepreservation.net/projects/LISP/starlisp/starlisp-reference-manual-version-5-0.pdf)

~~~
1000units
This harkens back further to when Descartes wrote "Cogito, ergo sum", the
first recorded instance of a Lisp program. Amazing.

~~~
lispm
Earlier, the introduction of the Lisp Machine enabled the first Lisp
programmers Adam and Eve to abandon bra and fig-leaf.

------
Hongwei
I live/work in Waterloo and regularly ask a favour of the Quantum Nano Centre
folks at UWaterloo for a tour. Every time, I look at the "computer," housed in
a gigantic facility where each qubit has its own liquid nitrogen cooling unit
and who knows what else plugged into it, and think to myself: I'm staring at
the vacuum tube of quantum computing.

Yes, you can scale up vacuum tube computers to solve meaningful tasks, but
they were barely more cost-effective than human "calculators" except for
specific applications. I think the same analogy applies to modern-day quantum
computers versus classical computers.

~~~
kens
I think you're being too hard on vacuum tube computers. There were a lot of
vacuum tube computers and they were pretty successful. Electromechanical
computers like the Harvard Mark I would be a better comparison; they worked,
but were too expensive and slow to be more than one-off systems.

~~~
Hongwei
Great point, your analogy is better.

------
nickpeterson
I don't keep tabs on quantum computing at all, but I get the distinct feeling
that it isn't going to turn into anything important. Is there any reason to
believe this will be relevant to regular developers in the next 20 years?

~~~
hannob
No, absolutely not.

 _If_ quantum computers happen (which is still a big _if_ ) then mostly two
things will happen:

1\. Cryptographers will have to switch many of their algorithms and things
like TLS will get a bit slower and will need a bit more data transmitted.

2\. It's interesting for science, e.g. for simulations.

For the regular dev I'm pretty sure it won't have any meaningful impact.

Also even if quantum computers happen they won't be on our desks any time
soon. It'll probably more like "you can rent a QC for specialized tasks in the
cloud, but it will be relatively expensive and if you don't really need one
you'll use something else".

~~~
ganzuul
You should already have switched to AES-256 or better. The data can be
captured now and processed later, once a machine is available.

~~~
cyphar
The risk with quantum computers is with public-key cryptography (where all
modern PK crypto is broken with a sufficiently powerful quantum computer). It
is believed that attacks with a quantum computer against symmetric-key
cryptography (like AES) gain no better than a square-root improvement (meaning
that doubling key sizes eliminates the risk).
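
To put rough numbers on that square-root claim (a sketch; Grover's search
over a k-bit keyspace takes on the order of 2^(k/2) operations):

    # Grover turns brute-force key search from ~2^k into ~2^(k/2) steps.
    for k in (128, 256):
        print(f"AES-{k}: classical ~2^{k}, quantum ~2^{k // 2}")
    # AES-128 falls to ~2^64 work (uncomfortable); AES-256 falls to
    # ~2^128, which is why doubling the key size restores the margin.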

~~~
ChrisLomont
>where all modern PK crypto is broken with a sufficiently powerful quantum
computer

This isn't true - only methods based on Discrete Log are at risk, which
includes RSA and ECC, but there are a lot of PK algos that are not at risk,
like NTRU.

Here's a bunch of related material: [https://en.wikipedia.org/wiki/Post-quantum_cryptography](https://en.wikipedia.org/wiki/Post-quantum_cryptography)

~~~
cyphar
PQ-Crypto is still a very young research area. There is no widespread (or even
niche) use of PQ-Crypto algorithms, no consensus on how safe any of the
proposed algorithms are (and many proposals were found to be exceptionally
broken -- on classical hardware -- within weeks of their submission[1]). The
existence of some on-paper algorithms (which have no usage outside academia
_at the moment_) that might solve the problem doesn't make it false to say
that "all modern PK crypto is broken with quantum attackers".

But if you prefer, you could imagine that I said "all modern PK crypto [that
is actually in use today] is broken". Because, at the end of the day, that's
all that really matters.

[As an aside, for RSA the security is provided by the hardness of
factorisation which also breaks under QC attackers, not discrete log (it's
discrete log for DH, and EC discrete log for ECDH).]

[1]: [https://www.youtube.com/watch?v=ZCmnQR3_qWg](https://www.youtube.com/watch?v=ZCmnQR3_qWg)

~~~
sweis
It's not exactly a young research area. We have had post-quantum public key
cryptosystems based on linear codes for over 40 years.

Classic McEliece ([https://classic.mceliece.org](https://classic.mceliece.org))
was created in 1978 and is in the 2nd round of NIST's PQ-crypto contest
([https://csrc.nist.gov/projects/post-quantum-cryptography/rou...](https://csrc.nist.gov/projects/post-quantum-cryptography/round-2-submissions)).

The PQCrypto conference started 13 years ago:
[https://pqcrypto.org/conferences.html](https://pqcrypto.org/conferences.html)

Chrome ran large-scale experiments with NewHope support 3 years ago:
[https://security.googleblog.com/2016/07/experimenting-with-p...](https://security.googleblog.com/2016/07/experimenting-with-post-quantum.html)

------
nharada
Is there a generally accepted crossover point where quantum computers will
start outperforming classical computers on the same tasks? I was under the
impression it was somewhere around 100 qubits, but I'm sure there are caveats
around noise and stability as well, not just number of qubits.

~~~
WhitneyLand
Here's why IBM says quantum supremacy is not relevant:

[https://www.quantamagazine.org/quantum-supremacy-is-coming-h...](https://www.quantamagazine.org/quantum-supremacy-is-coming-heres-what-you-should-know-20190718)

From the article:

 _Quantum supremacy, we don’t use [the term] at all,_ said Robert Sutor, the
executive in charge of IBM’s quantum computing strategy. _We don’t care about
it at all._

The article also mentions practical considerations such as "quantum
advantage", fault tolerance, and other initiatives competing with IBM.

Relevant

[https://wikipedia.org/wiki/Quantum_supremacy](https://wikipedia.org/wiki/Quantum_supremacy)

[https://medium.com/@jackkrupansky/what-is-quantum-advantage-...](https://medium.com/@jackkrupansky/what-is-quantum-advantage-and-what-is-quantum-supremacy-3e63d7c18f5b)

~~~
The_rationalist
The only real usefulness of quantum computers would be the supposed quantum
supremacy. IBM not caring about Q supremacy is pure nonsense.

~~~
_red
>would be the supposed quantum supremacy

True. Another way he could have responded is: We don't care if quantum
computers actually do anything useful, only that we can use the PR to get
positive stock price movement.

~~~
jeffdavis
They could also say "We are scientists doing cool stuff, discovering things,
and having fun. Maybe it will be useful someday but we aren't really focusing
on that."

Why are you attributing some kind of cynical motive?

~~~
The_rationalist
Scientists having fun is nice. But the money could be better allocated to
critically useful research areas.

~~~
ineedasername
Lots of very practical & critically useful things can come out of the
foundational science & engineering required for this sort of research. Just
look at the practical applications of various technologies developed to get
the Large Hadron Collider up and running. [0]

[0] [https://kt.cern/cern-technologies-society](https://kt.cern/cern-technologies-society)

~~~
The_rationalist
Nice try, except quantum computing is not a fundamental science

~~~
ineedasername
It's hard to get more fundamental than studying how quantum properties &
effects can be manipulated. But call it what you will: you avoided the issue.
Whether or not you use the label "fundamental", the research can absolutely
yield practical & critical advancements. Plenty of necessary aspects &
engineering of the LHC would be unlikely to fall under your definition of
"fundamental", but they are nonetheless part of the benefits realized from
those efforts.

~~~
The_rationalist
After 40 years of applied quantum computing we have made no useful
discoveries. No surprise, since it's applied. 40 years of the LHC has yielded
so many useful discoveries. If we reallocated quantum computing funding to the
LHC, wouldn't that be more rational? And there are far more useful things to
fund than the already useful LHC.

~~~
ineedasername
A solid understanding of quantum physics, helped along by research with
quantum computing, is part of the foundation for much modern high technology.
Current quantum computing research is pioneering better techniques for noise
isolation and error correction, which has excellent practical potential as
well.

My point with the LHC isn't just that it has produced useful things, but that
we didn't know exactly that it would do so when it all started. Planning &
construction on projects that would eventually form part of the LHC began as
early as the late 70's, early 80's, roughly 40 years ago. That too was based
on research & engineering that started even earlier. So if we're comparing the
LHC and quantum computing, the tech behind the LHC has had more time to be
developed and refined. When quantum computing gets to the same level of
maturity, it may have just as many positive effects. Even so, arguing about
timelines is probably beside the point: different research can take longer or
shorter to have a beneficial impact.

Maybe there are other things that would benefit more from the same resources,
but it's also not something we can know in advance. Plenty of early promising
research is chased for years with no result. Some research takes decades for
its significance and practical impacts to be felt. We have to pursue a lot of
things or we'll never get much of anywhere.

I'm also curious though: The argument that quantum computing isn't doing
enough, fast enough, has some merit. It's not quite zero sum, but there is a
limited amount of money to go around, and maybe it could be better spent. What
would be your pet project to advance in place of quantum computing?
Personally, I'd want greater focus on energy research. I'm not looking to
criticize your choice, I'm genuinely curious.

------
equalunique
In 2018 at my local makerspace, one of the guys behind this[0] research gave a
talk about "noise interference" in quantum computers. He did a great job of
explaining how conventional electronic components are unsuitable for the
precise and isolated requirements of quantum computing. What really left an
impression was the remark he made about error-checking. It was something like:
the then-current state-of-the-art qubit tech requires _eight_ qubits to error-
correct every _one_ qubit to the same level that current x86 computers are
error-corrected, meaning that a viable quantum computer is a very, very long
way away.

[0] [https://physics.aps.org/synopsis-for/10.1103/PhysRevD.96.062...](https://physics.aps.org/synopsis-for/10.1103/PhysRevD.96.062002)

~~~
q_eng_anon
Error detection vs. error correction are very different things. It's true that
many physical qubits (and a proper error correction scheme) are required to
implement a single logical qubit. But once a valid design for a single logical
qubit (device, error correction, etc.) is realized, the limits of scaling the
entire system change completely (infinite T1 of each logical qubit means you
don't NECESSARILY have to have all of your qubits on the same device...)

------
criddell
> Some of these have started to use the available machines to work on real-
> world problems, though the current state of the art in quantum computing is
> still now quite ready for solving anything but toy problems and testing
> basic algorithms.

Isn't that a contradiction?

~~~
foofoo55
s/now/not/

~~~
criddell
That's how it is in the article so I left it as-is. I tried to leave a message
about the typo, but it was asking me to create an account so I gave up.

------
jedberg
It's interesting to me how long this has taken to go from research to even the
most limited commercial applications. My friends were writing quantum computer
compilers and frameworks in grad school in the 90s. They had to start by
building a quantum computer simulator, since no QC existed yet to test their
stuff on.

I think however that once the hardware part is solved, the software part will
move quickly, since people have been working on the software since long before
the hardware existed.

~~~
legohead
my brother worked at Seagate in the 90s and would tell me about their research
into chemical memory and "holocubes", and how they felt it was just around the
corner.

yet here we are, still using the same old hard disks. we have SSDs now, at
least.

------
vamos_davai
I don't really understand quantum computing; hopefully someone can shed light
and let me know if my understanding is wrong. Isn't a qubit similar to a
transistor that can hold more than 2 states, with the implication that it'll
be faster than current SIMD operations?

~~~
jfengel
A quantum computer doesn't work much like a classical computer at all, so any
analogy to transistors fails pretty quickly.

One key aspect of that is that all of the qubits are entangled. The qubit
isn't in more than two states, but in 2 states at once, and N qubits are in
2^N states at once. It's not quite as simple as that: you can't really just
set up any algorithm in a quantum computer the way you would with a classical
one. But for those problems amenable to quantum computation (including,
notably, prime factoring), they can be solved very fast.

With 53 qubits, and hence 2^53 simultaneous states, we should be able to
factor some very, very large numbers.
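
(To get a feel for what "2^N states at once" costs classically: simulating
the full state vector of an N-qubit machine takes 2^N complex amplitudes. A
back-of-envelope sketch, assuming 16 bytes per amplitude:)

    n = 53
    amplitudes = 2 ** n                   # one amplitude per basis state
    petabytes = amplitudes * 16 / 1e15    # complex128 = 16 bytes
    print(f"{petabytes:.0f} PB")          # ~144 PB just to store the state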

~~~
scottlocklin
You will find that this machine can't even factor the number 15 properly, as
it's not error corrected, or a general purpose quantum computer. FWIW, nobody
has factored the number 15 using the Shor algorithm on quantum hardware yet;
only using the subset of gates they know produces the numbers 3 and 5.

~~~
cthalupa
You keep repeating this, but as best I can tell, it hasn't been factually
accurate in a decade, unless you really want to quibble over the fact that
they finished the calculation on a general purpose computer.

[https://www.schneier.com/blog/archives/2009/09/quantum_compu...](https://www.schneier.com/blog/archives/2009/09/quantum_compute.html)

>Instead, it came up with an answer to the "order-finding routine," the
"computationally hard" part of Shor's algorithm that requires a quantum
calculation to solve the problem in a reasonable amount of time.

They weren't the first team to do this, either, they just miniaturized parts
of the hardware.

~~~
scottlocklin
The result you cite doesn't even say what you think it says; in fact it
directly says they didn't factor a damn thing.

If I'm wrong and IBM's latest "quantum computer" can actually do the Shor
algorithm on the number 15 using 4608 gate operations, I will publicly eat
one of my shoes like Werner Herzog. Assuming I can get someone else to take
the other side of the bet, of course.

~~~
gjm11
It says:

> _the chip itself didn't just spit out 5 and 3. Instead, it came up with an
> answer to the "order-finding routine"_

O noes! The quantum computer only did the "order-finding" part of Shor's
algorithm! ... But wait. Here's the Wikipedia page for Shor's algorithm:

> _Shor's algorithm consists of two parts: 1. A reduction, which can be done
> on a classical computer, of the factoring problem to the problem of order-
> finding. 2. A quantum algorithm to solve the order-finding problem._

So, the article GP cited says that a quantum computer did _the part of
factoring the number 15 that actually uses a quantum computer_. Nothing wrong
with _that_.

The article's link to the actual paper is broken, which makes it harder to
tell whether, as you say, they cheated somehow, but here's another article
about factoring 15 with a quantum computer:
[https://science.sciencemag.org/content/351/6277/1068](https://science.sciencemag.org/content/351/6277/1068)
which, so far as I can see, claims to have actually done the whole of (the
relevant part of) Shor's algorithm on actual QC hardware.

~~~
scottlocklin
Even assuming I humor you and refuse to notice they left out the gates to the
wrong answer, and assuming I humor the authors in agreeing that this is a
scalable form of the Shor algorithm (I deny both of these for the record)...
congratulations: by 2016 someone was able to factor the number 15. I guess
quantum supremacy is right around the corner!

~~~
DebtDeflation
>to notice they left out the gates to the wrong answer

Can you explain this a little more? I've seen similar references to this
elsewhere. If it means what I think it means, it's kind of a really big deal.
But it may mean something else.

~~~
scottlocklin
[http://www.theory.caltech.edu/~preskill/pubs/preskill-1996-n...](http://www.theory.caltech.edu/~preskill/pubs/preskill-1996-networks.pdf)

~~~
gjm11
That doesn't explain in what sense this entirely different paper from 20 years
later allegedly "left out the gates to the wrong answer".

The paper I linked to says, e.g., the following:

> _Subsequent multipliers can similarly be replaced with maps by considering
> only possible outputs of the previous multiplications. However, using such
> maps will become intractable, [...] Thus, controlled full modular
> multipliers should be implemented._

So in at least one case they are explicitly _not_ taking a particular shortcut
because it doesn't scale to factorizing larger numbers. If you say they are
taking other shortcuts that don't scale by "leaving out the gates to the wrong
answer", then I think you owe us an _actual explanation_ of what shortcuts
they are taking and how you know they're taking them, rather than just a link
to a paper from 1996 that says how to take some shortcuts.

~~~
scottlocklin
Frankly I don't owe you or the muppets attempting to participate in this
thread by zombie-walking "MUH PRESS RELEASES" a damn thing: if you want to
believe in pixie dust, you're free to do so. None of the "quantum computing
results" factoring the number 15 have done the actual Shor algorithm -they've
all used the shortcut described in this paper. Someone below posted another
paper pointing out the same thing, as well as some discussion on a forum ....
pointing out the same thing.

It's not my fault you believe in press releases without understanding what
they mean.

~~~
gjm11
I haven't said a thing about press releases.

The paper I linked to (1) doesn't cite Preskill et al and (2) explicitly
claims _not_ to be taking shortcuts that don't generalize to numbers other
than 15; as well as the bit I quoted earlier, they say "for a demonstration of
Shor’s algorithm in a scalable manner, special care must be taken to not
oversimplify the implementation—for instance, by employing knowledge about the
solution before the actual experimental application" and cite an article in
_Nature_ decrying cheaty oversimplifications of Shor's algorithm.

I don't see anything in their description of what they do that seems to me to
match your talk of "deleting the gates that lead to the wrong answer".

(The Kitaev paper they cite also doesn't cite Preskill et al, unsurprisingly
since it predates that, and also doesn't contain anything that looks to me
like cheaty shortcut-taking.)

It is, of course, possible that that paper _does_ take cheaty shortcuts and
I've missed them. It is, of course, possible that its authors are flatly lying
about what they're doing, and trying to hide the evidence by not citing
important prior papers that told them how to do it. If so, perhaps you could
show us where.

Otherwise, I for one will be concluding from the surfeit of bluster and
absence of actual information in your comments so far that you're just
_assuming_ that every alleged "factoring of 15" is indulging in the dishonesty
you think they are, and that you aren't interested in actually checking.

(You don't, indeed, _owe_ anyone anything. It's just that if you want to be
taken seriously, that's more likely if you offer something other than sneering
and bluster.)

~~~
scottlocklin
Again, as I said before: even assuming they didn't take shortcuts (the paper
you mention is essentially "we took shortcut X instead of Y", and of course,
no quantum error correction is evident): congratulations, your miracle
technology is now capable of factoring the number 15. All they have to do now
is add all the things that make quantum computing useful and eventually they
will honestly be able to factor the number 21. Should happen ... I dunno, care
to make a prediction when?

Pointing this out is apparently necessary; I don't know why it triggers people
so to point out that virtually the entire field up to the present day has
consisted of grandstanding quasi-frauds. And that you apparently have to
understand things and read extremely carefully to notice, because whatever
honest workers there may be don't see it as in their interest to point such
things out as it may upset their rice bowls.

You have someone else in another thread insisting that annealing can factor
giant prime numbers which is equally bullshit. Do you expect me to patiently,
precisely and (somehow) dispassionately point out every line of bullshit in
every quantum computing paper published? The mere fact that the field is
pervaded by bullshit, publishes papers and announcements that are known to
be bullshit, and promises all kinds of pixie dust bullshit on a regular basis
ought to give you some slight skepticism, some Bayesian prior that the grand
pronunciamentos of this clown car should be treated with a bit of skepticism.

~~~
gjm11
I agree that "can factor 15" isn't terribly impressive. (Well, it's kinda
impressive, because making a quantum computer do anything at all is really
hard.) I _very much_ agree that D-Wave's quantum annealing stuff is bullshit,
especially if anyone is claiming it's any good for factoring large numbers. I
don't expect you, or anyone, to point out every bit of bullshit in every paper
published.

I'm all in favour of pointing out bullshit. But there's a boy-who-cried-wolf
problem if you just indiscriminately claim that everything is the same kind of
bullshit without checking it.

Of course that doesn't mean that you're obliged to check everything. You can
say "I expect this is bullshit of the usual sort but haven't checked". But if
you say "this is bullshit of the usual sort" _without checking_ and it turns
out that that isn't the case (it _looks_ to me as if the paper I linked to
isn't the kind of bullshit you describe) then you take a credibility hit that
makes all your bullshit-spotting much less useful than if you were more
careful.

------
Causality1
So the big deal about quantum computers is how they're really fast at solving
certain very specific problems, but I've never seen numbers about how much
better at solving those problems they are than conventional computers. I
realize they work on completely different principles but it should be
hypothetically possible to make a normal computer solve the same problem even
if it were to take a decade to do what a quantum computer can do in a
microsecond.

So, how much faster are they? A million times faster? A trillion? A googol
times faster?

~~~
joak
It depends on the problem. Let's take the 2048-RSA integer factorization
problem.

Back-of-the-envelope calculation:

The best classical algorithm takes ~10^34 steps. The fastest computer on
earth, the _Summit_ supercomputer, runs at 200 petaflops (2.5M cores, ~10^6x
faster than your desktop computer).

Actual quantum computers perform about 1 million operations/s per qubit.

For 2048-RSA, Shor's algorithm needs 4096 non-noisy qubits (we are far from
that) and runs in ~10^7 steps.

So for this particular problem:

Best classical computer (as of today): 1 billion years

A (hypothetical) quantum computer: ~10 seconds

Yeah, a trillion times faster...
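
The same envelope in code (all inputs are just the rough figures above):

    classical_steps = 1e34
    summit_flops = 200e15                 # 200 petaflops
    years = classical_steps / summit_flops / 3.15e7
    print(f"classical: ~{years / 1e9:.1f} billion years")   # ~1.6

    quantum_steps = 1e7
    ops_per_sec = 1e6                     # rough per-qubit op rate above
    print(f"quantum: ~{quantum_steps / ops_per_sec:.0f} seconds")  # ~10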

~~~
Causality1
Interesting, thank you. It makes me wonder if we'll ever find a way to make
quantum computers do "easy" math as well as they do difficult math, and
whether we'll ever have general-purpose computers that do their arithmetic and
rendering on a quantum processor instead of a transistorized one. Is it even
hypothetically possible for a quantum computer to do something like render a
web page using less electricity than a conventional CPU?

------
doctoboggan
Question from someone who only knows what is in the popular press:

I've read articles that debate the existence of any quantum computers. Are we
past that now, or are there people out there who would question whether these
are real quantum computers?

~~~
krastanov
Nobody has yet claimed to have a scalable error-corrected quantum computer
(and whoever has claimed it was never taken seriously). But we are working
hard on getting there, and this 53-qubit "quantum computer" is indeed a
machine that can compute things, and it uses quantum effects (so it is a
"quantum computer"). It is not a "general" or "error corrected" or "scalable"
quantum computer. The comparison in sibling comments is very apt: this, and
other research hardware of today, relates to general "useful" quantum
computers the way a bunch of vacuum tubes relate to the first classical CPU.

------
danschumann
Seeing such a contraption reminds me that "digital" stuff is actually the
natural laws of electricity, and whatnot, applied to computation. It also
makes me think of the biological computers that will one day come about, and
how people will then think it's crazy to "make" a computer when you can just
grow one.

------
szczepano
I think scientists need a different material for managing quantum state
before we see some real progress.

------
rubyfan
What are the actual practical use cases for quantum computers right now?

~~~
bufferoverflow
Writing and selling books on how to program quantum computers.

------
galaxyLogic
When using a quantum computer, what is the act of "observation" that collapses
the qubits to 0s and 1s? Does the observer have to be a human?

~~~
fsh
How the measurement is done depends on the physical implementation of the
qubit. For example, the state of the qubit could affect the resonance
frequency of a coupled microwave cavity, which can be measured using RF
techniques. Obviously at some point the experimenter has to check the result
of the measurement, which makes the question of what would have happened
without a conscious observer impossible to answer.

~~~
galaxyLogic
Very interesting. Then does the observation of one qubit collapse them all at
the same time, or is it possible to decide to read just one qubit at a time?

------
scottlocklin
I wonder if you can factor the number 15 on it yet. Actually I don't wonder; I
know for a fact that they can't do it.

~~~
q_eng_anon
lmao they did this 2 decades ago: [https://arxiv.org/pdf/quant-ph/0112176.pdf](https://arxiv.org/pdf/quant-ph/0112176.pdf)

~~~
scottlocklin
No, actually they didn't. The Shor algorithm applied to n bits takes 72 * n^3
gates; for 15 that's 4608 gates, and that's leaving out quantum error
correction, which would require even more gates. This has never been done.

The result you cite: they don't even consider those early NMR "computers" to
be quantum in any meaningful sense. They're Avogadro computers.
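
(For the record, that count is just 72 * n^3 with n taken as the bit length
of 15:)

    n = (15).bit_length()    # 4 bits
    print(72 * n ** 3)       # 4608 gates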

~~~
q_eng_anon
ooookay, a more recent demonstration:
[https://arxiv.org/pdf/1804.03719.pdf](https://arxiv.org/pdf/1804.03719.pdf)

You said the number 15 was never factored on a quantum computer - this is
false.

You can run as many gates as you want - go ahead and run 4608 gates in Qiskit
right now - the measurements will be random, but you can do it.

IDK where you got that equation for the number of gates, but it's probably for
the general case - it doesn't take into account the fact that different gate
sets can be used to reduce the total.

Also confused on the error correction part - the whole point of error
correction is to make the coherence time independent of the number of 'gates'
in your circuit - so yeah, with error correction you get more gates, but you
also get an effectively infinite coherence time...

~~~
scottlocklin
>You said the number 15 was never factored on a quantum computer - this is
false.

I guess it depends what you mean by "factored" and "quantum computer": using
the generally accepted definitions, the number 15 has never been factored on
a quantum computer using the Shor algorithm.

Yes, I am talking about the general case where you don't leave out the gates
for 2 and 7 being factors of 15. That's what most people mean by factoring.
Stating the answer because you know it already isn't useful. LARPing by
running the algo through the "right" gates also isn't useful.

~~~
codesushi42
You are just plain wrong.

And you failed to consider quantum annealing as an alternative to Shor.

~~~
scottlocklin
You're right, I do fail to consider this, as it's not really an alternative
to Shor, because annealing is horse shit that nobody can decide the
computational complexity of. Not even D-Wave thinks it might be. [1]

[1] [https://www.nature.com/articles/s41598-018-36058-z](https://www.nature.com/articles/s41598-018-36058-z)

~~~
codesushi42
Complexity in what terms? For a classical computer?

Besides, the empirical data shows otherwise. It takes 12 qubits to factor 15.
We're up to 53 now.

With quantum annealing, a 20 bit number has been factored with 97 qubits. Not
on a real quantum computer yet, of course.

So I have no idea what you are talking about.

~~~
scottlocklin
> Not on a real quantum computer yet, of course...

Erm, OK. I guess we agree that nobody has factored the number 15 on a quantum
computer yet.

Maybe you should read the paper I helpfully linked you above.

~~~
codesushi42
Yes they have. You are spreading lies and FUD:
[https://www.google.com/amp/s/phys.org/news/2016-03-quantum-f...](https://www.google.com/amp/s/phys.org/news/2016-03-quantum-factors-scaled.amp)

I was referring to the quantum annealing example, because no 97 qubit quantum
computer exists yet.

~~~
scottlocklin
I'm not spreading FUD; I am correcting misinformation from muppets whose
understanding doesn't go beyond press releases. Nobody has yet done a Shor
factorization of the number 15; the end, and _even if someone's press release
says so_, there is no scalable way of factoring large integers.

~~~
codesushi42
There is. Quantum annealing. I think you're just trolling at this point. Or
do you not care much for reading?

~~~
scottlocklin
Quantum annealing will never be used for factoring prime numbers from large
integers, and hasn't even managed to factor 3 and 5 from 15. Even if you click
your heels together three times and wish for it really hard, it's not going to
happen.

Did you read the nature article, or just the press releases?

~~~
codesushi42
Uh huh. Did you?

 _Both methods require 𝒪(log2(𝑁)) qubits in total, where N is the number to
be factored. The novelty of our demonstration of quantum annealing for prime
factorization is based on the reduction in quantum resources required to
execute factoring and the experimental verification of the algorithmic
accuracy using currently available hardware. As a proof-of-concept, we have
demonstrated these methods by factoring integers using the D-Wave 2000Q
quantum annealing hardware, but these methods may be used on any other quantum
annealing system with a similar number of qubits, qubit degree of
connectivity, and hardware parameter precision. Assuming that quantum
annealing hardware systems will continue to grow both in the number of qubits
and bits of precision capabilities, our methods offer a promising path toward
factoring much larger numbers in the future._

And there is this too:
[https://link.springer.com/article/10.1007%2Fs11433-018-9307-...](https://link.springer.com/article/10.1007%2Fs11433-018-9307-1)

Are you just going to sit there and lob lame insults or do you have anything
meaningful to contribute?

~~~
scottlocklin
Even the first line of that paper is false.

 _" RSA cryptography is based on the difficulty of factoring large integers,
which is an NP-hard (and hence intractable) problem for a classical
computer."_

That is incorrect: there is no proof that factoring is NP-hard. Anyway, you
can hardly expect me to take anything they say after this seriously.

~~~
codesushi42
LMAO

I just... I can't.

------
ptah
What workloads would I run on these?

~~~
krastanov
On this particular one, probably only research code that tests various small
scale optimization algorithms and tests ways to "calibrate" the hardware. In a
couple of years, on the next gen of such hardware, the hope is to run things
like optimization algorithms and chemistry simulations that just begin to
outperform what is possible with classical hardware of reasonable size and
cost.

~~~
ptah
so it will eventually run stochastic gradient descent faster than a GPU?

~~~
krastanov
The short answer is no. The long answer is much more interesting.

The premise of your question is a bit misleading. Quantum computers do not run
classical algorithms faster, rather they run algorithms specifically designed
to take advantage of the quantum coherence. For many tasks this does not
provide an advantage. For some very special (and occasionally useful) tasks,
the advantage seems to be exponential. Simulating quantum mechanics (as in
chemistry, material design, drug design) is one of these tasks. Some linear
algebra tasks are also faster. Lastly, some researchers are hopeful that novel
gradient-free optimization algorithms exist for quantum hardware, but this is
a bit more controversial.

------
malms
"IBM"

Next

------
tpmx
Can existing quantum computers be considered evidence for parallel universes?

[https://physics.stackexchange.com/questions/164500/can-exist...](https://physics.stackexchange.com/questions/164500/can-existing-quantum-computers-be-considered-evidence-for-parallel-universes)

(tl;dr: no one knows)

~~~
knzhou
Existing quantum computers are evidence against a strawman of the Copenhagen
interpretation, which is "everything the size of an atom or smaller is
quantum, everything bigger is classical, and whenever the two touch, a
collapse occurs". This is of course silly and arbitrary, and has been known by
physicists to be false for nearly as long as quantum mechanics has been
around; there are plenty of quantum effects that involve objects bigger than
an atom. Quantum computers are simply another example.

Any genuine interpretation of QM will agree on what a quantum computer
outputs, because by definition, interpretations are ways of changing the
_words_ you drape around a calculation, not the result of the calculation
itself. So quantum computers don't support or oppose any genuine
interpretations. They only oppose strawmen.

