Hacker News
IBM will soon launch a 53-qubit quantum computer (techcrunch.com)
203 points by Anon84 on Sept 18, 2019 | 166 comments

Alongside the development of quantum hardware, with its ever-growing qubit counts and ever-increasing fidelities and coherence times, the software is getting more and more sophisticated. Most quantum computer providers have their own APIs for programming their machines [1,2,3]. Python seems to be the popular choice for these APIs.

But one piece of software I’ve been particularly interested in, and indeed contribute to, is Rigetti’s open-source optimizing compiler quilc [4,4’], which is a sort of GCC/Clang for quantum computers. It’s the only open-source compiler of its kind. They also have a high-performance simulator [5], which is used somewhat like a debugger. (Perhaps surprisingly, both are written in Common Lisp.)

What is very useful about the compiler is that it is retargetable, and it can compile even for IBM’s architectures of tomorrow. You just give it an ISA description at the command line, saying how many qubits the machine has, how they’re connected, and which gates they support, and quilc will happily compile and optimize your program for that architecture.
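A minimal sketch of that workflow, assuming quilc is installed locally (the `--isa` flag and the built-in "8Q" chip description exist in quilc, but check `quilc --help` for the exact spelling in your version):

```shell
# quilc reads Quil on stdin and writes compiled, optimized Quil on stdout.
# Here we compile a tiny Bell-pair program against a built-in 8-qubit ISA.
printf 'H 0\nCNOT 0 1\n' | quilc --isa 8Q
```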

[1] Rigetti’s pyQuil https://github.com/rigetti/pyquil

[2] IBM’s Qiskit https://github.com/Qiskit/qiskit

[3] Google’s Cirq https://github.com/quantumlib/Cirq

[4] quilc https://github.com/rigetti/quilc

[4’] quilc at FOSDEM https://youtu.be/qwfMzzKJDXI

[5] quantum virtual machine https://github.com/rigetti/qvm

> It’s the only open-source compiler of its kind.

There are other compilers/transpilers out there, e.g., ScaffCC [1] and Qubiter [2]. IBM's framework also includes one: Qiskit Terra [3]. You can find more in [4].

> What is very useful about the compiler is that it is retargetable.

Most, if not all, of them have this characteristic. Using Qiskit Terra you can definitely target Rigetti’s machines. You just need to provide the architecture specification, i.e., number of qubits and their coupling graph.

As a side note, one really annoying characteristic in Rigetti's quilc is the impossibility of turning off optimizations.

[1] https://github.com/epiqc/ScaffCC

[2] https://github.com/artiste-qb-net/qubiter

[3] https://github.com/Qiskit/qiskit-terra

[4] https://github.com/qosf/os_quantum_software

The operative phrase in that quote, and it is by no means intended as a technicality, is “of its kind.” In my opinion, quilc is in a class of its own, being fully automatic, retargetable, and optimizing, without the participation of an API or any sort of “setup” code. (This includes retargeting to every known gate set, not just CNOT or CZ.) Code goes in, code comes out, just like with GCC or Clang.

Optimizations may be completely turned off with the directive PRAGMA PRESERVE_BLOCK [0]. Other optimizations can be selectively enabled and disabled with command-line options or pragmas. As with GCC, though, the user does not have full control over every aspect of optimization and transformation, just a set of coarse-grained controls. Otherwise, it wouldn’t be a compiler so much as a library/framework for program manipulation.
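From the docs [0], the pragma pair brackets a region that quilc passes through untouched; a hypothetical sketch (gate choices are mine):

```
PRAGMA PRESERVE_BLOCK
H 0
CNOT 0 1
PRAGMA END_PRESERVE_BLOCK
```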

But one thing is true: quilc is intended to be fully automatic, not requiring (or even allowing) the user to list compilation “passes” or “transpilers”, which is typically what a software engineer expects when their primary concern is writing and running code. Many of the links you’ve listed are variously “compilation frameworks” (like ScaffCC, akin to LLVM), unitary factorizers (like Qubiter, though I’m less familiar with it), and so on.

[0] http://docs.rigetti.com/en/stable/compiler.html

> Optimizations may be completely turned off with the directive PRAGMA PRESERVE_BLOCK [0]

This turns off compilation itself. If you read the documentation you referred me to, you will notice that code inside the directive is not guaranteed to be legal QPU code after compilation.

> Many of the links you’ve listed are either “compilation frameworks” (like scaffcc, akin to LLVM), unitary factorizers (like qubiter, though I’m less familiar), and so on.

I don't know what you mean by "compilation framework" nor where you are drawing the line of what is a compiler, but ScaffCC is a compiler. It is built using LLVM (so it is akin to Clang, not LLVM) and compiles Scaffold to QASM.

Speaking of Common Lisp: the Connection Machine (Thinking Machines Corp.) used a variant called *Lisp. I wonder if that was an influence?


Part of lisp's appeal is that it's really easy to sort of 80/20-rule your way to a compiler or DSL.

In new spaces that haven't been fully explored, lisp is awesome for letting you iterate on fundamentals of a language design before you put the work into a full language.

This harkens back further to when Descartes wrote "Cogito, ergo sum", the first recorded instance of a Lisp program. Amazing.

Earlier, the introduction of the Lisp Machine enabled the first Lisp programmers Adam and Eve to abandon bra and fig-leaf.

> It’s the only open-source compiler of its kind.


Microsoft Q# was open sourced this spring, I believe

That is a language and programming framework, not an optimizing compiler. (To my knowledge, it also doesn’t have any real quantum computer backends.)

Wow, they don't have a compiler for their "language and programming framework"? That's a pretty embarrassing oversight.

I live/work in Waterloo and regularly ask a favour of the Quantum Nano Centre folks at UWaterloo for a tour. Every time I look at the "computer," housed in a gigantic facility where each qubit has its own liquid nitrogen cooling unit and who knows what else plugged into it, and think to myself - I'm staring at the vacuum tube of quantum computing.

Yes, you can scale up vacuum tube computers to solve meaningful tasks, but it was barely more cost effective than human "calculators" except for specific applications. I think the same analogy applies for modern day quantum computers and classical computers.

I think you're being too hard on vacuum tube computers. There were a lot of vacuum tube computers and they were pretty successful. Electromechanical computers like the Harvard Mark I would be a better comparison; they worked, but were too expensive and slow to be more than one-off systems.

Great point, your analogy is better.

And much like vacuum tube computers were, early quantum computers are a necessary step towards the advancement of computing.

Computing is still advancing with silicon.

Great analogy!

Is there a generally accepted crossover point where quantum computers will start outperforming classical computers on the same tasks? I was under the impression it was somewhere around 100 qubits, but I'm sure there are caveats around noise and stability as well, not just number of qubits.

Here's why IBM says quantum supremacy is not relevant


From the article:

> "Quantum supremacy, we don’t use [the term] at all," said Robert Sutor, the executive in charge of IBM’s quantum computing strategy. "We don’t care about it at all."

The article also mentions practical considerations such as "quantum advantage", fault tolerance, and other initiatives competing with IBM.

The only real usefulness of quantum computers would be the supposed quantum supremacy. IBM not caring about Q supremacy is pure nonsense.

>would be the supposed quantum supremacy

True. Another way he could have responded is: We don't care if quantum computers actually do anything useful, only that we can use the PR to get positive stock price movement.

They could also say "We are scientists doing cool stuff, discovering things, and having fun. Maybe it will be useful someday but we aren't really focusing on that."

Why are you attributing some kind of cynical motive?

Because it is a job, not a hobby, and is essentially about money. It's not cynical to ascribe venal motivations to employment, and to imply that it is cynical is a cheap shot.

It's foundational research, not product development. IBM does a lot of research that doesn't have immediate practical applications.

Scientists having fun is nice. But the money could be better allocated to critically useful research areas.

Yikes, tough crowd.

Quantum computers are a kind of "because it's hard" problem. Some useful discoveries might come out of it by accident even if quantum computers never become practical.

And, uh, it's not your money, nor is it mine, so why complain? In the grand universe of wasted resources, quantum computers are probably not where I'd begin making cuts.

Lots of very practical & critically useful things can come out of the foundational science & engineering required for this sort of research. Just look at the practical applications of various technologies developed to get the Large Hadron Collider up and running. [0]

[0] https://kt.cern/cern-technologies-society

Nice try except quantum computing is not a fundamental science

It's hard to get more fundamental than studying how quantum properties & effects can be manipulated. But call it what you will: you avoided the issue. Whether or not you use the label "fundamental", the research can absolutely yield practical & critical advancements. Plenty of necessary aspects & engineering of the LHC would be unlikely to fall under your definition of "fundamental", but are nonetheless part of the benefits realized from those efforts.

After 40 years of applied quantum computing we have made no useful discoveries. No surprise, since it's applied. 40 years of the LHC has yielded so many useful discoveries. If we reallocated quantum computing funding to the LHC, wouldn't that be more rational? And there are far more useful things to fund than the already-useful LHC.

A solid understanding of quantum physics, helped along by research with quantum computing, is part of the foundation for much modern high technology. Current quantum computing research is pioneering better techniques for noise isolation and error correction, which has excellent practical potential as well.

My point with the LHC isn't just that it has produced useful things, but that we didn't know it would do so when it all started. Planning & construction on projects that would eventually form part of the LHC began as early as the late '70s and early '80s, roughly 40 years ago. That too was based on research & engineering that started even earlier. So if we're comparing the LHC and quantum computing, the tech behind the LHC has had more time to be developed and refined. When quantum computing reaches the same level of maturity, it may have just as many positive effects. Even so, arguing about timelines is probably beside the point: different research can take longer or shorter to have a beneficial impact.

Maybe there are other things that would benefit more from the same resources, but it's also not something we can know in advance. Plenty of early promising research is chased for years with no result. Some research takes decades for its significance and practical impacts to be felt. We have to pursue a lot of things or we'll never get much of anywhere.

I'm also curious though: The argument that quantum computing isn't doing enough, fast enough, has some merit. It's not quite zero sum, but there is a limited amount of money to go around, and maybe it could be better spent. What would be your pet project to advance in place of quantum computing? Personally, I'd want greater focus on energy research. I'm not looking to criticize your choice, I'm genuinely curious.

Like what, better ad targeting for social media?

Like fighting ageing or ignorance.

There are reasons to believe quantum computers will be really great at optimization problems we would encounter in simulating the molecular machinery of cells.

Ah? Interesting, a source would be welcome.

DWave builds machines they say are great for that kind of problem. I would assume, since they are not general purpose quantum computers, this represents a subset of what general purpose quantum machines would be good at.

https://arxiv.org/abs/0908.1921 looks like one such application.

Quantum supremacy is usually misunderstood, and it's likely quantum computers will not surpass transistor computers in every way; they will be much better at a subset of tasks only.

Right. And the "quantum computer" won't ever really exist anyway: we'll just end up with quantum functional units on regular computers. We attach GPUs and TPUs to regular CPUs, but we don't call the resulting machines "graphics computers" or "tensor computers".

Actually, the term "graphics workstation" isn't uncommon, and "gaming rig" is a very common term for machines with great big GPUs attached.

This threshold is commonly referred to as quantum supremacy. It could be that IBM just passed that threshold, but I'm by no means an expert on this.


> It could be that IBM just passed that threshold

I don't think they have just yet. There is a simulated QC on conventional hardware with around 50 qubits. We are close, though; it might still happen this year.

But even with 50 qubits, the amount of time per operation—if simulated classically—is at least a million+ times slower than what it would be for a quantum computer. This is like comparing disk to L1 cache. Does that count for supremacy? Mathematically, no. Does it count for some kind of advantage? Yes.
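For a sense of scale (my own back-of-the-envelope arithmetic, not from the thread): a full statevector simulation stores 2^n complex amplitudes, so memory alone becomes the wall well before 50 qubits:

```python
def statevector_bytes(n_qubits):
    # 2^n amplitudes, 16 bytes each (complex128: two 8-byte floats)
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits fit in 16 GiB of RAM; 50 qubits need ~16 PiB.
```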

That point in time is, "tomorrow".

It should be more like at least 10,000, according to this paper: https://arxiv.org/abs/1805.05224. But this is in a utopian world where we neglect quantum error correction and stability, and the fact that quantum computers are mostly non-programmable (there isn't even a one-qubit logic gate...)

TLDR quantum computing is going nowhere (excluding the low probability of a breakthrough in the field)

'non-programmable'... um what?

I don't keep tabs on quantum computing at all, but I get the distinct feeling that it isn't going to turn into anything important. Is there any reason to believe this will be relevant to regular developers in the next 20 years?

No, absolutely not.

If quantum computers happen (which is still a big if) then mostly two things will happen:

1. Cryptographers will have to switch many of their algorithms and things like TLS will get a bit slower and will need a bit more data transmitted.

2. It's interesting for science, e.g. for simulations.

For the regular dev I'm pretty sure it won't have any meaningful impact.

Also, even if quantum computers happen, they won't be on our desks any time soon. It'll probably be more like "you can rent a QC for specialized tasks in the cloud, but it will be relatively expensive and if you don't really need one you'll use something else".

You should already have switched to AES256 or better. The data can be captured and processed once a machine is available.

The risk with quantum computers is with public-key cryptography (where all modern PK crypto is broken with a sufficiently-powerful quantum computer). It is believed that attacks with a quantum computer against symmetric-key cryptography (like AES) gain no better than a square-root improvement (meaning that doubling key sizes eliminates the risk).
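The arithmetic behind "doubling key sizes eliminates the risk" (standard reasoning, sketched here for concreteness):

```python
def quantum_security_bits(key_bits):
    # Grover's search needs ~2^(n/2) cipher evaluations, so an n-bit
    # symmetric key offers roughly n/2 bits of security against it.
    return key_bits // 2

print(quantum_security_bits(128))  # 64: uncomfortably low long-term
print(quantum_security_bits(256))  # 128: matches AES-128's classical margin
```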

>where all modern PK crypto is broken with a sufficiently-powerful quantum computers

This isn't true - only methods based on Discrete Log are at risk, which includes RSA and ECC, but there are a lot of PK algos that are not at risk, like NTRU.

Here's a bunch of stuff related https://en.wikipedia.org/wiki/Post-quantum_cryptography

With 'the unknown' I prefer to focus very strongly on where known and theoretically known truths are.

Based on your statement (I don't recall offhand), it might be said that it is widely believed that RSA and ECC algorithms are vulnerable in a world where quantum computing is practical. The safety of other systems would require a more rigorous proof; the absence of known vulnerability does not imply safety.

I am not an expert in this field, but the understanding I have from the last time I spent an afternoon reading about it might be summarized as:

* Known/Suspected vulnerable systems use "classically-hard" problems with vulnerability based on small key (or subkey) size.

* Most of the 'theoretically more secure' systems make the working set size large enough, and complex enough, to eclipse the scope of stable quantum computers.

* Thus, the two attack vectors to consider for 'post quantum' cryptography are currently unthinkably large systems or weaknesses in algorithms that allow for attacking a subset which does fit within a quantum system.

If my understanding/memory is incorrect updates would be very helpful.

The other systems are provably as secure on quantum computers as they are on classical computers. Quantum complexity classes are well mapped out, and it is known that certain things cannot be done faster on a quantum computer. Related to crypto, only systems reducible to DLP (well, technically the Hidden Subgroup Problem, but only for certain groups) are affected. This has the rigor of mathematical proof: it's not handwaving.

And many systems do not reduce to this.

PQ-Crypto is still a very young research area. There is no widespread (or even niche) use of PQ-Crypto algorithms, no consensus on how safe any of the proposed algorithms are (and many proposals were found to be exceptionally broken -- on classical hardware -- within weeks of their submission[1]). I'm not really sure it's fair to say that just because there are some on-paper algorithms (which have no usage outside academia at the moment) that might solve the problem that it's false to say that "all modern PK crypto is broken with quantum attackers".

But if you prefer, you could imagine that I said "all modern PK crypto [that is actually in use today] is broken". Because, at the end of the day, that's all that really matters.

[As an aside, for RSA the security is provided by the hardness of factorisation which also breaks under QC attackers, not discrete log (it's discrete log for DH, and EC discrete log for ECDH).]

[1]: https://www.youtube.com/watch?v=ZCmnQR3_qWg

It's not exactly a young research area. We have had post-quantum public key cryptosystems based on linear codes for over 40 years.

Classic McEliece (https://classic.mceliece.org) was created in 1978 and is in the 2nd round of NIST's PQ-crypto contest ( https://csrc.nist.gov/projects/post-quantum-cryptography/rou...).

The PQCrypto conference started 13 years ago: https://pqcrypto.org/conferences.html

Chrome ran large scale experiments with NewHope support 3 years ago: https://security.googleblog.com/2016/07/experimenting-with-p...

Shor factoring for RSA works by period finding, which is exactly the discrete log problem, the basis of all the modern PK crypto under discussion. Read the wiki page.

I wrote a decently cited paper summarizing all this when I worked in quantum computing research around 2005, on arxiv. Track it down if you’re interested. I’ve followed the field for decades, giving talks once in a while on it. So I do know a bit of the details.

The computational power of QC is very well studied. It is certain that it cannot crack certain (most) problems any faster than a classical computer. Thus algorithms not isomorphic to the few problems QC is better at are just as secure on both classical and quantum computers. There is consensus on this - there is no question about it.

>all modern PK crypto [that is actually in use today]

NTRU is an IEEE standard, and is in wolfSSL, which is on many platforms. NTRU is also used in many commercial products from chat to relays.

Other non DLP methods are also used in commercial products and standards.

> Shor factoring for RSA works by period finding, which is all modern PK crypto discrete log. Read the wiki page.

But is it not the case that the original Shor algorithm was specifically about using periodicity for integer factorisation? (The wiki page for the "RSA problem" [1] also refers to integer factorisation, not discrete log.)

> It is certain that it cannot crack certain (most) problems any faster than a classical computer.

Yes, I was already well-aware of that. My point was that the overwhelming majority of PK crypto used today depends on the hardness of problems that are exponentially accelerated by QC algorithms.

> NTRU is an IEEE standard, and is in wolfSSL, which is on many platforms.

Okay, maybe then it has some non-academic use. But the overwhelming majority of PK crypto in use today (TLS, PGP, the vast majority of E2EE systems) is not secure against quantum attackers. Can we agree on that at least?

[1]: https://en.m.wikipedia.org/wiki/RSA_problem

>But is it not the case that the original Shor algorithm was specifically about using periodicity for integer factorisation

Periodicity is discrete log.

Discrete log is defined as follows: given a mathematical group G, a generator g in G, and an element h in G, find the power of g that gives h. This is a logarithm over a discrete set, a finite group. It is equivalent to finding the power r of h such that h^r = 1, since given either one, the extended Euclidean algorithm gives an efficient way to get the other. This number r is called the order of h, and there is, in general, no known efficient classical algorithm to find it.

This can be easy or hard, depending on the group.

Shor period finding takes the integers mod N (which form a group, an abelian group, about the simplest class of groups) and an integer a, and finds the power r of a such that a^r is 1 mod N. This is exactly the discrete log problem: you have found the order of the element a.

Shor is a special, easy case of the more general problem - he solves it efficiently for Abelian cyclic groups, but it has been extended to many, many other groups.

In fact, researchers realizing this was exactly the general problem Shor solved is why all discrete log problems are vulnerable to QC. His method of using a quantum Fourier transform (which has a nice generalization to groups) to find period (which FFT does nicely) is directly applicable to any group. Shor is not some odd outlier algorithm; it is exactly the core (and almost the only) algorithm QC is exponentially faster at.
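To see the reduction concretely: the only step in Shor's algorithm that needs a quantum computer is finding the order r; the rest is classical number theory. A toy sketch (mine, not from the thread) that brute-forces the order, which is exactly the exponential-time step the quantum Fourier transform replaces:

```python
from math import gcd

def order(a, N):
    # Smallest r > 0 with a^r ≡ 1 (mod N); assumes gcd(a, N) == 1.
    # This brute force is the step Shor's period finding does quantumly.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(a, N):
    # Given the order r of a mod N, recover a nontrivial factor of N.
    r = order(a, N)
    if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
        return gcd(pow(a, r // 2) - 1, N)
    return None  # unlucky choice of a; pick another and retry

print(shor_classical_part(7, 15))  # 7 has order 4 mod 15; gcd(48, 15) = 3
```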

>But the overwhelming majority of PK crypto in use today (TLS, PGP, the vast majority of E2EE systems) is not secure against quantum attackers. Can we agree on that at least?

Yes, but that is a far cry from your original claim. We only use those systems because they are not widely broken, and like all crypto, if one gets broken, we simply replace. This is not new.

Each is easily replaced by algorithms that quantum cannot touch (unless classical can too), so much so, that many places already use QC resistant algorithms to prevent nation-states messing with them.

> only methods based on Discrete Log are at risk, which includes RSA and ECC

RSA's security is based on the presumption that factorization is hard, which is a different problem than finding the discrete log. To attack RSA you would use Shor's algorithm, and to attack ECC you would use Grover's.

See page 2 of this paper:


See my answer above. Shor's factoring is efficient because period finding mod N is exactly discrete log.

To attack ECC it is directly discrete log, where quantum is exponentially faster than classical.

Grover reduces unstructured search from O(N) to O(√N), only a quadratic speedup; it halves the effective key length, which doubling the key size fully restores, so it is not a practical threat to symmetric crypto.

Page 2 of your link does not state how QC attacks these problems; it does say ECC is discrete log, though.

Both systems break precisely because QC solve DLP in these cases exponentially faster than classical, and this is all that’s needed.

> AES256 or better

I'm not aware of any symmetric cipher that the industry in broad consensus would consider "better" than AES-256.

But maybe by then it is already irrelevant?

Quantum computing won't be relevant to most regular developers, in the same way machine learning isn't relevant to most regular developers. Quantum computing might end up being useful for certain niche use-cases, such as heavy number crunching that can be made faster via quantum algorithms.

As to whether it will be usable within 20 years, time will tell.

And just as every company is trying to work out how to "use AI" they'll be trying to work out how to "use Quantum" in future, with all the typical buzzword bullshit that comes with it.

What is a "regular" developer then?

Maybe not to regular developers, but definitely to physicists[0], chemists[0][1], machine learning people[1], and cryptographers.

[0] - https://www.nature.com/articles/nature23879

[1] - https://ai.google/research/teams/applied-science/quantum/ - see 'Near-Term Applications'

Wouldn't the size difference help with CPU density?

I don't really know much about the topic, but my understanding is that it could potentially be denser than the theoretical limit of current transistor sizes.

Definitely not. Quantum computing building blocks have nothing that makes them inherently easier to miniaturize: quite the opposite, there is an enormous amount of classical hardware necessary to support a single quantum bit. Moreover, some form of "isolation" between the qubits is necessary, so you cannot just pack them together.

Current sizes of quantum bits: cavities are cm to mm scale; transmons and photonics can be much smaller, but nowhere near the size of modern transistors; ions need to be trapped in something and usually the scale is not better than the other options.

There is the hope that qubits can be made in silicon and use lithographic technologies, but then at best we will reach the density of current CPUs (but such a thing is science fiction at the moment).

It's hard to know whether that theory will translate into reality. The machinery required for isolating the qubits, configuring them for a computation, and reading them out at the end may prevent very much scaling.

Science fiction has hypothesized about quantum computing that involves macroscopically-sized bits of carefully constructed matter running a number of qubits best described with scientific notation, but it's not clear how one would set up problems or read anything out of such a thing.

Isn't this everyone who is coding these days? ;)

What about the vast legions of web developers?

I eagerly await the "quantum web".

Wait... patent pending. There's probably an unreasonably large sum of money to be made selling that label to unwitting trend-hoppers.

I guess some consider those of us building programs that actually do something useful for a business purpose to be "fake programmers".

So other programmers don't build useful things? But websites are always useful?

I'm also very unaware of actual papers coming out of the field but it does seem like it'll serve similar purposes to FPGA type work, where if you need to optimize a certain workload and you have the budget, you'll probably offload some code paths to a quantum computer / expansion-card. I do get the feeling that the "regular developer" won't have to deal with it or think about it any time soon.

It depends on the number of qubits they achieve. The larger the count, the more relevant it will be. But if it stays low, the impact will be low...

Yeah I was impressed that they had a fleet of quantum computers this size. Turns out I'm well over a decade behind on my sense of how many qubits one should have...

What's a good number of qubits to have in order to outperform a classical computer in a useful application of quantum computing?

My understanding is that a qubit is roughly the equivalent of a transistor, so we've got a long way to go. That said, "53-qubit" seemed like a weird number and strayed from my understanding, so take my comment as a starting point for more investigation.

A committee was involved somewhere.

53 bits is the significand in IEEE754 (Double precision FP).

I'm not sure about the next 20 years, but quantum computers are really good at running calculations in parallel and finding a result. I'm not 100% sure I'm accurate, but I believe if an algorithm is a good candidate to run on, say, a Spark cluster, a quantum computer would be much better at it.

I think the biggest change will be that quantum computing can really kill at machine learning, upping the numbers of nodes in neural networks with an exponential relationship to the number of qubits.

It might turn neural nets from a neat toy that Facebook uses to show us sort-of-good ad predictions into something that really cuts into a wide range of things that only humans are capable of doing.

Quantum computers are not generically good at all parallel problems. They have better asymptotic runtime for “brute force search” type problems, where you know there is some answer but you have to guess-and-check repeatedly. Roughly they change a 2^n brute force algorithm to a 2^(n/2) algorithm in that case. The main area these problems are important is cryptography.

So what you're saying is I should short sell Bitcoin?

You can read more about this issue here: https://en.bitcoin.it/wiki/Quantum_computing_and_Bitcoin

The article says "Attacking Bitcoin keys would require around 1500 qubits". So with 29 of these, is Bitcoin compromised?

No, you need 1500 coherent qubits, in a single system.

Would it be possible to evaluate n randomly generated neural nets to find the best one?

Maybe in theory, but you'd at least have to fit the entire neural net into the width of the computer. So we'd need a machine with millions of qubits? That's a bit hard to fathom.

> if an algorithm is a good candidate to run on, say, a Spark cluster, a quantum computer would be much better at it.

This is not true. Some parallelizable tasks will benefit from massive speed ups but most won't.

In 2018 at my local makerspace, one of the guys behind this research [0] did a talk about "noise interference" in quantum computers. He did a great job of explaining how conventional electronic components are unsuitable for the precise and isolated requirements of quantum computing. What really left an impression was the remark he made about error correction. It was something like: the then-current state-of-the-art qubit tech required eight qubits to error-correct every one qubit to the same level that current x86 computers are error-corrected, meaning that a viable quantum computer is a very, very long way away. [0] https://physics.aps.org/synopsis-for/10.1103/PhysRevD.96.062...

Error detection and error correction are very different things. It's true that many physical qubits (and a proper error-correction scheme) are required to implement a single logical qubit. But once a valid design for a single logical qubit (device, error correction, etc.) is realized, the limits of scaling the entire system change completely (infinite T1 on each qubit means you don't NECESSARILY have to have all of your qubits on the same device...)

I'd imagine that though error-checking is the biggest hurdle, we don't need quite the same level of error checking as an x86 computer for applications like breaking cryptography, since you can verify the result quickly on a classical computer.

> Some of these have started to use the available machines to work on real-world problems, though the current state of the art in quantum computing is still now quite ready for solving anything but toy problems and testing basic algorithms.

Isn't that a contradiction?

The first part means that anyone can sign up and run a program on a real quantum computer. The second part means that they're not powerful enough to be useful yet.


That's how it is in the article so I left it as-is. I tried to leave a message about the typo, but it was asking me to create an account so I gave up.


It's interesting to me how long this has taken to go from research to even the most limited commercial applications. My friends were writing quantum computer compilers and frameworks in grad school in the 90s. They had to start by building a quantum computer simulator, since no QC existed yet to test their stuff on.

I think however that once the hardware part is solved, the software part will move quickly, since people have been working on the software since long before the hardware existed.

My brother worked at Seagate in the 90s and would tell me about their research into chemical memory and "holocubes", and how they felt it was just around the corner.

Yet here we are, still using the same old hard disks. We have SSDs now, at least.

I don't really understand quantum computing; hopefully someone can shed light and let me know if my understanding is wrong. Isn't a qubit similar to a transistor that can hold more than 2 states, the implication being that it'll be faster than current SIMD operations?

It's weirder than that. Some quantum algorithms offer a big-O speedup over classical ones. Shor's algorithm runs in polynomial time, which is categorically faster than any known classical algorithm for integer factoring: the fastest classical algorithm is subexponential. For every subexponential algorithm, there will be some n at which Shor's algorithm is faster, and some slightly higher n at which it's much, much, much faster.
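A rough back-of-the-envelope of that crossover (the constants here are illustrative guesses, not measured figures; only the asymptotic shapes matter):

```python
import math

def gnfs_ops(n_bits):
    """Rough GNFS cost: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).
    Constant factors ignored -- illustrative only."""
    ln_n = n_bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

def shor_ops(n_bits):
    """Rough Shor cost: O(n^3) gates; the constant 72 is an arbitrary choice."""
    return 72 * n_bits ** 3

for bits in (512, 1024, 2048, 4096):
    print(bits, f"GNFS ~2^{math.log2(gnfs_ops(bits)):.0f}",
          f"Shor ~2^{math.log2(shor_ops(bits)):.0f}")
```

The subexponential curve starts out comparable but is left hopelessly behind by 2048 bits.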


In no way am I qualified to talk about this with any sort of certainty, but I have been studying EE for the past 2 years and live in a house with a physicist who is probably far better read on this topic.

Classical computers are nice because they are predictable: if you order some product online, it wouldn't be very helpful if the bit stream coming into your computer were fuzzy and muddled, making it hard to tell a 0 and a 1 apart. Quantum computing takes advantage of quantum properties such as the uncertainty principle and quantum entanglement to hold bits in a superposition of 0 and 1 until we "observe" the bit (quantum bits -> qubits). This allows algorithms, experiments, and other quantum workloads to be run much faster than on any classical computer, since more data can be encoded in a single bit.

To minimize the fuzz in a q-computer, they freeze the sh*t out of it to keep the energy at a minimum, along with using materials that excel at conducting heat away (like diamond). I believe the spin of an electron is what determines the qubit value, and when used in conjunction with quantum entanglement it will affect other electrons that are already entangled. My little knowledge ends here, but with this thought on quantum entanglement you can see how flipping bit values can allow for more ways to store information while the amount of "stuff" remains constant vs. a classical computer.

A quantum computer doesn't work much like a classical computer at all, so any analogy to transistors fails pretty quickly.

One key aspect of that is that all of the qubits are entangled. The qubit isn't in more than two states, but in 2 states at once, and N qubits are in 2^N states at once. It's not quite as simple as that: you can't really just set up any algorithm in a quantum computer the way you would with a classical one. But for those problems amenable to quantum computation (including, notably, prime factoring), they can be solved very fast.
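A toy illustration of that bookkeeping in plain Python (just tracking amplitudes in a dict, no quantum library; the "Hadamard on every qubit" step is hard-coded rather than built from gates):

```python
import itertools

# Simulating n qubits classically takes 2**n complex amplitudes:
# the register is in a superposition over all 2**n basis states.
n = 3
state = {''.join(bits): 0j for bits in itertools.product('01', repeat=n)}
state['000'] = 1 + 0j  # start in |000>

# Applying a Hadamard to every qubit spreads the amplitude uniformly:
# an equal superposition of all 2**n basis states at once.
amp = (1 / 2 ** 0.5) ** n
state = {k: amp + 0j for k in state}

print(len(state))  # 8 amplitudes for 3 qubits
print(sum(abs(a) ** 2 for a in state.values()))  # probabilities sum to ~1
```

This exponential blow-up in amplitudes is exactly why classical simulation of ~50 qubits is at the edge of what supercomputers can do.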

With 2^53 qubits we should be able to factor some very, very large prime numbers.

You will find that this machine can't even factor the number 15 properly, as it's not error corrected and isn't a general-purpose quantum computer. FWIW, nobody has factored the number 15 using Shor's algorithm yet; only using the subset of gates they know produces the numbers 3 and 5.

For those who are interested in the papers about this topic:


"Pretending to factor large numbers on a quantum computer (2013)"

"Of course this should not be considered a serious demonstration of Shor’s algorithm. It does, however, illustrate the danger in “compiled” demonstrations of Shor’s algorithm. To varying degrees, all previous factorization experiments have benefited from this artifice. While there is no objection to having a classical compiler help design a quantum circuit (indeed, probably all quantum computers will function in this way), it is not legitimate for a compiler to know the answer to the problem being solved. To even call such a procedure compilation is an abuse of language."

More references:


You keep repeating this, but as best as I can tell, it hasn't been factually accurate in a decade, unless you really want to quibble over the fact that they finished the calculation on a general purpose computer.


>Instead, it came up with an answer to the "order-finding routine," the "computationally hard" part of Shor's algorithm that requires a quantum calculation to solve the problem in a reasonable amount of time.

They weren't the first team to do this, either, they just miniaturized parts of the hardware.

The result you cite doesn't even say what you think it says; in fact it directly says they didn't factor a damn thing.

If I'm wrong and IBM's latest "quantum computer" can actually do the Shor algorithm on the number 15 using 4608 gate operations, I will publicly eat one of my shoes like Werner Herzog. Assuming I can get someone else to take the other side of the bet, of course.

It says:

> the chip itself didn't just spit out 5 and 3. Instead, it came up with an answer to the "order-finding routine"

O noes! The quantum computer only did the "order-finding" part of Shor's algorithm! ... But wait. Here's the Wikipedia page for Shor's algorithm:

> Shor's algorithm consists of two parts: 1. A reduction, which can be done on a classical computer, of the factoring problem to the problem of order-finding. 2. A quantum algorithm to solve the order-finding problem.

So, the article GP cited says that a quantum computer did the part of factoring the number 15 that actually uses a quantum computer. Nothing wrong with that.
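To make that split concrete, here's the classical wrapper around order-finding; the brute-force loop stands in for the quantum order-finding step, which is the only part that needs a QC:

```python
from math import gcd

def factor_from_order(N, a, r):
    """Classical post-processing in Shor's algorithm: given the order r of
    a modulo N (the quantum part's output), recover nontrivial factors of N."""
    if r % 2 != 0:
        return None  # need an even order; retry with a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None  # trivial square root; retry with a different a
    return gcd(x - 1, N), gcd(x + 1, N)

# For N = 15, a = 7 the order is 4 (7^4 = 2401 = 1 mod 15). Here we find
# it by brute force, standing in for the quantum order-finding routine.
N, a = 15, 7
r = next(r for r in range(1, N) if pow(a, r, N) == 1)
print(r, factor_from_order(N, a, r))  # prints: 4 (3, 5)
```

Everything above except the computation of r runs on a classical computer even in a "real" Shor demonstration, which is why finishing on classical hardware isn't itself a cheat.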

The article's link to the actual paper is broken, which makes it harder to tell whether, as you say, they cheated somehow, but here's another article about factoring 15 with a quantum computer https://science.sciencemag.org/content/351/6277/1068 which, so far as I can see, claims to have actually done the whole of (the relevant part of) Shor's algorithm on actual QC hardware.

Even assuming I humor you and refuse to notice they left out the gates to the wrong answer, and assuming I humor the authors in agreeing that this is a scalable form of the Shor algorithm (I deny both of these for the record) .... congratulations: by 2016 someone was able to factor the number 15. I guess quantum supremacy is right around the corner!

>to notice they left out the gates to the wrong answer

Can you explain this a little more? I've seen similar references to this elsewhere. If it means what I think it means, it's kind of a really big deal. But it may mean something else.

That doesn't explain in what sense this entirely different paper from 20 years later allegedly "left out the gates to the wrong answer".

The paper I linked to says, e.g., the following:

> Subsequent multipliers can similarly be replaced with maps by considering only possible outputs of the previous multiplications. However, using such maps will become intractable, [...] Thus, controlled full modular multipliers should be implemented.

So in at least one case they are explicitly not taking a particular shortcut because it doesn't scale to factorizing larger numbers. If you say they are taking other shortcuts that don't scale by "leaving out the gates to the wrong answer", then I think you owe us an actual explanation of what shortcuts they are taking and how you know they're taking them, rather than just a link to a paper from 1996 that says how to take some shortcuts.

Frankly I don't owe you or the muppets attempting to participate in this thread by zombie-walking "MUH PRESS RELEASES" a damn thing: if you want to believe in pixie dust, you're free to do so. None of the "quantum computing results" factoring the number 15 have done the actual Shor algorithm; they've all used the shortcut described in this paper. Someone below posted another paper pointing out the same thing, as well as some forum discussion pointing out the same thing.

It's not my fault you believe in press releases without understanding what they mean.

I haven't said a thing about press releases.

The paper I linked to (1) doesn't cite Preskill et al and (2) explicitly claims not to be taking shortcuts that don't generalize to numbers other than 15; as well as the bit I quoted earlier, they say "for a demonstration of Shor’s algorithm in a scalable manner, special care must be taken to not oversimplify the implementation—for instance, by employing knowledge about the solution before the actual experimental application" and cite an article in Nature decrying cheaty oversimplifications of Shor's algorithm.

I don't see anything in their description of what they do that seems to me to match your talk of "deleting the gates that lead to the wrong answer".

(The Kitaev paper they cite also doesn't cite Preskill et al, unsurprisingly since it predates that, and also doesn't contain anything that looks to me like cheaty shortcut-taking.)

It is, of course, possible that that paper does take cheaty shortcuts and I've missed them. It is, of course, possible that its authors are flatly lying about what they're doing, and trying to hide the evidence by not citing important prior papers that told them how to do it. If so, perhaps you could show us where.

Otherwise, I for one will be concluding from the surfeit of bluster and absence of actual information in your comments so far that you're just assuming that every alleged "factoring of 15" is indulging in the dishonesty you think they are, and that you aren't interested in actually checking.

(You don't, indeed, owe anyone anything. It's just that if you want to be taken seriously, that's more likely if you offer something other than sneering and bluster.)

Again, as I said before: even assuming they didn't take shortcuts (the paper you mention is essentially "we took shortcut X instead of Y", and of course no quantum error correction is evident): congratulations, your miracle technology is now capable of factoring the number 15. All they have to do now is add all the things that make quantum computing useful, and eventually they will honestly be able to factor the number 21. Should happen ... I dunno, care to make a prediction when?

Pointing this out is apparently necessary; I don't know why it triggers people so to point out that virtually the entire field up to the present day has consisted of grandstanding quasi-frauds. And that you apparently have to understand things and read extremely carefully to notice, because whatever honest workers there may be don't see it as in their interest to point such things out as it may upset their rice bowls.

You have someone else in another thread insisting that annealing can factor giant prime numbers, which is equally bullshit. Do you expect me to patiently, precisely and (somehow) dispassionately point out every line of bullshit in every quantum computing paper published? The mere fact that the field is pervaded by bullshit, publishes papers and announcements that are known to be bullshit, and promises all kinds of pixie-dust bullshit on a regular basis ought to give you some slight skepticism, some Bayesian prior that the grand pronunciamentos of this clown car should be treated with a bit of skepticism.

I agree that "can factor 15" isn't terribly impressive. (Well, it's kinda impressive, because making a quantum computer do anything at all is really hard.) I very much agree that D-Wave's quantum annealing stuff is bullshit, especially if anyone is claiming it's any good for factoring large numbers. I don't expect you, or anyone, to point out every bit of bullshit in every paper published.

I'm all in favour of pointing out bullshit. But there's a boy-who-cried-wolf problem if you just indiscriminately claim that everything is the same kind of bullshit without checking it.

Of course that doesn't mean that you're obliged to check everything. You can say "I expect this is bullshit of the usual sort but haven't checked". But if you say "this is bullshit of the usual sort" without checking and it turns out that that isn't the case (it looks to me as if the paper I linked to isn't the kind of bullshit you describe) then you take a credibility hit that makes all your bullshit-spotting much less useful than if you were more careful.

God, you've got a bone to pick, don't you? I see you every time there's an HN thread about quantum computing. Scott Aaronson posting under a sock account?

Unlike you, I post under my name. Aaronson doesn't know what he's talking about half the time.

Some people laugh at the chicanery of scumbags who claim fully autonomous vehicles are right around the corner. I laugh at the frauds and mountebanks of "quantum information theory" and the muppets who believe everything they say. De Gustibus.

I changed my mind, you called someone a bugman in another thread, so while you're wrong about the current state of QC you're alright in my books

An explanation in comic form with text by quantum computing researcher Scott Aaronson: https://www.smbc-comics.com/comic/the-talk-3

QC is about setting up interference patterns between the qubits; it is fundamentally unlike classical computing. For some problems, algorithms can be designed that use those interferences to compute things, like the famous Shor's algorithm for polynomial-time factoring.

QC speeds things up only in the cases where an asymptotically faster algorithm can be designed this way, it is not a general purpose parallelization mechanism.

And we don't know many things it speeds up for sure! For example: we don't actually know that there isn't a classical polynomial-time factoring algorithm we just haven't found. Here is a recent example of a quantum algorithm leading to the discovery of an equivalent classical algorithm: https://www.quantamagazine.org/teenager-finds-classical-alte...

> Isn't a qubit similar to a transistor that can hold more than 2 states ...

Not really. From an information theory perspective, holding two states is the most efficient way of computing. Or at least, there is no speed-up gained from doing computations in any other base than 2.

Quantum computers can be in both states at the same time (as far as that interpretation of quantum mechanics goes). So, if you keep the qubits all in this superposition state, you can calculate multiple things at the same time.

>From an information theory perspective, holding two states is the most efficient way of computing.

Assuming the cost of operations on each digit is a multiple of its number of states, the most efficient base for computing with arbitrary numbers is 3, which is the closest integer to the optimal base e:



Yes, you are correct. I said it all wrong.

From a physical perspective, holding two states is the most efficient way of computing. I.e. the electronics required to compute in bases other than 2 involve higher power and present difficult challenges.
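The "radix economy" claim above is easy to check numerically: taking the cost of a digit as proportional to its number of states, the cost of representing numbers up to N in base b is proportional to b / ln(b), which is minimized at b = e:

```python
import math

# Radix economy: digits needed ~ log_b(N) = ln(N)/ln(b), each with b states,
# so total cost ~ ln(N) * b / ln(b). The N-independent factor b/ln(b) is
# what varies with the base.
costs = {b: b / math.log(b) for b in (2, 3, 4, 10)}
for b, c in costs.items():
    print(b, round(c, 3))
```

Base 3 comes out cheapest among integers (bases 2 and 4 tie exactly, since 4/ln 4 = 2/ln 2), which is why ternary logic keeps getting proposed and, as noted above, keeps losing to binary for practical electronic reasons.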

Since other people have already explained how qubits are not at all similar to transistors, I'd like to mention that it is not comparable at all to SIMD and there is no indication that quantum computers will be better at vector math than classical computers.

I found this video to be a great explanation https://www.youtube.com/watch?v=OWJCfOvochA

It explains it at 5 different levels of CS knowledge.

The gist of it is:

>> that can hold more than 2 states

--> that can hold more than 2 states AT THE SAME TIME (in weird quantum conditions)

So you're not trying 00, 01, 10, 11 one at a time; you're trying [00|01|10|11] all at the same time, as well as other states in between 0 and 1.

So the big deal about quantum computers is that they're really fast at solving certain very specific problems, but I've never seen numbers on how much better they are at those problems than conventional computers. I realize they work on completely different principles, but it should be hypothetically possible to make a normal computer solve the same problem, even if it were to take a decade to do what a quantum computer can do in a microsecond.

So, how much faster are they? A million times faster? A trillion? A googol times faster?

It depends on the problem. Let's take the 2048-RSA integer factorization problem.

Back of the envelope calculation:

The best classical algorithm takes ~10^34 steps. The fastest computer on earth, the Summit supercomputer, runs at 200 petaflops (2.5M cores, ~10^6x faster than your desktop computer).

Actual quantum computers perform at 1 million operations/s per qubit

For 2048-RSA, Shor's algorithm needs 4096 non-noisy qubits (we are far from that) and runs in ~10^7 steps.

So for this particular problem:

Best classical computer (as of today): 1 billion years

A (hypothetical) quantum computer: ~10 seconds

Yeah, a trillion times faster...
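Spelling out the arithmetic (all inputs are the rough figures quoted above, not measured values):

```python
# Back-of-envelope for 2048-bit RSA factoring, using the comment's numbers.
classical_steps = 1e34
summit_flops = 200e15          # 200 petaflops
quantum_steps = 1e7
quantum_ops_per_sec = 1e6      # ~1 MHz effective gate rate

seconds_per_year = 3600 * 24 * 365
classical_years = classical_steps / summit_flops / seconds_per_year
quantum_seconds = quantum_steps / quantum_ops_per_sec

print(f"classical: ~{classical_years:.1e} years")   # about 1.6 billion years
print(f"quantum:   ~{quantum_seconds:.0f} seconds")
```

Strictly the ratio works out to more like 10^15, so "a trillion times" is, if anything, an understatement.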

Interesting, thank you. It makes me wonder if we'll ever find a way to make quantum computers do "easy" math as well as they do difficult math, if we'll ever have general-purpose computers that do their arithmetic and rendering on a quantum processor instead of a transistorized one. Is it even hypothetically possible for a quantum computer to do something like render a web page using less electricity than a conventional cpu?

Question from someone who only knows what is in the popular press:

I've read articles that debate the existence of any quantum computers. Are we past that now, or are there people out here that would question whether these are real quantum computers?

Nobody has claimed yet having anything that is a scalable error-corrected quantum computer (or whoever has claimed it was never taken seriously). But we are working hard on getting there, and this 53-qubit "quantum computer" is indeed a machine that can compute things, and it uses quantum effects (so it is a "quantum computer"). It is not a "general" or "error corrected" or "scalable" quantum computer. The comparison in sibling comments is very apt: this, and other research hardware of today, relates to a general "useful" quantum computer the way a bunch of vacuum tubes relates to the first classical CPU.

Seeing such a contraption reminds me that "digital" stuff is really just the natural laws of electricity and whatnot applied to computation. It also makes me think of the biological computers that will come about one day, and how people will think it's crazy to "make" a computer when you can just grow one.

I think scientists need a different material for managing quantum state before we see some real progress.

What are the actual practical use cases for quantum computers right now?

Writing and selling books on how to program quantum computers.

When using a quantum computer, what is the act of "observation" that collapses the qubits to 0s and 1s? Does the observer have to be a human?

How the measurement is done depends on the physical implementation of the qubit. For example, the state of the qubit could affect the resonance frequency of a coupled microwave cavity, which can be measured using RF techniques. Obviously at some point the experimenter has to check the result of the measurement, which makes the question of what would have happened without a conscious observer impossible to answer.

Very interesting. Then does the observation of one qubit collapse them all at the same time, or is it possible to decide to read just one qubit at a time?

As I understand it "observation" is any interaction that leads to decoherence. No complex neural wetware required.

So no Schrodinger's Cat? The cat is the observer!

Sounds natural and obvious, so why haven't I heard this solution to Schrodinger's Cat many times before? Why is Schrodinger's Cat still considered a problem worth discussing?

Schrodinger's Cat is Schrodinger's way of saying the Copenhagen interpretation is probably wrong, I think. It was an absurd example that took on a life of its own.

It's actually done by little gnomes inside the qubits.


I wonder if you can factor the number 15 on it yet. Actually I don't wonder; I know for a fact that they can't do it.

lmao they did this 2 decades ago: https://arxiv.org/pdf/quant-ph/0112176.pdf

No, actually they didn't. The Shor algorithm applied to an n-bit number requires 72 * n^3 gates; for 15 (4 bits) that's 4608 gates, and that's leaving out quantum error correction, which would require even more gates. This has never been done.

The result you cite: they don't even consider those early NMR "computers" to be quantum in any meaningful sense. They're Avogadro computers.
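(For what it's worth, the arithmetic behind the 4608 figure, taking the 72 * n^3 estimate at face value — the constant 72 is the commenter's figure, not a canonical one:)

```python
def shor_gate_estimate(n_bits, c=72):
    """Gate-count estimate c * n^3 used in the comment above; the
    constant 72 is that commenter's figure, not a canonical value."""
    return c * n_bits ** 3

# 15 is a 4-bit number
print(shor_gate_estimate(4))  # prints: 4608
```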

ooookay a more recent demonstration: https://arxiv.org/pdf/1804.03719.pdf

You said the number 15 was never factored on a quantum computer; this is false.

You can run as many gates as you want; go ahead and run 4608 gates in Qiskit right now. The measurements will be random, but you can do it.

IDK where you got that equation for the number of gates, but it's probably for the general case; it doesn't take into account the fact that different gate sets can be used to reduce the total.

Also confused about the error correction part: the whole point of error correction is to make the coherence time independent of the number of 'gates' in your circuit. So yes, with error correction you get more gates, but you also get an effectively infinite coherence time...

>you said the number 15 was never factored on a quantum computer - this is false.

I guess it depends what you mean by "factored" and "quantum computer". Using the generally accepted definitions, the number 15 has never been factored on a quantum computer using the Shor algorithm.

Yes, I am talking about the general case where you don't leave out the gates for 2 and 7 being factors of 15. That's what most people mean by factoring. Stating the answer because you know it already isn't useful. LARPing by running the algo through the "right" gates also isn't useful.

You are just plain wrong.

And you failed to consider quantum annealing as an alternative to Shor.

You're right, I do fail to consider this, as it's not really an alternative to Shor: annealing is horse shit whose computational complexity nobody can pin down. Not even D-Wave thinks it might be. [1]

[1] https://www.nature.com/articles/s41598-018-36058-z

Complexity in what terms? For a classical computer?

Besides, the empirical data shows otherwise. It takes 12 qubits to factor 15. We're up to 53 now.

With quantum annealing, a 20 bit number has been factored with 97 qubits. Not on a real quantum computer yet, of course.

So I have no idea what you are talking about.

> Not on a real quantum computer yet, of course...

Erm, OK. I guess we agree that nobody has factored the number 15 on a quantum computer yet.

Maybe you should read the paper I helpfully linked you above.

Yes they have. You are spreading lies and FUD: https://www.google.com/amp/s/phys.org/news/2016-03-quantum-f...

I was referring to the quantum annealing example, because no 97 qubit quantum computer exists yet.

I'm not spreading FUD; I am correcting misinformation from muppets whose understanding doesn't go beyond press releases. Nobody has yet done a Shor factorization of the number 15, full stop; and even if someone's press release says so, there is no scalable way of factoring large integers.

There is. Quantum annealing. I think you're just trolling at this point. Or do you not care for much reading?

Quantum annealing will never be used for extracting prime factors from large integers, and hasn't even managed to factor 15 into 3 and 5. Even if you click your heels together three times and wish for it really hard, it's not going to happen.

Did you read the nature article, or just the press releases?

Uh huh. Did you?

> Both methods requires O(log2(N)) qubits in total, where N is the number to be factored. The novelty of our demonstration of quantum annealing for prime factorization is based on the reduction in quantum resources required to execute factoring and the experimental verification of the algorithmic accuracy using currently available hardware. As a proof-of-concept, we have demonstrated these methods by factoring integers using the D-Wave 2000Q quantum annealing hardware, but these methods may be used on any other quantum annealing system with a similar number of qubits, qubit degree of connectivity, and hardware parameter precision. Assuming that quantum annealing hardware systems will continue to grow both in the number of qubits and bits of precision capabilities, our methods offer a promising path toward factor much larger numbers in the future.

And there is this too: https://link.springer.com/article/10.1007%2Fs11433-018-9307-...

Are you just going to sit there and lob lame insults or do you have anything meaningful to contribute?

Even the first line of that paper is false.

"RSA cryptography is based on the difficulty of factoring large integers, which is an NP-hard (and hence intractable) problem for a classical computer."

That is incorrect: there is no proof that factoring is NP-hard. Anyway, you can hardly expect me to take anything they say after this seriously.


I just... I can't.

What workloads would I run on these?

On this particular one, probably only research code that tests various small scale optimization algorithms and tests ways to "calibrate" the hardware. In a couple of years, on the next gen of such hardware, the hope is to run things like optimization algorithms and chemistry simulations that just begin to outperform what is possible with classical hardware of reasonable size and cost.

So it will eventually run stochastic gradient descent faster than a GPU?

The short answer is no. The long answer is much more interesting.

The premise of your question is a bit misleading. Quantum computers do not run classical algorithms faster, rather they run algorithms specifically designed to take advantage of the quantum coherence. For many tasks this does not provide an advantage. For some very special (and occasionally useful) tasks, the advantage seems to be exponential. Simulating quantum mechanics (as in chemistry, material design, drug design) is one of these tasks. Some linear algebra tasks are also faster. Lastly, some researchers are hopeful that novel gradient-free optimizations algorithms exist for quantum hardware, but this is a bit more controversial.

Can existing quantum computers be considered evidence for parallel universes?


(tl;dr: no one knows)

Existing quantum computers are evidence against a strawman of the Copenhagen interpretation, which is "everything the size of an atom or smaller is quantum, everything bigger is classical, and whenever the two touch, a collapse occurs". This is of course silly and arbitrary, and has been known by physicists to be false for nearly as long as quantum mechanics has been around; there are plenty of quantum effects that involve objects bigger than an atom. Quantum computers are simply another example.

Any genuine interpretation of QM will agree on what a quantum computer outputs, because by definition, interpretations are ways of changing the words you drape around a calculation, not the result of the calculation itself. So quantum computers don't support or oppose any genuine interpretations. They only oppose strawmen.


