
The Era of Quantum Computing Is Here. Outlook: Cloudy - jonbaer
https://www.quantamagazine.org/the-era-of-quantum-computing-is-here-outlook-cloudy-20180124/
======
sundarurfriend
> Note that I’ve not said — as it often is said — that a quantum computer has
> an advantage because the availability of superpositions hugely increases the
> number of states it can encode, relative to classical bits. Nor have I said
> that entanglement permits many calculations to be carried out in parallel.

I'm really glad to find they did not go for the usual pop explanations, and
even called them out as not really capturing the essence of quantum
computing. It takes courage not to hide behind these inaccurate
descriptions, and instead just give the unsatisfying truth that there's no
simple layman-level description beyond "quantum mechanics somehow creates a
“resource” for computation that is unavailable to classical devices"; at least
not with today's easily available mental models[1] we have from classical
computing.

[1]
[http://lesswrong.com/lw/kg/expecting_short_inferential_dista...](http://lesswrong.com/lw/kg/expecting_short_inferential_distances/)

~~~
Klathmon
I'm also glad they didn't fall back on the "easy" explanation. Funnily enough,
an SMBC comic [0] is what made quantum computing "click" for me, and showed
how it's not really easily mappable back to classical ideas. The comic was a
joint effort between Scott Aaronson and SMBC creator Zach Weinersmith, and as
far as I can tell is a pretty faithful explanation of the basics without
diving into the technical details.

[0] [https://www.smbc-comics.com/comic/the-talk-3](https://www.smbc-comics.com/comic/the-talk-3)

~~~
connorelsea
If this comic isn't diving into technical details then I must be an idiot

~~~
Klathmon
I meant more that it's not trying to act like a whitepaper and "prove" the
concepts it's explaining, which I think a lot of explanations of really
complex topics end up spending way too much time on and lose the reader.

There are parts of the comic that I don't even begin to understand (I honestly
couldn't tell you if Hilbert space is even a real thing, and I don't even want
to begin to think about how or why amplitudes can be complex, and what you
would possibly do with a complex probability...), but the whole idea of
amplitudes being loosely analogous to probabilities, and of interference being
the real "secret sauce" that allows quantum computers to gain a big advantage
on some problems, is what finally made it click for me. And it got across the
point that the real difficulty with quantum computing is going to be in
isolating these "qubits" from the world (which made the originally linked
article much easier to read).

I'd also attempted to read a bunch about it from many different sources
before this, so I had some primer on the ideas behind it, but I couldn't
understand how the parts fit together, or why they were such a big deal (often
flip-flopping between the two feelings depending on the metaphors the last
thing I read used).

~~~
sundarurfriend
> I don't even want to begin to think about how or why amplitudes can be
> complex, and what you would possibly do with a complex probability

I don't pretend to completely understand it either, but one crucial point here
is that these complex amplitudes and their interactions are the "real" reality
- they are "more real" than the normal probabilities we are used to, in that
they are the things that actually exist in the universe and give rise to the
'normal' probabilities that we see.

A lot of mental resistance to quantum mechanical ideas comes from thinking
quantum mechanics is weird, instead of thinking of quantum mechanics as normal
and of us as weird[1], because of the scale at which we live and normally
observe things.

[1] paraphrased from Eliezer Yudkowsky, Harry Potter and the Methods of
Rationality, Ch 36

~~~
randomsearch
To chime in on the probabilities issue: they aren’t really probabilities.

Probability amplitudes are something we don’t really understand, but it is
true that once we have set a basis for measurement, we can treat the
amplitudes with respect to that basis as probabilities (written in complex
form).

However, this should not be interpreted as “quantum states are probabilistic
mixtures of base states” - changing the measurement basis screws up that
intuition. For example, if our new basis has one base vector that is equal to
the quantum state we are measuring, suddenly our outcome is certain.

Oh I wish I could explain this better.
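To put some numbers on it, here is a toy sketch (my own example, not from the comment above): the same complex-amplitude state looks 50/50 "random" in one measurement basis and perfectly certain in another.

```python
import math

s = math.sqrt(0.5)
state = [complex(s, 0), complex(s, 0)]      # the state |+>

# Born rule in the computational {|0>, |1>} basis: 50/50, looks probabilistic.
probs_z = [abs(a) ** 2 for a in state]

# Re-express in the {|+>, |->} basis by projecting onto its basis vectors.
plus, minus = [complex(s), complex(s)], [complex(s), complex(-s)]
amp_plus = sum(b.conjugate() * a for b, a in zip(plus, state))
amp_minus = sum(b.conjugate() * a for b, a in zip(minus, state))
probs_x = [abs(amp_plus) ** 2, abs(amp_minus) ** 2]
# probs_z ~ [0.5, 0.5], but probs_x ~ [1.0, 0.0]: the outcome is certain
# when one basis vector matches the state being measured.
```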

~~~
sundarurfriend
I feel like that explanation almost gets me there, but understanding it is
just out of reach.

Out of my many questions (for which I should probably seek books and videos),
I'll ask this one:

> changing the measurement basis screws up that intuition. For example, if our
> new basis has one base vector that is equal to the quantum state we are
> measuring, suddenly our outcome is certain.

How does one change the measurement basis - is it a mathematical exercise done
on paper, or is it an actual physical act?

Is this what is meant by collapse of a wave function?

Is this what we are doing when we 'measure' Schrodinger's cat at the end of
the experiment and suddenly have a certain outcome?

~~~
randomsearch
Different measurement instruments have different bases. But we can also model
our quantum system and imagine what would happen if we created a new
instrument with any basis we choose.

Measurement is indeed wave function collapse: a quantum state in a
superposition (with regard to our measurement basis) is measured and produces
a definite answer.
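A toy sketch of that collapse (assumed example): sample an outcome with probability |amplitude|^2, then replace the state with the definite basis state you observed.

```python
import math
import random

s = math.sqrt(0.5)
state = [s, s]                      # |+> written in the computational basis

def measure(state):
    """Sample an outcome via the Born rule and collapse the state."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = [0.0, 0.0]
    collapsed[outcome] = 1.0        # post-measurement state is definite
    return outcome, collapsed

outcome, collapsed = measure(state)
# Measuring the collapsed state again always gives the same answer:
# the superposition is gone.
repeat_outcome, _ = measure(collapsed)
```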

------
reikonomusha
Rigetti Quantum Computing [0] is a YC startup that actually manufactures real
quantum computers that are accessible through Forest.

There are a couple neat videos about the tech. [1,2] You can access the
service through a Python API [3].

Disclaimer: I work there.

[0] [https://www.rigetti.com](https://www.rigetti.com)

[1] “Lisp at the Frontier of Computation”
[https://youtu.be/f9vRcSAneiw](https://youtu.be/f9vRcSAneiw)

[2] Rigetti talk at QIP 2017
[https://youtu.be/IpoASc18P5Q](https://youtu.be/IpoASc18P5Q)

[3]
[https://github.com/rigetticomputing/pyquil](https://github.com/rigetticomputing/pyquil)

------
ggm
I asked some people a few years back whether some equivalent of Moore's law
applied to qubit count, and what it meant for cryptography. Their answer was
that there wasn't good evidence of a direct link, at the qubit density we had
then, to achievable breaks in RSA, but also that the fundamental metric was
probably wrong right now. Yes, how many qubits is interesting. But so are how
stable they are, how many states you can demonstrate, and which specific open
questions you target that QC can address.

~~~
dsacco
My area of research is primarily post-quantum cryptography, not strict quantum
mechanics. That said, based on my understanding of the implementation details
of quantum computers (as opposed to e.g. quantum complexity), I’m pessimistic
about quantum computers being productively used against real world RSA in the
next 20 years. I’d conservatively estimate that it could happen in 50 with a
series of breakthroughs.

I think quantum computers will be useful for some applications (maybe even
many) and commercially available within the next decade or two, I just don’t
think cryptographic breaks will be possible on them until they advance well
beyond whatever is first brought to market. You need logical qubits for
polynomial-time cryptanalysis of RSA, and many of them. To get these logical
qubits, you need so many more physical qubits that it’s not really productive
to map our current records (~50 qubits) to what we’d need. In fact, I predict
we’ll get to quantum cryptanalysis via a paradigm shift that approaches the
problem in a fundamentally different way long before we master the requisite
error correction to have enough physical and logical qubits necessary under
the current paradigm.

~~~
awelkie
Are quantum computers useful for cryptanalysis in the case where you don't
quite have the necessary number of qubits? Like let's say you want to factor
an integer N, which would require n qubits, but you have a QC with only n/2
qubits. Could you use the QC to narrow down the search space and then finish
with a classical computer, or is the QC completely useless in this case?

------
seiferteric
I am out of the loop - are there any "real" quantum computers yet? I was under
the impression that no one was really making actual quantum computers yet,
just some sort of annealing machine like D-Wave. Are these IBM machines
finally "real" quantum computers?

~~~
obastani
D-Wave is not a "real" quantum computer in the sense that it is not
asymptotically faster than a classical computer. Quantum computers with very
few qubits have existed for a while -- I believe the current record is 17
qubits [1]. I believe that to factor an n-bit integer, you need roughly n
qubits. So right now, we could factor 17-bit integers, which isn't very useful
yet. However, I have read that for quantum simulations, 30-qubit quantum
computers might already be useful.

[1]
[https://en.wikipedia.org/wiki/Timeline_of_quantum_computing](https://en.wikipedia.org/wiki/Timeline_of_quantum_computing)
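To see why 17-bit integers aren't useful targets: any 17-bit number is factored instantly on a classical machine. A quick illustration (my own example):

```python
def trial_division(n):
    """Return the prime factorization of n by simple trial division."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# 130177 = 349 * 373 fits in 17 bits (2**17 = 131072), and a classical
# computer factors it in microseconds.
result = trial_division(130177)
```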

~~~
krastanov
It is worse than that. To factor an n-bit number you need roughly n "logical"
qubits, i.e. qubits that can store information almost indefinitely. The
quantum computers that we have contain low-quality "physical" qubits that lose
their content after a few milliseconds. You need to put an error-correcting
code on top of the physical qubits to encode the logical qubits. You can
expect to need 100s or 1000s of physical qubits per logical qubit. I.e., to
break modern public-key encryption you need a computer with millions of
physical qubits. Today we have fewer than 50 in the best research hardware on
the planet.
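The back-of-the-envelope arithmetic (assumed round numbers, following the estimates above):

```python
rsa_bits = 2048                    # a common modern RSA modulus size
logical_qubits = rsa_bits          # "roughly n logical qubits" to factor n bits
physical_per_logical = 1000        # assumed error-correction overhead
physical_needed = logical_qubits * physical_per_logical
# physical_needed comes out around 2 million, versus <50 physical qubits
# in today's best research hardware.
```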

~~~
marcosdumay
This one I don't get. Isn't "a few milliseconds" more than enough time to
gather a guess on integer factorization (and most of the useful problems)?
Wouldn't it suffice to run your program again and again over the same short-
lived qubits?

------
codekilla
For a great perspective read this:

[https://arxiv.org/abs/1801.00862](https://arxiv.org/abs/1801.00862)

This is a write-up of John Preskill's keynote talk at Q2B this year.

------
vtomole
There are a lot of proposals for making logical qubits. One of the more
promising is the two-dimensional surface code [0]. Current experiments suggest
that it is possible to scale superconducting processors to around 100 physical
qubits [1] of relatively high quality. It should be possible to perform
significant error-correction experiments on them.

As the article points out, noisy quantum computers won't be useless. They
could be used to speed up some optimization and quantum chemistry tasks.

[0]:
[https://en.wikipedia.org/wiki/Toric_code](https://en.wikipedia.org/wiki/Toric_code)

[1]:
[https://www.nature.com/articles/s41534-016-0004-0](https://www.nature.com/articles/s41534-016-0004-0)

------
currymj
Microsoft's Q# is interesting. It's framed in a way that seems weirdly
premature, but it is probably the nicest platform to learn about and play with
quantum algorithms. Better than multiplying huge matrices in Octave, for sure.
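For a sense of what that matrix multiplication looks like (a sketch of my own, not Q# or Octave code): simulating n qubits classically means tracking a state vector of 2**n complex amplitudes, which is why simulators hit a wall quickly.

```python
import math

s = math.sqrt(0.5)

def apply_hadamard(state, target):
    """Apply a Hadamard gate to qubit `target` of a state vector."""
    out = [0j] * len(state)
    for i, amp in enumerate(state):
        flipped = i ^ (1 << target)
        if (i >> target) & 1 == 0:     # H|0> = (|0> + |1>) / sqrt(2)
            out[i] += s * amp
            out[flipped] += s * amp
        else:                          # H|1> = (|0> - |1>) / sqrt(2)
            out[flipped] += s * amp
            out[i] -= s * amp
    return out

n = 3
state = [0j] * (2 ** n)
state[0] = 1 + 0j                      # start in |000>
for q in range(n):
    state = apply_hadamard(state, q)
# All 2**n amplitudes are now equal: a uniform superposition.
```

Doubling the qubit count squares the state-vector size, so even a 50-qubit simulation needs petabytes of memory.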

~~~
wRastel27
It's definitely the most developer-friendly language, borrowing heavily from
their functional F# language. Microsoft has placed a heavy bet on simulation
so that they have a bit more runway before they need to produce a working
quantum computer based on a topological architecture. It's interesting that
they are targeting developers, whereas most other languages are targeting
scientists.

~~~
cbHXBY1D
It's written in F# so it makes sense it draws inspiration from it.

------
Robotbeat
The article, as carefully as it seems to be written compared to many pop sci
articles on the topic, contains many errors or mischaracterizations. For
instance:

 _Some researchers think that the problem of error correction will prove
intractable and will prevent quantum computers from achieving the grand goals
predicted for them._

The author then provides a quote intended to justify this claim, but the
quoted mathematician merely points out the difficulty of error correction (by
noting, correctly, that quantum error correction is more difficult than
merely proving quantum supremacy, i.e. producing a quantum result that cannot
accurately be replicated using a classical computer), and doesn't say the
problem of error correction _will_ prove intractable.

The author seems to be painting a picture of quantum computing that is less
certain than is warranted. A counter-point to the popular hype of quantum
computing is necessary, but it shouldn't include such inaccuracies.

------
Kequc
Forgive me if I've missed it, but I haven't seen any evidence that quantum
computing has been used to solve even a very simple calculation. It seems that
the purpose behind trying to build a quantum computer is that you really need
to be out in front, not necessarily that it is possible.

Are quantum computers just the fusion reactors of the computing world? Maybe
one won't be built for 20 or 30 years, maybe longer.

~~~
zitterbewegung
At this point, yes. We need ~70 years of research to get to something useful
(I asked the professor I studied quantum computing with about this).

~~~
jessriedel
This is not a typical view among experts. Most numbers quoted are between 10
and 30 years for commercially useful devices.

~~~
brain5ide
Those quotes have been around for the last 10 to 30 years, however. So 70 is a
very sane number.

~~~
krastanov
I work in the field and I will have to disagree - 20 years ago _nobody_
serious would have claimed we could build a quantum computer in 10 years.
There are very clear levels we need to reach to have something useful (in
terms of lifetime of the qubits and number of connected qubits), and there has
been an extremely clear upward slope on both of those measures over the last
15 years. Extrapolating optimistically from them, in 5 years we will have
quantum chemistry simulators. Extrapolating pessimistically, it will take 20
or so years.

~~~
michael_nielsen
Not quite true. I can think of two well funded people in 1995-2000 who either
believed (or implied to funders) that scalable quantum computing was plausible
within 10 years. But those people were outliers. You're correct that it was an
uncommon view, and mainstream consensus in the field was that it was many
decades away. The situation today is certainly very different, and people seem
much more optimistic.

------
vadansky
The discussion reignited my desire to learn quantum mechanics on the side.
Usually the Griffiths book is recommended - is this still the best intro to
quantum mechanics?

~~~
jeffwass
I used both Griffiths and Townsend’s books in undergrad QM classes; they
complement each other nicely. It’s been a while, but IIRC Griffiths starts off
with a focus on continuous wave functions while Townsend starts off with spin
and discrete states.

If you’re interested in quantum computing, spin is the most applicable place
to start, as a qubit can be identically represented as a spin-1/2 particle.

I wouldn’t recommend Sakurai (which someone else mentioned) if you’re just
starting. It’s what I used in grad school, and doable depending on your math
background and motivation, but not the easiest intro.

------
workthrowaway27
I wonder why we should expect practical quantum computing to be here any
sooner than practical nuclear fusion?

------
tomerbd
what will happen to RSA encryption once they are here?

~~~
pjmlp
Gone, but there are quantum-safe algorithms; we just need to migrate to them
as an industry.

It's called post-quantum cryptography.

[https://en.wikipedia.org/wiki/Post-quantum_cryptography](https://en.wikipedia.org/wiki/Post-quantum_cryptography)

------
adamnemecek
Qubits are a bad idea. You want continuous variable quantum computing.
[https://en.m.wikipedia.org/wiki/Continuous-variable_quantum_...](https://en.m.wikipedia.org/wiki/Continuous-variable_quantum_information)

~~~
Florin_Andrei
Those seem more similar to old school analog computers, in a sense.

~~~
krastanov
The interesting thing is that a _perfect_ analog classical computer is more
powerful than the qubit quantum computer model (which is what people mean by
quantum computing). But analog computers (whether classical or quantum) do not
permit error correction, and as such are useless for anything but the smallest
problems. You need bits and qubits in the real world.

