
We're Entering a Golden Era of Quantum Computing Research - jonbaer
http://asmarterplanet.com/blog/2015/04/golden-era-quantum-computing.html
======
pjungwir
As an amateur trying to read up on quantum computing, I can say these papers
are very accessible. The first is the talk, referred to in the article, that
launched the field, and the second starts with a great literature review:

Feynman, Richard. "Simulating Physics with Computers," International Journal
of Theoretical Physics, vol. 21, Nos. 6/7, 1982.

Shor, Peter W. "Polynomial-Time Algorithms for Prime Factorization and
Discrete Logarithms on a Quantum Computer." 1994.

I'm also nearly done with Scott Aaronson's Quantum Computing Since Democritus,
which is a popularizing treatment derived from graduate-level lectures. It is
demanding but super fun.

So one risk to quantum computing I wonder about is error correction. There
are lots of proposed classical machines that purport to give the same
exponential-to-polynomial speedup, but on closer inspection they just "hide"
the exponential requirement, e.g. in the need for exponentially increasing
instrument accuracy. Are there any experts here who can give a reason why
error correction in quantum computing will scale polynomially or better? It
seems like if we are hiding the exponential somewhere in quantum computers,
that's where it will be.

EDIT: I think there are _practical_ risks also, e.g. superconductors that only
operate at near-absolute-zero, but I'm asking about a theoretical risk that
just kills the idea.

~~~
xnull6guest
> Are there any experts here who can give a reason why error correction in
> quantum computing will scale polynomially or better?

Because it takes a finite number of Toffoli/Hadamard/whatever gates to
error-correct local errors.

That is to say, it's like check (or ECC) bits in traditional RAM: a couple of
extra transistors per memory location. Error-correcting codes in general tend
to be very lightweight and can be made to work locally, and the quantum world
is the same.

------
kolbe
I've had a difficult time getting a real answer on this, and the HN crowd
seems pretty well-educated, so I'll pose the question here: Do we even have a
high degree of certainty that quantum computing is possible? I thought that
Bohmian Mechanics was still on the table as a possible description of the
physical world.

~~~
whitewhim
Bohmian mechanics is a non-local hidden-variable theory of Quantum Mechanics.
Any result from traditional QM should hold with Bohmian Mechanics, including
the ability to perform quantum computations. We have a reasonable certainty
that QM is possible thanks to a Threshold Theorem
(http://arxiv.org/abs/quant-ph/9702029). Additionally, it is quite easy to run
algorithms with a small number of qubits (5-10) that show the theory holds and
should continue to hold as more qubits are added. It has become a scaling
problem, which hopefully can be solved with time.
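
For reference, here is a rough paraphrase of what the Threshold Theorem in
that paper guarantees (my summary, so take the exact form with a grain of
salt): if the physical error rate per gate is below a constant threshold, the
overhead of fault tolerance grows only polylogarithmically, not exponentially.

    \[
      p < p_{\mathrm{th}}
      \;\Longrightarrow\;
      N_{\mathrm{faulty}} = O\!\bigl(N \cdot \mathrm{polylog}(N/\varepsilon)\bigr)
    \]

where N is the number of gates in the ideal circuit, \varepsilon is the target
overall error, and N_faulty is the number of noisy physical gates needed to
simulate it fault-tolerantly. This is the standard answer to the question
upthread about whether the exponential is hidden in the error correction.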

~~~
whitewhim
"We have a reasonable certainty that QM is possible", sorry I meant to say
QC(Quantum Computing)

------
rdtsc
I was interested in it in college, probably 10 years ago. Took courses,
learned about quantum circuits, bras and kets, Bloch spheres (even wrote a 3D
Java applet to represent one). It seemed like a revolution based on Quantum
Computing was coming. So compared to the excitement and the hype, it hasn't
quite happened.

~~~
_sword
Physics models of quantum phenomena are fantastic and interesting, but there
can be a bit of lag between the theoretical developments underpinning the
technology and the materials science and general technological advances that
make such theoretical developments real.

~~~
trhway
anyway, quantum computing seems to be much closer than fusion, like few years
away vs. 30. (personally though i'm betting on fusion as we have experimental
verification of fusion where is quantum superposition is supposedly confirmed
by Bell inequalities experiments and, not having the equipment myself, looking
into papers on the experiments i see pretty big holes in them and thus
"classical" interpretations of the results)

------
xnull6guest
The thresholds needed to achieve Quantum Error Correction are extremely close
to being reached nowadays (props in particular to Martinis's group - and
Google for hiring him). It appears that D-Wave is doing _something_ Quantum -
though nobody knows what. Furthermore, DARPA and the Obama administration's
initiatives to get Probabilistic Programming adopted by the industry will
create both software that can run natively on Quantum Computers and
programmers who are able to think in the terms necessary for writing Quantum
Software. Lockheed Martin-funded research has discovered software algorithms
for Quantum Computers that could (not without technical challenges) solve
systems of equations that produce stealth profiles for fighter jets that are
an order of magnitude more effective.
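
(My reading, not stated explicitly above: the "systems of equations" work
presumably refers to an HHL-style quantum linear-systems algorithm, whose
advertised scaling for an N-by-N sparse system is roughly

    \[
      T_{\mathrm{quantum}} = \mathrm{poly}\!\left(\log N,\; s,\; \kappa,\; 1/\varepsilon\right)
      \qquad\text{vs.}\qquad
      T_{\mathrm{classical}} \gtrsim N
    \]

with sparsity s, condition number \kappa, and precision \varepsilon, subject
to well-known caveats about state preparation and readout.)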

The trajectory is looking pretty good.

~~~
bkcooper
_The thresholds needed to achieve Quantum Error Correction are extremely close
to being achieved nowadays_

Yes, they are quite close to surface code thresholds. However, being close to
the thresholds means that you need to be on the pessimistic side of how many
physical qubits you need per logical qubit. As I mentioned in my other
comment, even Martinis (whose group is, as you say, very good) is still
several orders of magnitude short in terms of the number of physical qubits
needed to implement one logical qubit, let alone the hundreds to thousands of
logical qubits necessary to do interesting computations.

Packing more qubits in seems to me like it's going to be a very challenging
problem. Looking at Martinis's recent Nature paper, 9 qubits are taking up
something like 2 x 4 mm. Making these smaller would be nontrivial for many
reasons: they each have their own microwave coupling line (which requires a
certain amount of length); the coherence properties of superconducting qubits
seem to care about how much surface you have relative to bulk metal, which is
a loser for shrinking the devices; presumably jamming them together presents
crosstalk issues; etc. Realistically, you also need to add a bunch of other
types of electronics down there to handle multiplexing a la D-Wave. I don't
know that these obstacles are insuperable, but I'm definitely a little
skeptical that this road leads to useful technology.
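
To put rough numbers on that "orders of magnitude" point, here's a
back-of-envelope sketch using the commonly quoted surface-code scaling
P_L ~ (p/p_th)^((d+1)/2) and roughly 2*d^2 physical qubits per logical qubit;
the inputs are illustrative assumptions, not figures from the Nature paper.

    # Illustrative estimate of surface-code overhead (assumed scaling laws,
    # not numbers from Martinis's paper): how large a code distance d, and
    # how many physical qubits, one logical qubit might need.

    def distance_needed(p_ratio, target_logical_error):
        """Smallest odd distance d with p_ratio ** ((d + 1) / 2) <= target."""
        d = 3
        while p_ratio ** ((d + 1) / 2) > target_logical_error:
            d += 2
        return d

    p_ratio = 0.5      # assume the physical error rate sits at half the threshold
    target = 1e-12     # logical error rate wanted for a long computation
    d = distance_needed(p_ratio, target)
    print(d, "distance ->", 2 * d * d, "physical qubits per logical qubit")
    # prints roughly: 79 distance -> 12482 physical qubits per logical qubit

So even comfortably below threshold, one logical qubit plausibly costs
thousands of physical qubits, which is the gap being described above.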

------
MrBra
Will OllyDbg still work on those CPUs? ;)

