
The quantum computing era is coming… fast - jonbaer
http://www.theguardian.com/commentisfree/2015/dec/13/the-quantum-computing-era-is-coming-qubits-processors-d-wave-google
======
hannob
This article is pretty inaccurate in a number of ways.

The whole D-Wave thing is mostly irrelevant for cryptography, which the
article doesn't mention. Even if the D-Wave devices turn out to be useful for
some special algorithms, they can't run Shor's algorithm, which is what
endangers crypto.
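
To make that concrete: Shor's algorithm factors an RSA modulus N by finding the period of a^x mod N, which a quantum computer can do exponentially faster than any known classical method. A toy sketch (mine; the quantum period-finding step is replaced by a classical brute-force loop, so it only works for tiny N):

    # Why period finding breaks RSA: a nontrivial factor of N falls out
    # of the period r of a^x mod N. The quantum speedup is in find_period;
    # here it's brute-forced classically, feasible only for tiny N.
    from math import gcd

    def find_period(a, n):
        # smallest r > 0 with a^r = 1 (mod n) -- the step a QC does fast
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_classical(n, a=2):
        if gcd(a, n) != 1:
            return gcd(a, n)      # lucky guess: a already shares a factor
        r = find_period(a, n)
        if r % 2 == 1:
            return None           # odd period: retry with another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            return None           # trivial square root: retry
        return gcd(y - 1, n)      # nontrivial factor of n

    print(shor_classical(15))     # -> 3 (and 15 // 3 = 5)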

Also, calling 1024 bits a "really long key" is a bit strange. Such keys are
already endangered by classical computers; no need for quantum computing there.

That said: the call for post-quantum crypto is right.

~~~
imaginenore
1024 bit keys are endangered? Only in some broken encryption schemes. You
can't even break 256 bits, as the number of possibilities exceeds the number
of atoms in the universe.

~~~
gherkin0
You're comparing keys from two different cryptosystems, which is a mistake. An
RSA key ("1024 bit keys") needs to be longer than an AES key ("256 bits") to
provide a given amount of security, since an RSA key must satisfy extra
mathematical conditions to get its favorable properties, and that structure
gives an attacker something to exploit.

In 1999 you could break a 512-bit RSA key with a supercomputer; in 2015 you
can do it in 4 hours for $75 on EC2:
[http://arstechnica.com/security/2015/10/breaking-512-bit-rsa...](http://arstechnica.com/security/2015/10/breaking-512-bit-rsa-with-amazon-ec2-is-a-cinch-so-why-all-the-weak-keys/)

A 768-bit RSA key has been factored in a large academic effort:
[https://eprint.iacr.org/2015/1000.pdf](https://eprint.iacr.org/2015/1000.pdf)

When you pick a key length, you want it to be long enough to provide security
for your communications as far into the future as practicable, and the
classical efforts to break RSA are getting uncomfortably close to 1024 bits.
In a few decades it may well be practical for some adversaries to break even
perfectly generated 1024-bit keys.
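
For a rough feel for the numbers, here's a back-of-the-envelope sketch (my own, not from any of the links): the best classical factoring algorithm, the general number field sieve, has heuristic cost around exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)), and plugging in modulus sizes shows how much closer 1024-bit RSA is to feasible than a 256-bit symmetric key:

    # Back-of-the-envelope: heuristic GNFS work factor for n-bit RSA moduli,
    # ignoring the o(1) term, so treat these as orders of magnitude only.
    from math import exp, log, log2

    def gnfs_bits(modulus_bits):
        # log2 of exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3))
        ln_n = modulus_bits * log(2)
        return log2(exp((64 / 9) ** (1 / 3)
                        * ln_n ** (1 / 3)
                        * log(ln_n) ** (2 / 3)))

    for bits in (512, 768, 1024, 2048):
        print(f"RSA-{bits}: ~2^{gnfs_bits(bits):.0f} operations")
    # roughly 2^64, 2^77, 2^87, 2^117 -- versus 2^256 to brute-force a
    # 256-bit symmetric key. 2^87-ish is uncomfortable; 2^256 is not.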

However, I'm not a cryptographer, so someone please correct me if I'm wrong.

~~~
baldfat
> 1024 bit keys are endangered? Only in some broken encryption schemes.

He didn't compare anything.

~~~
gherkin0
> He didn't compare anything.

No, he did:

>>> 1024 bit keys are endangered? .... You can't even break 256 bits

1024 bit key = most likely something like RSA

256 bit key = some symmetric algorithm or ECC

He's implying a 1024 bit RSA key should be safe because a 256 bit key from
some other algorithm is.

~~~
baldfat
> Only in some broken encryption schemes

RSA 1024 is a broken encryption scheme (2010). In January 2010 there was
already concern that it was broken, or soon would be:
[http://arstechnica.com/security/2010/01/768-bit-rsa-cracked-...](http://arstechnica.com/security/2010/01/768-bit-rsa-cracked-1024-bit-safe-for-now/)

And in March 2010: [http://www.techworld.com/news/security/rsa-1024-bit-private-...](http://www.techworld.com/news/security/rsa-1024-bit-private-key-encryption-cracked-3214360/)

~~~
gherkin0
I mentioned your first link, the factoring of a 768-bit RSA key, in my first
comment. RSA isn't a "broken encryption scheme": it's a perfectly fine one.
It's just that you need to use key lengths long enough to defend against the
computing power available to current and expected adversaries. That paper is a
demonstration of modern computing power, and it simply shows that you should
use longer keys.

I took his statement of "Only in some broken encryption schemes" to be a
misunderstood reference to stuff like this:
[http://arstechnica.com/security/2015/10/how-the-nsa-can-brea...](http://arstechnica.com/security/2015/10/how-the-nsa-can-break-trillions-of-encrypted-web-and-vpn-connections/), or to a side-channel attack.

Your second link has a garbage, sensationalized headline. It actually describes
a side-channel attack that has nothing to do with the cryptographic algorithm
itself. It's basically equivalent to a clever way of looking over someone's
shoulder.

------
lordnacho
What should a dev do to stay current, given this is happening? Say you have an
interest in algorithms in general, what do you need to read / what can you
read that connects to existing knowledge?

~~~
semi-extrinsic
TL;DR: you will likely be retired before QC becomes useful, and even if you
aren't, the "API" will have changed so much between now and then that you
shouldn't worry about it.

Think of this as an algorithm on specific hardware for solving a specific
problem. You can encode other problems into that specific problem, at a cost.
Currently this algorithm on specific hardware is not as fast as
state-of-the-art algorithms on ordinary hardware (namely Selby's). But the
people making this special hardware claim (without proof) that when the
special hardware (and the problem size) is scaled up, the special hardware
will be much faster than ordinary hardware. So far no-one has disproved this
assertion, but no-one has proven or demonstrated it either.
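
As a toy example of that encoding step (my own illustration): the specific problem is minimizing an Ising-style energy over spins s_i in {-1,+1}, and e.g. number partitioning maps onto it by minimizing (sum_i s_i * a_i)^2:

    # Encoding "split these numbers into two equal-sum halves" as an
    # Ising-style energy: choose signs s_i in {-1,+1} minimizing
    # (sum_i s_i * a_i)^2. Brute force here; an annealer would search
    # the same energy landscape.
    from itertools import product

    a = [4, 5, 6, 7, 8]
    best = min(product((-1, 1), repeat=len(a)),
               key=lambda s: sum(si * ai for si, ai in zip(s, a)) ** 2)
    print([x for x, s in zip(a, best) if s > 0],
          [x for x, s in zip(a, best) if s < 0])  # the two equal-sum halves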

Meanwhile, many other scientists and some companies, who refer to this type of
special hardware as "dirty QC", are working on another class of special
hardware they call "clean QC". This is mathematically proven to be much faster
than ordinary hardware when it is scaled up, but it's very very very difficult
to scale it up. But they're working on it and showing progress.

Right now, no-one knows if the "dirty" or the "clean" special hardware will be
the first to be actually useful in the real world. It's likely that the first
useful real-world application is more than ten years away. We don't know
exactly what the application will be, or how the special hardware will be
programmed by then. It's also quite likely that most of the people commenting
here will be retired before anyone but scientists gets access to the special
hardware.

~~~
gaze
What do you mean dirty vs clean QC? Do you mean adiabatic vs gate based?

~~~
jessriedel
Yea, that must be what he's referring to. I know a lot of folks who work on
this stuff and they never use "clean"/"dirty" terminology (although it's
reasonably apt).

~~~
semi-extrinsic
Yep. Picked up the clean/dirty terminology from Scott Aaronson:
[http://www.scottaaronson.com/blog/?p=2555#comment-963747](http://www.scottaaronson.com/blog/?p=2555#comment-963747)

~~~
jessriedel
FYI, Scott was using the quotes to try and indicate that he was making those
terms up on the spot (so it's not accurate to say that "many other scientists
and some companies" use those terms). But I can see how it might not be clear
from reading his comment.

~~~
semi-extrinsic
Thanks for pointing this out. (As you might have inferred, my original post
tried to keep the language simple.)

------
fitzwatermellow
An impending arms race between quants on Wall Street could also provide a
colossal impetus for driving QC innovation:

Quantum Computers Entice Wall Street Vowing Higher Returns:

[http://www.bloomberg.com/news/articles/2015-12-09/quantum-su...](http://www.bloomberg.com/news/articles/2015-12-09/quantum-supercomputers-entice-wall-street-vowing-higher-returns)

------
glxc
According to John Martinis, the quantum computing era is coming, but it is a
slow and gradual process, like most research.

------
55555
Where are the discussions on this topic?

[https://hn.algolia.com/?query=quantum%20google&sort=byPopula...](https://hn.algolia.com/?query=quantum%20google&sort=byPopularity&prefix=false&page=0&dateRange=pastMonth&type=story)

I would expect, of all places, people on HN to be smart enough to help me
understand these news stories. I have no idea what quantum blahblah really
means and want to know if this is a real breakthrough or just a news story.

~~~
fmstephe
My understanding is that it is both.

This really is a real breakthrough in quantum computing, specifically for an
approach called 'quantum annealing', which is one of many possible approaches
to quantum computing (there is at least one other).

This result is a breakthrough in two ways:

1: It demonstrates real quantum effects in the D-Wave computer. It had been
hotly debated, by experts in quantum computing, whether the D-Wave actually
used quantum effects in its computations.

2: It shows a real speedup for a very specific instance of a very specific
algorithm. This is great news, and a very good result, for the people working
on the D-Wave.

This is just a news story because:

1: The D-Wave is much faster (100 million times!) than the same algorithm
running on a conventional computer. But there are algorithms you can run on a
conventional computer which are equivalent (if someone could clarify how
equivalent they are, that would be great) and which are as fast as the D-Wave.
So they chose an algorithm which is good for the D-Wave and terrible on a
classical computer.

2: The problem instance they chose is very artificial, and it isn't clear that
the speedup wouldn't disappear if they tried to run actual real-world
instances of the problem.

I would conclude that this is a great result. It increases the understanding
of quantum computing. That is very exciting. The D-Wave doesn't appear to be
practically useful. Yet.

(I am not even the beginning of an expert in this; everything I wrote above
comes from the link below.)

[http://www.scottaaronson.com/blog/?p=2555](http://www.scottaaronson.com/blog/?p=2555)

~~~
thomasahle
My understanding: the D-Wave machine isn't a general quantum computer, but
hardware supporting one specific algorithm: quantum annealing. Quantum
annealing is similar to simulated annealing, and if you compare the D-Wave
machine to simulated annealing implemented on normal, non-specialized
hardware, the D-Wave is 100,000,000 times faster.
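
For reference, here is roughly what simulated annealing itself looks like - a toy sketch of my own on a random Ising-style energy, not D-Wave's actual benchmark problem:

    # Minimal simulated annealing on a toy Ising-style energy: propose
    # single spin flips, always accept improvements, accept worse moves
    # with probability exp(-delta/T), and cool the temperature T down.
    import math, random

    n = 20
    J = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
    spins = [random.choice((-1, 1)) for _ in range(n)]

    def energy(s):
        return sum(J[i][j] * s[i] * s[j]
                   for i in range(n) for j in range(i + 1, n))

    e, T = energy(spins), 2.0
    for _ in range(10000):
        i = random.randrange(n)
        spins[i] = -spins[i]              # propose a flip
        e_new = energy(spins)
        if e_new < e or random.random() < math.exp((e - e_new) / T):
            e = e_new                     # accept
        else:
            spins[i] = -spins[i]          # reject: undo the flip
        T *= 0.9995                       # cool down

    print("final energy:", e)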

In principle, even if the hardware only supports one algorithm, this could
tell us something about whether quantum computers are really better than
normal computers. Unfortunately, the D-Wave machine/algorithm is still not
much better than simply simulating a quantum machine on normal hardware
(quantum Monte Carlo), and it is worse than the conventional algorithm by
Alex Selby.

It's interesting, but it still doesn't shed much light on the important
questions.

~~~
InvisibleCities
One more thing that you didn't mention, which I think is important: if you run
simulated annealing on specialized hardware and compare it to the performance
of D-Wave's machine, there is virtually no speedup for D-Wave's machine.

------
peter303
People are not convinced that D-Wave is really quantum yet. I've seen D-Wave's
presentations at the annual Supercomputing conference and I am not convinced
yet.

------
sgt101
We have quantum-safe encryption now, just not quantum-safe PKI, although that
is possible. We also have quantum key distribution, which could be a route to
patching things up if PKI is beaten by quantum computing.

~~~
hannob
QKD solves no problem that quantum computers create. At best you could replace
a symmetric cipher with it (at huge cost, and very impractically), but we
don't have a problem with symmetric ciphers.

~~~
sgt101
We have a problem distributing the keys for symmetric ciphers, which is why
QKD would be useful if we can't do PKI. I'm interested in the "huge costs and
very impractical" part of your comment - why do you think that?

~~~
hannob
You can only use QKD if you already have a shared key (it's needed to
authenticate the channel). If you already have that, you can also use it to
distribute your keys.

Impractical and expensive because:

* You need a glass fiber line. That means no copper, no wifi, no crypto on mobile phones.

* You are limited to some tens to hundreds of kilometers. No transatlantic QKD; it doesn't work.

* Sending and detecting single photons is a complex physical task requiring high accuracy - not cheap.
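
For anyone wondering what QKD actually does, here is a toy classical simulation of the textbook BB84 protocol (my own illustration; it ignores noise, eavesdroppers, and the authentication step that needs the pre-shared key):

    # Toy BB84 sketch: Alice sends photons polarized in random bases, Bob
    # measures in random bases, and they keep the positions where their
    # bases happened to match. No eavesdropper or noise modeled.
    import random

    n = 32
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]
    bob_bases   = [random.choice("+x") for _ in range(n)]

    # wrong basis -> measurement result is random
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # they publicly compare bases (never bits) and sift the key
    key = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
           if ab == bb]
    print("shared key bits:", key)        # ~n/2 bits on average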

------
akerro
> The quantum computing era is coming… fast

No, it's not coming. It will stay where it is now - with the huge corporations
that rule the world, and with governments. This will just build a huge
precipice between small companies and people, who will run AMD64 for the next
20 years, and corporations that in 20 years will already have something to
replace quantum computers. We will all be ruled by a few computers. Google is
technologically running away from its competitors. Google will gather more
data, process it faster and more accurately, and so gather still more data...
The poor will become poorer and the rich will become richer. It's not coming
fast, not in our direction.

~~~
eveningcoffee
We are not limited to running AMD64 cores. We can also use GPU cores, and when
some problems are really important to solve, we can run them on an FPGA or
create an ASIC for them.

We can also expect that there will be computing centers whose resources mere
mortals can access as a service, just as they can access the resources of
computer clusters today.

Very likely universities will have their own quantum computers. Bigger ones
first, then smaller ones.

I think the situation is not that depressing, but it will definitely create
some disparity in the beginning.

~~~
akerro
In 20 years we will have AMD128 and some ASICs for personal use, while Google
will have something 100M times faster than a quantum computer - and since the
quantum computer is itself 100M times faster, that's 100M * 100M faster
overall. It will contain more computing power than the rest of the world
combined. Every new generation of computers will be produced in less time than
the last one, and will be more powerful than the last one. The cost of
producing the next generation of computers will be lower than the cost of
producing ASICs for us. Aren't bitcoin miners the most popular ASIC?

>Very likely universities

I don't mean that no one else will have or rent any - sure they will, there
are a lot of projects that will need it - but we, the people, won't have any.
I think I presented my opinion too negatively :<

~~~
eveningcoffee
_I don't mean that no one else will have or rent any - sure they will, there
are a lot of projects that will need it - but we, the people, won't have any._

If we look at the history of computing, then yes, one could be so pessimistic,
but would it really be true?

If there is a breakthrough in QC, then there will be a sudden economic
motivation for people (companies) to use quantum computers.

This will generate the need for people who can work with such computers,
which will in turn give universities the motivation to train such people and
the need to access quantum computers.

I think this will happen much, much faster than it did with classical
computing.

Of course, I also believe there will be no personal QC any time soon, if that
is what you had in mind.

Edit: I also do not understand why you are being downvoted. I think it is an
important perspective.

~~~
TheOtherHobbes
Not a downvoter, but I think there's a practical limit to how useful the data
collected by Google is - and the company is probably close to that limit
already.

I'm not sure QC can change that.

The only way to get more value would be total 24/7 Orwellian surveillance, and
I don't think that's going to be a popular option.

QC for crypto is a no-brainer. QC for anything else, including data mining/ML,
is a much fuzzier prospect. I'm not sure anyone really understands what the
practical applications could be, never mind how to use QC to make them
possible.

Any suggestion that you can take a warehouse full of web logs and tracking
stats, give it a quantum shake, and have a few million pre-qualified addicted
customers fall out is likely nonsense.

~~~
eveningcoffee
_QC for crypto is a no-brainer. QC for anything else, including data mining/ML_

Most problems in ML can be posed as optimization tasks, and if QC works out,
ML would likely be its main application.

It would likely make a huge difference in solving classification problems, as
you could train your ML model with much more data, much faster. It would also
make clustering much faster, since that too can be expressed as an
optimization problem.

There are also many other optimization problems that would benefit a lot from
the speedup (regardless of whether it is a huge constant speedup or an
asymptotic one).
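
To make "ML as optimization" concrete, here is a toy sketch of my own (nothing QC-specific): training even a simple classifier is literally just minimizing a loss function, and any optimizer - gradient descent here, an annealer in principle - attacks the same objective:

    # Classification as optimization: fit logistic-regression weights by
    # minimizing log loss with plain gradient descent. An annealer would
    # just be a different way to search the same objective.
    import math, random

    # toy 1-D data: the true label is 1 exactly when x > 1
    data = [(x, 1 if x > 1 else 0)
            for x in (random.uniform(-3, 3) for _ in range(200))]

    w, b, lr = 0.0, 0.0, 0.1
    for _ in range(2000):
        gw = gb = 0.0
        for x, y in data:
            p = 1 / (1 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x                     # d(log loss)/dw
            gb += (p - y)                         # d(log loss)/db
        w -= lr * gw / len(data)
        b -= lr * gb / len(data)

    print(f"decision boundary near x = {-b / w:.2f}")  # should be ~1.0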

