
A New Law to Describe Quantum Computing’s Rise? - gballan
https://www.quantamagazine.org/does-nevens-law-describe-quantum-computings-rise-20190618/
======
cohomologo
With double exponential growth, “it looks like nothing is happening, nothing
is happening, and then whoops, suddenly you’re in a different world,” Neven
said.

This is just a clever way to spin the fact that we are currently experiencing
growth much slower than exponential into a prediction of much faster future
growth, without any evidence. Or perhaps it's an in-joke the physicists would
make. Next time I have a really flat function, I'm going to fit it with a
triple exponential, like so:

    
    
      https://www.wolframalpha.com/input/?i=plot+exp(exp(exp(x))),+x%3D-20..-5,+y%3D0..5
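For anyone who doesn't want to follow the link, the flat-then-vertical shape of a triple exponential is easy to reproduce numerically. This is a minimal sketch (plain Python, no assumptions beyond the function itself): for very negative x it hugs the constant e, which is why the plot in that y-range looks like "nothing is happening".

```python
import math

# exp(exp(exp(x))) is indistinguishable from the constant e for x << 0,
# then blows past any fixed bound almost immediately after x ~ 0.
def triple_exp(x):
    return math.exp(math.exp(math.exp(x)))

for x in [-20, -10, -3, 0, 1]:
    print(x, triple_exp(x))
# x = -20 gives ~2.718 (essentially e); x = 1 is already in the millions.
```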

~~~
glangdale
From the article: "In December 2018, scientists at Google AI ran a calculation
on Google’s best quantum processor. They were able to reproduce the
computation using a regular laptop. Then in January, they ran the same test on
an improved version of the quantum chip. This time they had to use a powerful
desktop computer to simulate the result. By February, there were no longer any
classical computers in the building that could simulate their quantum
counterparts. The researchers had to request time on Google’s enormous server
network to do that."

Now, yes, this article sounds bullshitty in the extreme - it's worrying the
extent to which it manages to not quite get around to any technical detail,
except for helpfully explaining what exponential and double-exponential mean.
Number of qubits? Which algorithm? Who knows? Is it even an algorithm that
anyone cares about ("ran a calculation")?

Also, from long experience, I'm used to seeing the person with the "next big
thing" (GPGPU, FPGA, custom ASIC, etc) always managing to creatively find
mediocre algorithms on the CPU to compare it to.

However - even given all that skepticism - it's hard to dismiss the growth
that they are rhapsodizing about as 'flat' (assuming it's real). I suppose
we'll see.

------
todd8
Two interacting factors are claimed to be responsible for Neven’s Law. The
first, according to the article, is the exponential advantage qubits have over
regular bits. (But isn’t this the case only for some algorithms, i.e. Shor’s?)

The second exponential factor is stated to be the “rapid improvement of
quantum processors”. Really? Is the number of qubits in true quantum computers
actually growing exponentially—-not steadily, or, say, logarithmically?
Moore’s Law was a law because it was so surprising that the semiconductor
industry kept it going for so long. Is that really happening with quantum
computers? I didn’t realize that breakthroughs in quantum computing were
happening like that.
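The compounding of those two claimed factors can be sketched in a few lines. The starting qubit count and the growth rate below are entirely made-up illustration values, not real hardware data; the point is only that an exponential cost (2^n to simulate n qubits classically) applied to an exponentially growing n yields a double exponential.

```python
# Toy model of the two claimed factors behind "Neven's law".
# n0 and growth are hypothetical numbers chosen for illustration only.
def qubits_at(t, n0=4, growth=1.5):
    """Qubit count at generation t, assuming exponential hardware growth."""
    return int(n0 * growth**t)

def classical_sim_cost(t):
    """Classical statevector simulation cost ~2**n: double exponential in t."""
    return 2 ** qubits_at(t)

for t in range(5):
    print(t, qubits_at(t), classical_sim_cost(t))
```

If either factor fails to hold (e.g. qubit counts grow only linearly), the double exponential collapses to a single one, which is the commenter's question.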

------
b_tterc_p
Neven’s law seems to describe his hubris as well. I wonder how many on his own
team are comfortable with that expression.

~~~
smitty1e
Too busy carving chips out the unobtainium wafers to worry about it, one
supposes.

------
jonas_kgomo
I always thought that the "ultimate laptop", in other words nature, has a
fundamental limit of computation due to entropy, as Seth Lloyd described. So
while Moore's Law isn't dead yet, coining terms like Neven's Law will create a
stream of self-ordained laws, and this will not help at all. I think it is
more important to be contrarian and be against quantum computing, like Gil
Kalai, than to push a Moore's Law narrative onto mesoscopic materials. I would
push back against any effort to create a theory of exponentiation. This seems
like such a weak paradigm, since we haven't yet even figured out what a
classical quantum computer would do. In a sense, quantum computing needs an
upper bound due to its computational complexity rather than anticipated
supremacy.
[https://www.quantamagazine.org/gil-kalais-argument-against-quantum-computers-20180207/](https://www.quantamagazine.org/gil-kalais-argument-against-quantum-computers-20180207/)

~~~
vtomole
> In a sense, quantum computing needs an upper bound due to its computational
> complexity.

You mean this?
[https://en.wikipedia.org/wiki/AWPP_(complexity)](https://en.wikipedia.org/wiki/AWPP_\(complexity\))

------
craftinator
Quantum computing is rising? I was under the impression that it has been
stagnating for like half a decade now. Can a QC finally factor 15? Like
factor(15)=3*5?
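For context on why factoring 15 is the canonical benchmark: Shor's algorithm only needs the quantum computer for order-finding; everything else is classical number theory. Here is a sketch with the quantum step replaced by brute force (the choice a=2 is just a convenient base coprime to 15):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n -- the step Shor's algorithm
    delegates to the quantum computer; brute-forced here."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a=2):
    """Classical post-processing of Shor's algorithm for a lucky base a."""
    r = order(a, n)            # for a=2, n=15: r = 4 (even, as required)
    return gcd(a**(r // 2) - 1, n), gcd(a**(r // 2) + 1, n)

print(shor_classical(15))  # (3, 5)
```

The hard part for real hardware is running the order-finding step coherently, which is why "can it factor 15" remains a fair taunt.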

~~~
reikonomusha
QC can simulate molecules.

------
marcosdumay
So, is there anyone besides Google with a capable quantum processor? Are they
that ahead of everybody?

~~~
mehrdadn
My understanding has been that neither Google nor anyone else really has a
quantum processor.

~~~
vtomole
Microsoft doesn't have a quantum processor but Google has one.

~~~
mehrdadn
As the saying goes, I'll believe their processor is doing quantum computing
when someone like Scott Aaronson says so.

~~~
vtomole
> I'll believe their processor is doing quantum computing

What do you mean? Quantum processors have been doing quantum computing for a
while now. That is, initializing qubits to a known state, performing quantum
gates on them, then performing measurements on the qubits. The validity of
these experiments has been confirmed by state tomography.
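That initialize → gate → measure cycle is easy to mirror in a toy statevector simulation. A minimal single-qubit sketch (NumPy, standard Hadamard matrix; this illustrates the math, not any vendor's hardware):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                   # qubit initialized to |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
state = H @ ket0                              # apply the gate
probs = np.abs(state) ** 2                    # Born-rule outcome probabilities
print(probs)                                  # ~[0.5, 0.5]
```

State tomography is, roughly, running many such prepare-and-measure experiments in different bases to reconstruct `state` (or its density matrix) from the observed statistics.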

~~~
mehrdadn
I haven't kept up in the past ~3 years, but as of back then (and I have not
heard otherwise since) there were serious doubts about whether anyone had
achieved a quantum speedup.

[https://www.newscientist.com/article/dn28641-experts-doubt-googles-claim-about-its-quantum-computers-speed/](https://www.newscientist.com/article/dn28641-experts-doubt-googles-claim-about-its-quantum-computers-speed/)

~~~
vtomole
Getting a quantum speedup and doing quantum computing are different things.
Also, your link refers to a quantum annealer not a universal quantum
processor.

------
ridewinter
“Neven says that Google’s best quantum chips have recently been improving at
an exponential rate.”

That may be the real news here. Are there any other sources about this?

------
phkahler
So how many qubits are they up to? And how fast is that number increasing?

Comparing to classical like this is kind of dumb, since quantum is known to be
exponentially faster. Or doubly, or whatever. Just stop comparing and tell us
how many qubits.

------
todd8
Double exponential growth is crazy fast, but some problems grow much faster
and follow the _hyperoperation sequence_ growth rate. Knuth wrote a fun paper
in Science in 1976 [1]. (Easy and interesting, but sadly my link is to a
paywall.)

That was the first place I saw his arrow notation to describe some of these
numbers.

    
    
      5↑3 = 5*(5*5)
      5↑↑3 = 5↑(5↑5)
      5↑↑↑3 = 5↑↑(5↑↑5)
      5↑↑↑↑3 = 5↑↑↑(5↑↑↑5)
    

These numbers get BIG; the up-arrow notation can express the Ackermann
function.
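The recursion behind the arrows is short enough to write down directly. A sketch using the standard hyperoperation convention (a↑b is plain exponentiation, and a↑ⁿ0 = 1); only tiny inputs are feasible, since even 5↑↑3 has thousands of digits:

```python
def arrow(a, n, b):
    """Knuth's up-arrow: arrow(a, n, b) computes a ↑^n b."""
    if n == 1:
        return a ** b                       # one arrow is exponentiation
    if b == 0:
        return 1                            # conventional base case
    return arrow(a, n - 1, arrow(a, n, b - 1))

print(arrow(5, 1, 3))   # 5↑3 = 125
print(arrow(2, 2, 3))   # 2↑↑3 = 2^(2^2) = 16
print(arrow(2, 3, 3))   # 2↑↑↑3 = 2↑↑4 = 65536
```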

[1]
[https://science.sciencemag.org/content/194/4271/1235](https://science.sciencemag.org/content/194/4271/1235)

------
hirundo
If the law is true, and quantum computing is the key enabler of AGI, then the
law is either the best or worst news in human history. And we may be living in
the final generations of our species. Not to be dramatic. I'm kinda hoping
that Neven is an inferior prophet to Moore.

~~~
smitty1e
I fail to see why the advent of qubits has to be any more apocalyptic than the
transistor.

"...there shall, in that time, be _rumors_ of things going astray, errrm, and
there shall be a great confusion as to where things really are, and nobody
will really know where lieth those little things wi - with the sort of raffia
work base that has an attachment. At this time, a friend shall lose his
friend's hammer and the young shall not know where lieth the things possessed
by their fathers that their fathers put there only just the night before,
about eight o'clock."
[https://www.imdb.com/title/tt0079470/quotes?item=qt0979932](https://www.imdb.com/title/tt0079470/quotes?item=qt0979932)

~~~
hirundo
If Neven's law is true, then qubit computational power will grow at a doubly
exponential rate rather than merely exponentially, as in Moore's law of bit
pushers. That alone is apocalyptic.

But that's not the qubit's only trick. Qubits have a talent for pattern
recognition that bits can't match, and pattern recognition is the key to
autonomous agents. For good or ill, such agents have the power to leverage
human will enormously. That has a high potential for apocalypse, and doubly
exponential qubits would make it happen much sooner.

------
klyrs
This is the hill that D-Wave died on with "Rose's Law." Make 3 small chips,
make sweeping predictions on a log plot, call it a law. Yawn.

My biggest disappointment is in Scott Aaronson. He should know the difference
between making something useful and making something hard to simulate.

~~~
vtomole
> He should know the difference between making something useful and making
> something hard to simulate.

Please provide evidence of this claim.

~~~
klyrs
I said should. He's branded himself as a critic of quantum computing... this
take of his gave me whiplash. Is he... gushing?

