A New Law to Describe Quantum Computing’s Rise? (quantamagazine.org)
39 points by gballan 26 days ago | 32 comments

With double exponential growth, “it looks like nothing is happening, nothing is happening, and then whoops, suddenly you’re in a different world,” Neven said.

This is just a clever way to spin the fact that we are experiencing growth much slower than exponential now into a prediction of much faster future growth, without any evidence. Or perhaps it's an in-joke the physicists would make. Next time I have a really flat function, I'm going to fit it with a triple exponential.
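To be fair, the "looks flat, then whoops" shape is real. A quick plain-Python sketch (my own numbers, not the article's) comparing ordinary and double exponential growth:

```python
# 2^(2^n) hugs the floor for small n, then leaves 2^n in the dust:
# "nothing is happening, nothing is happening, and then whoops."
for n in range(1, 7):
    single = 2 ** n          # ordinary exponential (Moore's-law shape)
    double = 2 ** (2 ** n)   # double exponential (Neven's-law shape)
    print(f"n={n}: 2^n={single}, 2^(2^n)={double}")
```

At n=4 the double exponential is 65536 versus 16; two steps later it has 20 digits. Of course, this says nothing about whether real hardware actually follows such a curve.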


From the article: "In December 2018, scientists at Google AI ran a calculation on Google’s best quantum processor. They were able to reproduce the computation using a regular laptop. Then in January, they ran the same test on an improved version of the quantum chip. This time they had to use a powerful desktop computer to simulate the result. By February, there were no longer any classical computers in the building that could simulate their quantum counterparts. The researchers had to request time on Google’s enormous server network to do that."

Now, yes, this article sounds bullshitty in the extreme - it's worrying the extent to which it manages to not quite get around to any technical detail, except for helpfully explaining what exponential and double-exponential mean. Number of qubits? Which algorithm? Who knows? Is it even an algorithm that anyone cares about ("ran a calculation")?

Also, from long experience, I'm used to seeing the person with the "next big thing" (GPGPU, FPGA, custom ASIC, etc) always managing to creatively find mediocre algorithms on the CPU to compare it to.

However - even given all that skepticism - it's hard to dismiss the growth that they are rhapsodizing about as 'flat' (assuming it's real). I suppose we'll see.

Two interacting factors are claimed to be responsible for Neven’s Law. The first is the exponential advantage q-bits have over regular bits, according to the article. (But isn’t this the case only for some algorithms, i.e. Shor’s?)

The next exponential factor is stated to be the “rapid improvement of quantum processors”. Really? Is the number of q-bits in true quantum computers growing exponentially, and not just steadily, or even merely logarithmically? Moore’s Law was a law because it was so surprising that the semiconductor industry kept it going for so long. Is that really happening with quantum computers? I didn’t realize that breakthroughs in Q computing were happening like that.

Neven’s law seems to describe his hubris as well. I wonder how many on his own team are comfortable with that expression.

Too busy carving chips out the unobtainium wafers to worry about it, one supposes.

I always thought that the "ultimate laptop", in other words nature, has a fundamental limit of computation due to entropy, as Seth Lloyd described. So while Moore's Law isn't dead yet, coining terms like Neven's Law will just create a stream of self-ordained laws, and that will not help at all. I think it is more important to be contrarian and argue against quantum computing, as Gil Kalai does, than to push a Moore's Law narrative onto mesoscopic materials. I would resist any effort to create a theory of exponentiation. This seems like such a weak paradigm, since we haven't yet even figured out what a practical quantum computer would do. In a sense, quantum computing needs an upper bound due to its computational complexity rather than anticipated supremacy. https://www.quantamagazine.org/gil-kalais-argument-against-q...

> In a sense, quantum computing needs an upper bound due to its computational complexity.

You mean this? https://en.wikipedia.org/wiki/AWPP_(complexity)

IIAOPSW's law states that the number of self-proclaimed laws grows exponentially.

Quantum computing is rising? I was under the impression that it has been stagnating for like half a decade now. Can a QC finally factor 15? Like factor(15)=3*5?

> Can a QC finally factor 15?

For compiled versions of Shor's algorithm, yes. https://arxiv.org/abs/1202.5707
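For context, the classical scaffolding around Shor's algorithm for N = 15 is tiny; only the period-finding step (brute-forced below) is what the quantum computer is supposed to accelerate. A minimal sketch, with the period found classically:

```python
from math import gcd

# Classical sketch of the number theory behind Shor's algorithm for N = 15.
N, a = 15, 7                 # a = 7 is coprime to 15

# Find the order r of a modulo N by brute force.
# This is the step a quantum computer would speed up via period finding.
r = 1
while pow(a, r, N) != 1:
    r += 1

# r = 4 is even and a^(r/2) is not -1 mod N, so the gcds below
# yield nontrivial factors, exactly as in Shor's post-processing.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(r, p, q)  # period 4, factors 3 and 5
```

"Compiled" demonstrations like the linked paper essentially hard-wire knowledge of this structure into the circuit, which is part of why factoring 15 is not strong evidence of a scalable machine.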

QC can simulate molecules.

So, is there anyone besides Google with a capable quantum processor? Are they that ahead of everybody?

My understanding has been that neither Google nor anyone else really has a quantum processor.

That entire series on Google is really weird. They claim to have a 78-qubit computer; that would be the one this group is working on.

That should put them well into the quantum supremacy regime, and make for a great computer for chemistry and semiconductor research. Yet the article claims the group is still trying to achieve quantum supremacy, and there is no sign of results from the computer.

I imagine they have a computer that doesn't really work.

Microsoft doesn't have a quantum processor but Google has one.

As the saying goes, I'll believe their processor is doing quantum computing when someone like Scott Aaronson says so.

> I'll believe their processor is doing quantum computing

What do you mean? Quantum processors have been doing quantum computing for a while now. That is, initializing qubits to a known state, performing quantum gates on them, then performing measurements on the qubits. The validity of these experiments has been confirmed by state tomography.

I haven't kept up in the past ~3 years, but as of back then (and I have not heard to the contrary thus far) there were serious doubts on whether anyone has gotten quantum speedup or not.


Getting a quantum speedup and doing quantum computing are different things. Also, your link refers to a quantum annealer not a universal quantum processor.

Microsoft has one. They even offer a primer course on Brilliant.org to teach this to inquisitive souls.

Do they have one or they let you kinda-sorta simulate using one like IBM?

“Neven says that Google’s best quantum chips have recently been improving at an exponential rate.”

That may be the real news here. Are there any other sources about this?

So how many qubits are they up to? And how fast is that number increasing?

Comparing to classical like this is kind of dumb since quantum is known to be exponentially faster. Or doubly or whatever. Just stop comparing and tell us how many bits.

Double exponential growth is crazy fast, but some problems grow much faster, following the hyperoperation sequence's growth rate. Knuth wrote a fun paper in Science in 1976 [1]. (Easy and interesting, but sadly my link is to a paywall.)

That was the first place I saw his arrow notation to describe some of these numbers.

  5↑3 = 5*(5*5)
  5↑↑3 = 5↑(5↑5)
  5↑↑↑3 = 5↑↑(5↑↑5)
  5↑↑↑↑3 = 5↑↑↑(5↑↑↑5)
These numbers get BIG; the up-arrow notation can express the Ackermann function.

[1] https://science.sciencemag.org/content/194/4271/1235
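The recursion behind the arrows is only a few lines. A sketch (my own `up` helper, not from the paper), usable only for tiny inputs since the results explode immediately:

```python
def up(a, n, b):
    """Knuth's up-arrow a ↑^n b: n arrows, defined by nested recursion."""
    if n == 1:
        return a ** b                    # one arrow is plain exponentiation
    if b == 0:
        return 1                         # base case of each hyperoperation
    return up(a, n - 1, up(a, n, b - 1)) # a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))

print(up(5, 1, 3))   # 5↑3 = 125
print(up(2, 2, 3))   # 2↑↑3 = 2^(2^2) = 16
print(up(2, 3, 3))   # 2↑↑↑3 = 2↑↑4 = 65536
```

Anything like 5↑↑3 already has thousands of digits, which is the point of the notation.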

If the law is true, and quantum computing is the key enabler of AGI, then the law is either the best or the worst news in human history. And we may be living in the final generations of our species. Not to be dramatic. I'm kinda hoping that Neven is an inferior prophet to Moore.

I fail to see why the advent of qbits has to be any more apocalyptic than the transistor.

"...there shall, in that time, be rumors of things going astray, errrm, and there shall be a great confusion as to where things really are, and nobody will really know where lieth those little things wi - with the sort of raffia work base that has an attachment. At this time, a friend shall lose his friend's hammer and the young shall not know where lieth the things possessed by their fathers that their fathers put there only just the night before, about eight o'clock." https://www.imdb.com/title/tt0079470/quotes?item=qt0979932

If Neven's law is true, then qbit computational power will grow at a doubly exponential rate rather than merely exponentially, as in Moore's law of bit pushers. That alone is apocalyptic.

But that's not the qbit's only trick. They have a talent for pattern recognition that bits can't match, and pattern recognition is the key to autonomous agents. For good or ill, such agents have the power to leverage human will enormously. That has a high potential for apocalypse, and doubly exponential qbits would make that happen much sooner.

Unfortunately, you can't just throw any computer program into a quantum computer and make it fast. You need to design a quantum algorithm, which will make the quantum computer faster, but only when solving this one specific problem. Not many such quantum algorithms have been invented yet.

Therefore, it is not obvious whether quantum computing will affect AI at all. Perhaps someone will figure out a new quantum algorithm for AI, but I don't think that's very likely to happen soon.

This is the hill that D-Wave died on with "Rose's Law." Make 3 small chips, make sweeping predictions on a log plot, call it a law. Yawn.

My biggest disappointment is in Scott Aaronson. He should know the difference between making something useful and making something hard to simulate.

>He should know the difference between making something useful and making something hard to simulate.

You're talking about the guy who showed the computational complexity of Boson Sampling.

> He should know the difference between making something useful and making something hard to simulate.

Please provide evidence of this claim.

I said should. He's branded himself as a critic of quantum computing... this take of his gave me whiplash. Is he... gushing?

