This is just a clever way to spin the fact that current growth is much slower than exponential into a prediction of much faster future growth, without any evidence. Or perhaps it's an in-joke among the physicists. Next time I have a really flat function, I'm going to fit it with a triple exponential, like so:
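(A tongue-in-cheek sketch with numpy/scipy - the data here is made up, and the point is that with small enough inner parameters a "triple exponential" fits anything flat:)

    import numpy as np
    from scipy.optimize import curve_fit

    def triple_exp(t, a, b, c, d):
        # a * exp(b * exp(c * exp(d*t))) -- nearly flat when b, c, d are tiny
        return a * np.exp(b * np.exp(c * np.exp(d * t)))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 50)
    y = 3.0 + 0.01 * rng.standard_normal(t.size)   # an essentially flat "trend"

    popt, _ = curve_fit(triple_exp, t, y, p0=[3.0, 1e-3, 1e-3, 1e-3])
    print(popt)  # headline: "growth is triply exponential!"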
Now, yes, this article sounds bullshitty in the extreme - it's worrying the extent to which it manages to never quite get around to any technical detail, except for helpfully explaining what exponential and double-exponential mean. Number of qubits? Which algorithm? Who knows? Is it even an algorithm that anyone cares about ("ran a calculation")?
Also, from long experience, I'm used to seeing the person with the "next big thing" (GPGPU, FPGA, custom ASIC, etc) always managing to creatively find mediocre algorithms on the CPU to compare it to.
However - even given all that skepticism - it's hard to dismiss the growth that they are rhapsodizing about as 'flat' (assuming it's real). I suppose we'll see.
The next exponential factor is stated to be the “rapid improvement of quantum processors”. Really? Is the number of qubits in true quantum computers actually growing exponentially, rather than steadily or, say, logarithmically? Moore’s Law was a law because it was so surprising that the semiconductor industry kept it going for so long. Is that really happening with quantum computers? I didn’t realize that breakthroughs in quantum computing were happening like that.
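(For concreteness, here's the back-of-the-envelope arithmetic, assuming - as the article seems to - that simulating n qubits classically costs ~2**n. You only get a doubly exponential curve if n itself grows exponentially in time; linear qubit growth gives plain old single-exponential:)

    # Assumption: classical simulation cost ~ 2**n for n qubits.
    for t in range(1, 6):
        n_linear = 10 * t     # qubit count growing linearly with time
        n_doubling = 2 ** t   # qubit count doubling each period (what a doubly
                              # exponential claim actually needs)
        print(f"t={t}: linear -> 2**{n_linear}, "
              f"doubling -> 2**{n_doubling} = {2 ** n_doubling}")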
You mean this? https://en.wikipedia.org/wiki/AWPP_(complexity)
For compiled versions of Shor's algorithm, yes. https://arxiv.org/abs/1202.5707
That should put them well into quantum supremacy territory, and make a great computer for chemistry and semiconductor research. Yet the article claims the group is still trying to achieve quantum supremacy, and there is no sign of results from the computer.
I imagine they have a computer that doesn't really work.
What do you mean? Quantum processors have been doing quantum computing for a while now. That is, initializing qubits to a known state, performing quantum gates on them, then measuring the qubits. The validity of these experiments has been confirmed by state tomography.
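(For anyone who hasn't seen it spelled out, that initialize/gate/measure cycle looks like this in a toy single-qubit statevector simulation - real hardware does this physically, and tomography checks the resulting state:)

    import numpy as np

    # Minimal single-qubit sketch: initialize, apply a gate, measure.
    state = np.array([1.0, 0.0], dtype=complex)                  # |0>
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
    state = H @ state                                            # (|0> + |1>)/sqrt(2)

    probs = np.abs(state) ** 2                    # Born rule: outcome probabilities
    outcome = np.random.choice([0, 1], p=probs)   # measurement picks 0 or 1
    print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}, measured {outcome}")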
That may be the real news here. Are there any other sources about this?
Comparing to classical like this is kind of dumb since quantum is known to be exponentially faster. Or doubly, or whatever. Just stop comparing and tell us how many qubits.
That was the first place I saw Knuth's arrow notation used to describe some of these numbers.
5↑3 = 5*(5*5)
5↑↑3 = 5↑(5↑5)
5↑↑↑3 = 5↑↑(5↑↑5)
5↑↑↑↑3 = 5↑↑↑(5↑↑↑5), and so on.
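(The whole tower is one recursion - a minimal sketch below; anything past toy inputs overflows astronomically fast, which is rather the point:)

    # Knuth's up-arrow notation: up(a, n, b) computes a ↑^n b.
    def up(a, n, b):
        if n == 1:
            return a ** b    # one arrow is plain exponentiation
        if b == 0:
            return 1         # base case of the recursion
        return up(a, n - 1, up(a, n, b - 1))

    print(up(5, 1, 3))   # 5*5*5 = 125
    print(up(2, 2, 3))   # 2↑(2↑2) = 2**4 = 16
    # up(5, 2, 3) is 5**3125 -- already a couple thousand digits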
"...there shall, in that time, be rumors of things going astray, errrm, and there shall be a great confusion as to where things really are, and nobody will really know where lieth those little things wi - with the sort of raffia work base that has an attachment. At this time, a friend shall lose his friend's hammer and the young shall not know where lieth the things possessed by their fathers that their fathers put there only just the night before, about eight o'clock." https://www.imdb.com/title/tt0079470/quotes?item=qt0979932
But that's not the qubit's only trick. Qubits have a talent for pattern recognition that bits can't match, and pattern recognition is the key to autonomous agents. For good or ill, such agents have the power to leverage human will enormously. That has a high potential for apocalypse, and doubly exponential qubits would make it happen much sooner.
Therefore, it is not obvious whether quantum computing will affect AI at all. Perhaps someone will figure out a new quantum algorithm for AI, but I don't think that's very likely to happen soon.
My biggest disappointment is in Scott Aaronson. He should know the difference between making something useful and making something hard to simulate.
You're talking about the guy who showed the computational complexity of Boson Sampling.
Please provide evidence of this claim.