The Argument Against Quantum Computers and the Quantum Laws of Nature [pdf] (files.wordpress.com)
137 points by luu 48 days ago | 23 comments



Hi everybody, thanks for the nice discussion. Here is a very nice panel discussion about the Google supremacy claims, moderated by Sandy Irani, with Scott Aaronson, Dorit Aharonov, Boaz Barak, Sergio Boixo, Adam Bouland, Umesh Vazirani, and me (Gil Kalai): https://youtu.be/_Yb7uIGBynU (This is related to Sections 5 and 6 in my paper.)


Thanks for keeping the paper so accessible. More experts need to write in a way that's readable to non-experts to get them interested in the topic. It takes confidence in your ideas to be willing to keep them simple.


Are you aware of any work on holographic bounds on QC? Outside of a single paper by Davies and a dismissive comment by Aaronson, I can't find anything. It seems gravity could potentially be a 'sure-Shor separator'.


More accessible lecture from Kalai: Noise stability, noise sensitivity and the quantum computer puzzle – Gil Kalai – ICM2018 https://www.youtube.com/watch?v=oR-ufBz13Eg

How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation http://www.ma.huji.ac.il/~kalai/Qitamar.pdf In the paper, Gil Kalai presents four conjectures:

Conjecture 1: The process for creating a quantum error correcting code will necessarily lead to a mixture of the desired code word with undesired code words.

Conjecture 2: A noisy quantum computer is subject to noise in which information leaks for two substantially entangled qubits have a substantial positive correlation.

Conjecture 3: In any quantum computer at a highly entangled state there will be a strong effect of error-synchronization.

Conjecture 4: Noisy quantum processes are subject to detrimental noise.
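
For intuition about what Conjectures 2 and 3 would mean in practice, here is a toy Monte Carlo of my own (not Kalai's model; every parameter is made up) contrasting independent errors with synchronized, positively correlated errors at the same average per-qubit error rate:

  # Toy model: same marginal error rate p per qubit, but in the correlated
  # case all qubits share a common "bad event", so errors synchronize.
  # Hypothetical numbers throughout; illustration only.
  import numpy as np

  rng = np.random.default_rng(0)
  n_qubits, p, trials = 20, 0.05, 100_000
  threshold = 5  # error count a hypothetical code of this size could correct

  def burst_rate(correlated):
      if correlated:
          common = rng.random(trials) < 0.5               # shared bad event
          per_qubit_p = np.where(common, 2 * p, 0.0)[:, None]
      else:
          per_qubit_p = p                                 # independent errors
      flips = rng.random((trials, n_qubits)) < per_qubit_p
      return (flips.sum(axis=1) > threshold).mean()

  print("independent:", burst_rate(False))  # bursts rare -> correctable
  print("correlated: ", burst_rate(True))   # same mean rate, far more bursts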


Scott Aaronson didn't comment on this paper specifically, but gave an FAQ[0] on his blog (he believes Google actually attained quantum supremacy). See also this[1] from when the announcement was made.

Moreover, the naturalness argument is pretty strange. Of course complexity arguments are useless if big constants dominate the run time, but why is it justified to assume the constants are small enough? Is it to make our life simpler? Real life is seldom that simple.

[0]: https://www.scottaaronson.com/blog/?p=4317

[1]: https://www.scottaaronson.com/blog/?p=4372


Check this video from minutephysics about Shor's algorithm: https://youtu.be/lvTqbM5Dq4Q


Upvoting in the hope that someone will explain ... anything about this. The comments so far seem to imply a level of understanding I don't have.


Just to be clear, everyone agrees on the advantages of QC and no one disputes that Google has built a quantum computer. Let me try to paraphrase the arguments.

1. QC needs qubits. One logical qubit needs many (tens to hundreds of) physical qubits.

2. We need quantum error correction (QEC) to read cleanly from qubits. Useful algorithms, run with QEC, need on the order of a thousand logical qubits.

3. The current state of the technology has only 50-200 qubits, with a very low signal-to-noise ratio, and the noise is inherent in the system (you cannot engineer it away).

4. The current systems, specifically Google's, produce a sequence of numbers. It is possible for a low-complexity algorithm on a classical computer to produce similar sequences (i.e., to pass the test for a QC-produced sequence).

Point 4 is possible because reading from the qubits is noisy, and such noisy data can at best replicate a classical computer. So the current state of the art, Noisy Intermediate-Scale Quantum (NISQ) computers, is not useful for any computation, and we need to get to 1000+ qubits to achieve supremacy. Of course, not everyone agrees that the noise cannot be engineered away, or even that the noise is such a big deal.
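
To make point 4 concrete, here is a minimal numpy sketch of the statistic behind Google's claim, linear cross-entropy benchmarking (XEB). The "ideal circuit" is faked with a random exponential (Porter-Thomas-like) distribution instead of a real circuit simulation, so treat it as an illustration only:

  # Linear XEB fidelity: dim * mean(p_ideal[sampled bitstrings]) - 1.
  # Roughly 1 if samples come from the ideal circuit, roughly 0 if noise
  # has washed the output out to uniform. p_ideal below is a stand-in.
  import numpy as np

  rng = np.random.default_rng(1)
  n = 12                 # qubits; small enough to enumerate all 2^n outcomes
  dim = 2 ** n

  p_ideal = rng.exponential(1.0 / dim, size=dim)   # Porter-Thomas-like
  p_ideal /= p_ideal.sum()

  def xeb(samples):
      return dim * p_ideal[samples].mean() - 1.0

  good = rng.choice(dim, size=50_000, p=p_ideal)   # "quantum" samples
  bad = rng.integers(0, dim, size=50_000)          # fully depolarized noise

  print("ideal samples XEB:", xeb(good))   # close to 1
  print("noise-only XEB:  ", xeb(bad))     # close to 0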


Thank you! That's extremely helpful.


If anyone is curious about how qubits actually work, I wrote an article explaining them from a multiverse perspective: https://medium.com/@dangish/the-quantum-computer-explained-u...


My intuition points to Kalai being right. If that's the case, I believe the greatest contribution of quantum computing will be advancing our understanding of QM and thermodynamics. Are we trying to build a modern version of Maxwell's demon?

edit: I really don't want to discredit the QC effort. Proved or disproved, there is a lot to learn.


As a practitioner in the field of quantum computation, my advice is to be very careful with intuition.

Quantum computers are among the most complex pieces of engineering. They generally require a relatively sophisticated understanding of mathematics and physics, much of which is very non-intuitive (e.g., high-dimensional Lie groups, random probability distributions of probability distributions, low-temperature thermodynamics, ...). Piecing it all together requires careful threading. I don't think Gil's notes represent the pinnacle of rigorous care, but they are a useful "punch list" of broad considerations worth talking about.

I am not questioning your or anybody's intuition directly or personally, but it's easy, especially for the tech-savvy software engineers who frequent HN, to jump to conclusions based on our intuitions about the mechanics and failure modes of (classical) systems and draw (baseless) conclusions about contemporary quantum ones.


I'm doing a thought experiment. What if Prof. Kalai is right? The field of quantum computation at this point has people who have attained titles like "Centennial Professor" and "Director of Engineering and Distinguished Scientist." People are on their way to making careers that will rival the luminaries of AI (another field that has not delivered even 1% of what it has promised). What would it take for these folks to acknowledge that it's not working out and find something more productive to do?


If there are losers in the event that robust quantum computation proves unrealizable, they will likely be individuals working on the commercialization of the technology, not those academically involved in fundamental research. Programs at Google, Rigetti, IBM, D-Wave, Q-CTRL, Zapata, Horizon QC, Xanadu, etc. are all existentially tied to quantum computing through their business models[1], and they would have a financial incentive to "hold out"[2] on, or not seriously entertain, opponents.

I think somebody like Scott Aaronson, who has made an academic career in computer science and quantum computing, would gladly accept reality even if it turned out not to be what he hopes, and would be entirely willing to stand on a platform and broadcast the truth. Aaronson is in the business of understanding fundamental truths about computation, not (as far as anybody knows) being a shill for big business.

[1] Disclaimer: I’m not deeply familiar business-wise with all the companies listed, so take my “existential tie” with a grain of salt.

[2] I’m not claiming that any of these companies would conduct business dishonestly in the event that they were “wrong”.


I'm sure your comment is written in good faith after some reflection. While I am unable to weigh in on the issue at hand, I would offer a few cautionary notes about the luminaries of a field being able to accept that it is wrong.

The Intelligence Trap[1] is an excellent book that, among other things, explains how being very clever can make it harder to get at the truth. Formidable intellects can patch any holes that appear in their worldview by constructing convincing and credible arguments.

"Science advances on funeral at time" paraphrased from Max Planck - the individuals he was thinking of were truth seeking super brains as well.

[1] https://www.goodreads.com/book/show/41817546-the-intelligenc...


> among some of the most complex pieces of engineering

Do you think it will end up in the "20 years from now" category like many other technologies? I'm on the fence regarding how fast we can actually advance; QC is tough.

I've seen the same things said about fusion energy. Fusion is fairly simple: you can do it in your garage right now if you are so inclined. Getting net-positive output, however, is truly pushing the limits of what we know and can build. Quantum computers have far more going on in comparison.


Right now QCs are advancing. It's possible to advance forever and still be up against an unfavorable asymptote you're unable to cross, of course, but year after year, qubit fidelities are increasing, error rates are dropping, qubit counts are increasing, and the zoo of quantum technologies (superconducting qubits, ions, neutral atoms, photons, and silicon dots... among others) is blossoming.

I’m “bullish” on the whole thing. In my opinion, it would be hugely unwise for anybody in the field right now to call it quits without seeing the technology through further. There are too many “interesting” developments happening.

I think that in about a year we will know the true nature of the Google supremacy claims as experts from many disciplines both stew and weigh in on the matter. If supremacy holds up (and continues to), that would be evidence against the "20 years away" sentiment.

I agree with what you said, QC has a lot more going on / going for it than fusion does. They both still reside in roughly the same “technological class”, but QC at least seems to be improving regularly with a combination of steady engineering as well as scientific breakthroughs.


Sometimes stuff that is "20 years from now" happens in 20 years. They're up past 50 qubits, and it looks like it might be another exponential growth process with a doubling time of ~4 years [0].

[0] https://s3.amazonaws.com/cbi-research-portal-uploads/2019/01...


Counting qubits is not a useful metric. Google actually had to go back down from 72 to 53 qubits to get their quantum supremacy experiment working. Also, none of these machines are error-corrected, so they cannot run useful quantum algorithms such as Shor's or Grover's. It is still an open question whether large error-corrected quantum computers are possible at all.


While trying to understand how lasers seem to "rectify" energy in space, I encountered some very unfamiliar language meant to uphold thermodynamics in this new (anisotropic?) regime. I suspect that the new vernacular could be used to describe some unintended consequences.

I think you are right in your assessment that there is understanding to be advanced.


Even if that's true, quantum computers and Shor's algorithm are very interesting to think about. Understanding its implications was one of those "wow, this is mind-blowing" moments of my computing education: https://en.m.wikipedia.org/wiki/Shor%27s_algorithm
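
Part of what made it click for me: the quantum computer's only job in Shor's algorithm is finding the period r of a^x mod N; turning that period into factors is ordinary number theory. A small sketch, with the period brute-forced classically as a stand-in for the quantum subroutine:

  # Classical post-processing of Shor's algorithm. find_period() stands in
  # for the quantum period-finding step (brute force here, which is exactly
  # the part that is exponentially hard without a quantum computer).
  from math import gcd

  def find_period(a, N):
      r, x = 1, a % N
      while x != 1:
          x = (x * a) % N
          r += 1
      return r

  def shor_factor(N, a):
      g = gcd(a, N)
      if g != 1:
          return g, N // g   # lucky: a already shares a factor with N
      r = find_period(a, N)
      if r % 2 == 1:
          return None        # odd period: retry with a different a
      y = pow(a, r // 2, N)
      if y == N - 1:
          return None        # trivial square root: retry
      return gcd(y - 1, N), gcd(y + 1, N)

  print(shor_factor(15, 7))  # (3, 5), the textbook example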


This article does not refer to this one but uses nearly the same title. What is the relation between them? https://spectrum.ieee.org/computing/hardware/the-case-agains...


To me, his argument seems to be that he doesn't think the errors in the system are independent, and therefore the fidelity is lower than claimed. This raises the question: what is the implication of the errors in the system being systematic?



