
Many commenters are comparing this to the time-sharing era of classical computing. There is an important difference: we knew at the time that classical computers were useful, even though many people failed to predict that they would become millions of times cheaper and smaller.

I’m not a physicist, but it seems to me that it’s still too early to say things like “once a quantum computer with enough qubits is available, factoring large integers will become instant and trivial.” Some experts still doubt that scalable quantum computing is possible at all [1]. The same could not be said of classical computing, even before the transistor was invented. And silicon-based computers weren’t preceded by a decade of hype about how they were about to blow vacuum tubes out of the water just as soon as the remaining engineering problems were worked out.

[1]: https://arxiv.org/abs/1908.02499
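To make the factoring claim concrete: even naive classical trial division takes time roughly proportional to sqrt(N), i.e. exponential in the bit length of N, and the best known classical algorithms are still superpolynomial, whereas Shor's algorithm on a large fault-tolerant quantum computer would run in polynomial time. A minimal sketch of the naive approach (for illustration only; real factoring records use far more sophisticated methods):

```python
def trial_division(n: int) -> list[int]:
    """Return the prime factorization of n as a sorted list.

    Runtime grows like sqrt(n) -- exponential in the bit length of n.
    Shor's algorithm would instead be polynomial in the bit length,
    which is the gap driving the hype.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # remaining cofactor is prime
    return factors

print(trial_division(15))           # a small semiprime: instant
print(trial_division(2 ** 31 - 1))  # a 31-bit prime: still fine at this size
```

Double the bit length a few times and this approach becomes hopeless, which is exactly why 2048-bit RSA moduli are considered safe against it.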

I think P ≠ NP is a better analogy. Is it true? Well, most people with an interest in the problem think so, but we don’t know. I think this announcement is about as significant as Amazon releasing some tutorials on computational complexity and giving out a few PhD scholarships for people working on P ≠ NP. Maybe this is what leads to the invention of the “quantum transistor,” but it’s too early to say that integer factorisation will become trivial.
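For what it's worth, factoring also illustrates the search-versus-verification asymmetry behind the P ≠ NP question: checking a claimed factorization is trivial, while finding one is, as far as anyone knows, hard classically. A minimal sketch (the specific numbers are just illustrative):

```python
from math import prod

def verify_factorization(n: int, factors: list[int]) -> bool:
    """Check that the claimed factors are nontrivial and multiply back to n.

    Verification is polynomial time; *finding* the factors of a large n
    is the direction no known classical algorithm does efficiently.
    This easy-to-check / hard-to-find gap is the essence of NP.
    """
    return all(f > 1 for f in factors) and prod(factors) == n

# Multiplying two large numbers is instant; recovering them
# from the product alone is the hard direction.
p, q = 1000003, 1000033
print(verify_factorization(p * q, [p, q]))  # True
print(verify_factorization(p * q, [p, p]))  # False
```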
