I’m not a physicist, but it seems to me that it’s still too early to say things like “when a quantum computer with enough qubits is available, factoring large integers will become instant and trivial.” Some experts still doubt whether large-scale quantum computing is possible at all. You could not say that about classical computing before the transistor was discovered. And silicon-based computers weren’t preceded by a decade of hype about how they were about to blow vacuum tubes out of the water just as soon as the remaining engineering problems were worked out.
I think P ≠ NP is a better analogy. Is it true? Well, most people with an interest in the problem think so, but we don’t know. This announcement strikes me as about as significant as Amazon releasing some tutorials on computational complexity and giving out a few PhD scholarships to people working on P ≠ NP. Maybe this is what leads to the invention of the “quantum transistor,” but it’s too early to say that integer factorisation will become trivial.