Are these actually even useful yet? Genuine question. I never managed to solicit an answer, only long explanations that seemed to say yes and no at the same time, depending on whom you asked.
The long explanations boil down to this: quantum computers are (so far, and given a million qubits) better than classical computers at problems that are, in disguise, simulations of quantum computers.
Also, last time I checked the record was around 80 qubits, and with every doubling of the qubit count the complexity of the system, the impurities, and the noise all increase. So it's even questionable whether there will ever be useful quantum computers.
Usually when people try to explain something about quantum computers, it feels like someone is trying to teach me what a monad is using the infamous example from some old Haskell docs.
I'm not proud of my ignorance, and I sure hope that if I eventually get it, it'll be very useful for me. At least that's how it worked out with monads.
(note, I have no idea how the braiding happens, or what it means, or ... the rest of the fucking owl, but ... the part about the local indistinguishability is an important part of the puzzle, and why it helps against noise ... also have no idea what's the G-factor, but ... also have no idea what the s-wave/p-wave superconductors are, but ... https://www.reddit.com/r/AskPhysics/comments/11opcy1/comment... ... also ... phew )
Monads are a bad way to describe a fundamentally simple thing.
Quantum computing is genuinely hard. The hardware is an extremely specialized discipline. The software is a very unfamiliar kind of mathematics and has basically nothing to do with programming. At best, it may one day be a black box that you can use to solve certain conventional programming problems quickly.
The issue isn't really impurities and noise; quantum error correction solves that problem. The issue is that the supporting technologies don't scale well. Superconducting-qubit computers like Google's have a bunch of fancy wires coming out of the top, basically one per qubit. You can't have a million wires of that size, or even a smaller size, so the RF circuitry that sends signals down those wires needs to be miniaturized and designed to operate near 0 K so it can live inside the dilution refrigerator, which is not easy.
Microsoft's technology is pretty far behind in capacity, but the scaling limitations are less significant and the error-correction overhead is either smaller or eliminated.
Based on what I've read, a lot of algorithmic work is required to even make them useful. New algorithms have to be discovered, and even then they will only solve a special class of problems. They can't do classical computing, so your NVIDIA GPU may never be replaced by a quantum GPU.
I wouldn't worry too much about finding new algorithms. The sheer power of QC parallelism will attract enough talent to convert any useful classical algorithm to QC.
It's a bit like the invention of the fast Fourier transform (which was reinvented several times...): O(n log n) is so much better than O(n²) that many problems in science and technology use the FFT somewhere in their pipeline, even when they're unrelated to signal processing, just because it's so powerful. For example, multiplication of very large numbers uses the FFT (?!).
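To make the FFT-multiplication aside concrete, here's a toy sketch of my own (real bignum libraries use a more careful variant, e.g. Schönhage-Strassen or a number-theoretic transform): treat each number's digits as polynomial coefficients, convolve them via FFT, then propagate carries.

```python
import numpy as np

def fft_multiply(a: int, b: int) -> int:
    # Digits as polynomial coefficients, least significant first.
    da = [int(d) for d in str(a)][::-1]
    db = [int(d) for d in str(b)][::-1]
    n = 1
    while n < len(da) + len(db):
        n *= 2
    # Pointwise product of FFTs = convolution of the digit sequences.
    fa = np.fft.fft(da, n)
    fb = np.fft.fft(db, n)
    coeffs = np.rint(np.real(np.fft.ifft(fa * fb))).astype(int)
    # Carry propagation turns the coefficient list back into a number.
    result, carry = 0, 0
    for i, c in enumerate(coeffs):
        total = int(c) + carry
        result += (total % 10) * 10**i
        carry = total // 10
    return result + carry * 10**len(coeffs)

print(fft_multiply(123456789, 987654321))  # same as 123456789 * 987654321
```

The convolution is the O(n log n) part; everything else is linear, which is why this beats schoolbook O(n²) multiplication for very large inputs.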
Quantum computing is a generalization of classical computing, so they CAN do classical computing. In practice, though, it'll be slower, more error-prone, and more expensive.
Any basic operation you can do with reversible computing on bits can be done with qubits.
Any basic operation you can do with normal computing on bits can be done with reversible computing on bits, provided you have enough ancillary bits to store the information that the irreversible normal operation would delete.
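A toy illustration of that ancilla trick (my own sketch, on classical bits): an irreversible AND becomes the reversible Toffoli (CCNOT) gate by keeping both inputs and XOR-ing the result into an ancillary bit, so no information is destroyed.

```python
def toffoli(a: int, b: int, ancilla: int = 0):
    """CCNOT: flips the ancilla iff a AND b are both 1. Inputs pass through."""
    return a, b, ancilla ^ (a & b)

# With ancilla = 0, the ancilla output holds a AND b.
state = toffoli(1, 1, 0)   # (1, 1, 1)
# Applying the gate twice undoes it -- the hallmark of reversibility.
again = toffoli(*state)    # (1, 1, 0)
print(state, again)
```

Since nothing is erased, you can always run the computation backwards, which is exactly the property qubit operations (unitaries) require.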
Hopefully not. Besides quantum-physics simulations, the only problems they solve are ones that should remain unsolved if we're to trust the integrity of existing systems.
As soon as the first practical quantum computer is made available, so much recorded TLS encrypted data is gonna get turned into plain text, probably destroying millions of people's lives. I hope everyone working in quantum research is aware of what their work is leading towards, they're not much better than arms manufacturers working on the next nuke.
This got me wondering how much of Tor, i2p, etc the NSA has archived. Or privacy coins like XMR.
I'm also curious. If you don't capture the key exchange but only a piece of ciphertext, is there a lower limit to the sample size required to attack the key? It feels like there must be.
AFAIK AES isn't vulnerable to any kind of statistical method or to quantum decryption, so you'd have to capture the key exchange. Not that that makes the situation all that much better.
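For scale, here's a quick back-of-envelope (my numbers, standard Grover analysis) on why symmetric crypto mostly survives: Grover's algorithm only gives a quadratic speedup on key search, roughly π/4·√(2ⁿ) oracle calls versus ~2ⁿ⁻¹ classical guesses on average.

```python
import math

def grover_queries(key_bits: int) -> float:
    # Optimal number of Grover iterations for an n-bit key search.
    return (math.pi / 4) * math.sqrt(2 ** key_bits)

print(f"AES-128: ~2^{math.log2(grover_queries(128)):.0f} quantum queries")
print(f"AES-256: ~2^{math.log2(grover_queries(256)):.0f} quantum queries")
```

So Grover effectively halves the key length; 2^64 queries for AES-128 is uncomfortable, but AES-256's effective 128 bits stays far out of reach.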
I vaguely remember reading an article about solving the correlation between quantum decoherence and the scaling of qubit numbers. I don't understand quantum computers, so take this with a grain of salt.
But here's what Perplexity says:
“Exponential Error Reduction: Willow demonstrates a scalable quantum error correction method, achieving an exponential reduction in error rates as the number of qubits increases. This is crucial because qubits are prone to errors due to their sensitivity to environmental factors.”
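The claimed scaling matches the textbook surface-code picture (this is a generic sketch, not Willow's actual data): below the threshold physical error rate, the logical error rate is suppressed exponentially in the code distance d, roughly p_L ≈ A·(p/p_th)^((d+1)/2).

```python
def logical_error_rate(p: float, p_th: float, d: int, A: float = 0.1) -> float:
    # Standard surface-code suppression formula; A and p_th are
    # device-dependent constants I'm choosing arbitrarily here.
    return A * (p / p_th) ** ((d + 1) // 2)

for d in (3, 5, 7):
    print(f"d={d}: logical error rate ~ {logical_error_rate(1e-3, 1e-2, d):.0e}")
```

The key point: growing the code (more physical qubits per logical qubit) makes the logical qubit exponentially better, provided the hardware stays below threshold.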
You're certainly allowed to get excited about it as long as you're patient and don't wildly overinflate the realistic timeline to net energy production. Similarly, nobody will stop you from hyping up quantum computation as long as you're not bullshitting usecases or lying about qubit scaling.
In the wake of cryptocurrency and AI failing to live up to their outrageous levels of hype, many people on this site worry that the "feel the AGI" crowd might accidentally start feeling some other, seemingly-profitable vaporware to overhype and pump.
They can fundamentally break most asymmetric encryption, which is a good thing iff you want to do things that require forging signatures. Things like jailbreaks Apple can't patch, decryption tools that can break all E2E encryption, being able to easily steal your neighbor's Facebook login at the coffee shop...
Come to think of it, maybe we shouldn't invent quantum computers[0].
[0] Yes, even with the upside of permanently jailbreakable iPhones.
You can't use Shor's algorithm with current quantum computers.
But if we got bigger and better quantum computers, we could use Shor's algorithm, and that would in fact break the crypto behind HTTPS, SSH, smart cards, and effectively every other form of asymmetric crypto in use.
There is a question of how likely bigger and better quantum computers are. A decent case can be made that they're unlikely to grow fast. But it goes too far to say that Shor's algorithm is useless just because current quantum computers aren't good enough. You can't dismiss the possibility of quantum computer growth out of hand.
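For the curious, here's a toy sketch of the classical scaffolding around Shor's algorithm. The quantum part's only job is finding the period r of a^x mod N; in this sketch I replace it with brute force, which is exactly the step that doesn't scale classically.

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    # Stand-in for the quantum subroutine: smallest r > 0 with a^r = 1 (mod N).
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N: int, a: int) -> int:
    g = gcd(a, N)
    if g > 1:
        return g  # lucky: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        raise ValueError("odd period, pick another a")
    f = gcd(pow(a, r // 2, N) - 1, N)
    if f in (1, N):
        raise ValueError("trivial factor, pick another a")
    return f

print(shor_classical(15, 7))  # prints 3, a factor of 15
```

Everything here runs in polynomial time except `find_period`; a large quantum computer would do that step efficiently, which is the entire threat to RSA-style crypto.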
Useful exclusively for generating random numbers, just like every other "quantum computer" (at least the ones publicly announced).
Each "quantum" announcement will make it sound like they have accomplished massive scientific leaps but in reality absolutely no "quantum computer" today can do anything other than generating random numbers (but they are forced to make those announcements to justify their continued funding).
I usually get downvoted for making this statement (of fact), but please know that I don't hate these researchers or their work, and I generally hope their developments turn into a real thing at some point (just like I hope fusion eventually becomes a real / net-positive thing).