Symmetric cryptography would be safe even with quantum computers (Grover's algorithm just halves the effective key length of the cipher, which can be compensated for by doubling the key length). Classical asymmetric cryptography would be in danger, but there are alternative systems for which no quantum attacks are known. They can be a bit unwieldy to use and have much larger key sizes than what we currently use, but they are hopefully quantum-proof. The Lamport signature scheme is probably one of the easiest of these methods to understand.
Not all hope is lost for cryptography even with quantum computers.
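To make the Lamport scheme concrete, here's a minimal sketch of one-time Lamport signatures over SHA-256. The helper names (`keygen`, `sign`, `verify`) are mine, not any standard API, and this is illustrative only, not production code:

```python
# One-time Lamport signature sketch (illustrative, not production code).
# Private key: 256 pairs of random 32-byte values, one pair per digest bit.
# Public key: the hashes of those values. Signing reveals one value per
# message-digest bit; verification re-hashes the revealed values.
import hashlib
import os

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(message: bytes):
    digest = H(message)
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]

def sign(message: bytes, sk):
    return [sk[i][bit] for i, bit in enumerate(bits_of(message))]

def verify(message: bytes, sig, pk) -> bool:
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(bits_of(message)))
```

Note the "one-time" caveat: each key pair may sign only a single message, since every signature leaks half the private key. This is also why the scheme is unwieldy in practice: keys and signatures are kilobytes in size.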
Crypto is interesting and quantum computers do have a part in it, but it's one of the least interesting things to talk about when it comes to what quantum computers can do.
It's like the advent of the regular computer, when the application everyone talked about was how it would be able to do accounting. True enough, but there are so many other interesting things it can do.
To offer some fodder for further discussion, how about molecular and atomic simulation? Wouldn't it be nice if we could just simulate the behavior of a molecule to predict what useful properties it has? One could imagine just exhaustively searching for molecules of interest on a quantum computer.
Or how about using a quantum computer to bootstrap itself? Using a QC to design a better QC is well within the realm of its utility.
Or how about solving some of these nasty computer science problems that rear their heads everywhere, like graph coloring (or any of its equivalent incarnations)?
Factoring integers and searching boxes are really interesting developments in the theory of quantum computation, but the above stuff, to me, sounds a lot cooler.
For quantum computers to solve NP-complete problems in polynomial time would be highly surprising and seems too good an outcome, especially considering the work showing exponential slowdowns on simple problems that defeat local search algorithms like quantum annealing (QA).
That said, there'll likely be many practical problems for which quantum annealing provides speed-ups that vastly outstrip classical computers.
We already do this with classical computers to a significant extent. It requires a lot of approximations, but it works fairly well for several useful applications, such as drug/protein binding prediction. I am really looking forward to quantum co-processors to help with this kind of calculation.
What about asymmetric crypto schemes executed on a quantum computer? All the discussions I've heard so far assume the attackers have a quantum computer while everyone else has classical computers. To me, this seems like an unreasonable assumption. Won't there be a way to use a quantum computer to do asymmetric encryption in a way that is hard even for another quantum computer to reverse?
(Remember that "computers" now includes your phone. I'm not sure how soon quantum chips come to phones...)
Not all asymmetric-key cryptography is in danger either. Lattice-based and McEliece encryption schemes have no known weaknesses to quantum attacks.
What about creating cryptography? Wikipedia has a few interesting applications:
I'm not a cryptographer - are security people excited about building all sorts of weird quantum protocols?
> “Then you let the computer check all possible solutions essentially—or a very large combination of them—and come back with an answer,” he says. In a quantum computer, there’s no mathematician cracking the problem, he says. “The laws of physics crack the problem for you.”
That sounds more like an NTM than a quantum computer. An NTM would turn its owner into a God.
> Superposition is the mind-bending observation that a particle can be in two states at the same time. Bring out your ruler to get a measurement, however, and the particle will collapse into one state or the other. And you won’t know which until you try, except in terms of probabilities.
They're not probabilities, because probabilities are real numbers; the coefficients are complex. There's a probability monad, for example, but no quantum monad:
The article sort of mentions it later on when they quote Scott Aaronson, but I suspect the author didn't understand and mentally reverted back to a probability model.
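The difference between complex amplitudes and real probabilities can be seen in a toy calculation. With a classical fair coin, flipping twice still leaves you at 50/50; with amplitudes, the two paths to |1> can cancel. Here's a hand-rolled sketch (the 2-amplitude tuple representation is mine, no library assumed):

```python
# Applying a Hadamard twice to |0> returns |0> with certainty, because
# the two paths leading to |1> have opposite-sign amplitudes and cancel.
# No mixture of real, non-negative probabilities can do that.
import math

def hadamard(state):
    a, b = state  # amplitudes of |0> and |1>
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = (1.0, 0.0)        # start in |0>
state = hadamard(state)   # (1/sqrt2, 1/sqrt2): 50/50 if measured now
state = hadamard(state)   # the |1> contributions cancel: back to |0>

p0, p1 = abs(state[0]) ** 2, abs(state[1]) ** 2
```

Measuring after one Hadamard gives 0 or 1 with equal probability, yet after two Hadamards the outcome is 0 every time. That interference is exactly what a probability model can't capture.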
I wasn't able to dig up too much, but the introduction to a paper by some Google folks provides a bit of insight into what might be possible:
> In particular, quantum simulation of molecular energies, which enables numerically exact prediction of chemical reaction rates, promises significant advances in our understanding of chemistry and could enable in silico design of new catalysts, pharmaceuticals, and materials.
edit: link to talk
That's a bit like insurance: Your house hasn't burned down yet. Great, here is another insurance premium for you.
In reality so-called quantum cryptography is completely impractical. Imagine an Internet where you can't send messages to people more than a few hundred kilometers away, with no wifi and no mobile Internet and which only works if you have pre-shared keys. That's the Quantum Internet (which apparently the EU just decided to fund with billions).
> No. In reality so-called quantum cryptography is completely impractical
What about building quantum safe protocols? Is that as a field any more interesting and/or viable?
Yes. Plenty of stuff happening in that field.
The challenge is: right now we can choose between probably safe but impractical (McBits, SPHINCS) and practical but with big question marks about how safe they are (NewHope, SIDH). The problem is that we only really trust crypto algorithms once they have been tested and researched over a long time, and lots of these schemes are pretty new.
> That's called post-quantum cryptography, not quantum cryptography.
As I understand it, "quantum safe" and "post-quantum" describe the same thing.
I never used the term 'quantum cryptography'
Really, it's not that much different from how things work now. When we send a message around the world, it's going through dozens of computers to get there. And wifi can always be an endpoint extension.
But: Nondeterministic Turing Machine? What? How can it be nondeterministic, and still be a Turing Machine? Aren't they pretty much deterministic by definition?
Scott Aaronson has a good introductory article in Scientific American, and he also wrote a short, accessible article aimed at high school students in 2002.
Then there's Berkeley's CS 191 Qubits, Quantum Mechanics and Computers syllabus and lectures.
Tel Aviv University has lecture notes for the 1997 Quantum Information Processing course online. These are old, but the information is still fundamentally accurate if what you're looking for is a breadth-wise introduction to the topic.
There's the paper, From Cbits to Qbits: Teaching Computer Scientists Quantum Mechanics on arXiv.
I assume you have a computer science background; if that's the case, you might want to start with for light reading on the meta between classical and quantum computing, then dive into one of the lecture notes series. The first two resources are much more digestible; if you're looking for something to read in 10 minutes or so and come away with a basic understanding of the field, those are what you want to read.
(I don't have a stance on D-Wave's legitimacy, but the article is illustrative either way).
The ACM article is much better.
And PBS Infinite Series, also on YouTube, has some good mathematical explanations of quantum computation and relevant algorithms.
1. You know how there's a lot of different hardware implementations for computers? There's Babbage engines, there's the silicon that we use every day, once upon a time there were vacuum tubes and such... we get a lot of freedom because the definition of computation, e.g. the definition of a Turing machine, just involves a sort of abstract manipulation of symbols but does not say anything about how those symbols are embodied in reality. The same bits that are high/low voltages in your CPU or copper cable can be flashes of light in a fiber-optic cable or aligned spin-domains in your spinning hard-drive or whether current can flow through a channel in your flash SSD. The oldest trick in the book is to represent one set of "logical" bits as a different pattern of "physical bits" (which then get represented their own way downstream) in order to correct errors or transmit heavily biased signals without net voltage bias or whatever.
Same with quantum computers. We don't know which hardware is going to scale so we've got spin qubits and superconducting flux qubits and cavities with super-high-quality mirrors pointing inward which trap photons, and all sorts of other qubits. But you can think of these as small little isolation chambers which must be shielded from each other and the rest of the world, maybe microscopic, perhaps all in a row. We have many candidates for what goes in the box and how the following operations are done.
There are some possibilities for alternative designs for "quantum computers" that depart from the conventional design; for example, D-Wave is trying to do the quantum annealing thing, with lots of these things all connected. But researchers like the following approach because it is very concrete and has separable concerns which can be separately addressed.
2. There are some fundamental operations which these little rooms need to be able to do.
First off, we have the entanglement operations. These need to be able to shove these rooms together so that they affect each other. When shoved together, what matters is a set of 4 energy levels, E00, E01, E10, E11, of how much interaction energy there would be if both bits were forced to 0, if one bit was forced to 0 and the other was forced to 1, etc -- and the absolute value of these energies (not just their differences) also matters relative to the baseline energy B that all the other qubits experience while this happens.
The shoving operation is actually allowed to be pretty much anything that generates a "nontrivial" set of E00, E01, E10, E11, because it's the hardest thing to get right and yet only one little property of it ("quantum entanglement") is really needed.
The next operations are the room-phasing operations. These apply to any room by itself, and come in two forms: you can either raise the energy of the room relative to all of the other rooms (which only affects things if this room is entangled with the other rooms), or you can create an energy difference between the 0 state and the 1 state. It turns out that you can think of all of these as rotations of a sphere, so a very common model is to have "pulses" of light or magnetism or whatever in two different physical directions, with very precise control over how long each pulse lasts, to give full ability to twist these spheres any way you like.
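The "timed pulses as rotations" idea can be sketched in a few lines. A resonant pulse of duration t rotates the qubit's sphere by an angle proportional to t; the `x_rotation` helper below is a hand-rolled illustration of a rotation about one axis, not any real control API:

```python
# Sketch: a pulse of angle theta about the X axis acts on the (|0>, |1>)
# amplitudes as a 2x2 rotation. Two quarter-turn (pi/2) pulses compose
# into a half-turn (pi) pulse, which flips |0> to |1> -- the quantum NOT.
import math

def x_rotation(state, theta):
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

state = (1.0, 0.0)                       # |0>
state = x_rotation(state, math.pi / 2)   # halfway: 50/50 superposition
state = x_rotation(state, math.pi / 2)   # a full pi rotation in total
p1 = abs(state[1]) ** 2                  # probability of reading out 1
```

This is why pulse timing needs to be so precise: the rotation angle is set by the pulse duration, so a pulse that runs a few percent long leaves the sphere slightly over-rotated, which shows up directly as error in the computation.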
The final operation is the readout operation, which requires "blasting" the qubit somehow into some really big system which can report and process the measurement. The nice thing about blasts is that there is a nice chunk of theory which says that readouts don't actually have to happen before any particular time, though you might need them to happen after some entanglement step or so. There are quantum versions of all of the logical gates which can be implemented on this thing if it is too hard to blast away the room to get a bit of classical information and then connect that to all of these lasers or whatever that are manipulating this system from these different directions.
The "blasts" are certainly different from the "pulses": the "pulses" need to have basically no real input or output; they just need to change the local environment a little bit. The "blasts" need to hit the specific device slightly harder in some way. The technical difference is that, examining the outcome of a pulse, you should not have any idea whether the qubit is more 0 or more 1 or anything else about it; after the blast, you should have total knowledge of whether the qubit is 0 or 1. So usually these two need to exploit some sort of very different physics.
3. So, how the computers are actually used: we get four or more of these boxes together and try to figure out a routine to start the first four rooms in the |0000> state, move them to the |1111> state representing the number 15, and then see whether we can run the quantum factoring algorithm to find out that fifteen is three times five. So there is some fussing around with the thing as it's brought to ultra-cold temperatures, then there are some time-controlled pulses that do these bit rotations, some lower-finesse entanglement steps where the particles are somehow connected for some set times, then maybe some other schedule of more pulses... and then finally your readout lasers blast the thing, you get 4 bits out of it, and we write down what binary number we saw. We repeat a lot and we find out, whoa, 40% of the time this was 3, and 40% of the time this was 5, and 20% of the time it was something else. We then daydream of the day when this 20% becomes low enough that we can run a quantum error-correcting code on top of this thing, so that we can use three times as many qubits but the error will drop rather than increase.
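That repeat-and-tally loop can be mimicked classically: each readout draws one 4-bit string with probability equal to the squared amplitude. The final state below is made up to match the 40/40/20 numbers above (it is not the output of any actual factoring run):

```python
# Mimicking the readout step: sample classical bit-strings from the
# squared amplitudes of a (made-up) final 4-qubit state, then tally how
# often each number shows up across many repeated runs.
import collections
import random

random.seed(0)  # seeded so the tallies are reproducible

# Toy final state over the 16 basis states |0000>..|1111>:
# probability 0.4 on 3, 0.4 on 5, and 0.2 of "something else" (here, 0).
amps = [0.0] * 16
amps[3], amps[5], amps[0] = 0.4 ** 0.5, 0.4 ** 0.5, 0.2 ** 0.5

def readout(amps):
    # Measurement: pick one basis state with probability |amplitude|^2.
    probs = [abs(a) ** 2 for a in amps]
    return random.choices(range(len(amps)), weights=probs)[0]

counts = collections.Counter(readout(amps) for _ in range(10000))
# counts[3] and counts[5] each hover around 4000 of the 10000 runs.
```

Of course the classical version had to store all 16 amplitudes explicitly; the quantum hardware doesn't, which is the whole point once the register grows past a few dozen qubits.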
4. Just to get some ideas of what the hardware might look like inside this room, let me give two examples.
In flux qubits, which are maybe coming up on 20 years old or so, there is a little superconducting loop which holds the qubit as a superposition of currents going clockwise and counterclockwise around it. The single-qubit pulses are accomplished by firing microwaves at the thing, while the actual readout is done by another superconducting loop in close proximity called a SQUID, which is sensitive to electromagnetic induction: you measure your ability to fire a bunch of electrons through the SQUID in order to figure out whether the current in the flux qubit is going clockwise or counterclockwise. By contrast, the "bringing the rooms together" operation really needs to bring these two micrometer-sized devices into inductive contact so that they start influencing each other. And of course nothing is superconducting at room temperature, so this all happens in some big refrigerator.
Another promising thing in recent years has been the "NV qubits," which consist of a nitrogen atom sitting in one of the carbon locations of a diamond's crystal lattice, a "defect" in the otherwise perfect lattice. These can be made to 'fluoresce' or flash, so you can just see where they are and then deposit some structures on top of wherever they turned out, and the key is that you get a nice spin state in the qubit which has a nicely long lifetime even at room temperature, so you don't need the elaborate cooling setup that you need for superconductors -- and it interacts nicely with microwave pulses to do the single-qubit manipulation. There are some tricky problems with readout just because diamond has a very high index of refraction, which means light moves very slowly in it and gets bent a lot, but the bigger problems occur with trying to get these things to interact with each other since you basically just have this diamond lattice and it has some nitrogens in it and you can't really move them relative to each other or insert some sort of spin-wires into the lattice that will connect the two spins or anything like that. So it's missing this most-important property of the entangling operation. My guess is that if one of these looked like a computer it would have to have these little diamond qubits doing this process via some elaborate setup existing above the chip, some constant source of short-lived microwave photons which could be entangled with these qubits and then with each other, then immediately measured and the measurement used to tweak the NV qubits, leaving only the residual entanglement between the NV qubits. But you would see the same pattern of a computer program as some timing of pulses that are fired at these systems, combined with some timing of these entanglements-by-photons or so.
This reminds me of an experiment I saw on Sixty Symbols in which they had encased a single H2O molecule inside a carbon molecule's "shell", and pelted it with radiation to see how it behaved. Apparently the water molecule thinks it's all by itself and isn't affected by things on the "outside" of the carbon molecule.
Each quantum computer is a special-purpose circuit implementing a single algorithm.
Because quantum computation works through destructive interference, there is no interaction during execution. You insert the input, run it through the circuit, and extract the result as classical output at some point. The whole setup is a science-experiment-looking gadget sitting in a lab (with the exception of D-Wave).
A universal quantum computer (aka a quantum Turing machine) is a concept used in computational complexity theory. I'm not aware of any suggested physical implementation of a universal programmable quantum computer. I don't even know if it makes any sense.
Programming the quantum computer means designing the circuit. Gates are reconfigured for each algorithm.
These models are computationally equivalent of course.
Say the observable universe has something like 10^100 subatomic particles total. I believe that by current estimates, this is an upper bound.
A quantum computer with 350 qubits would have 2^350 basis states, or more states than the total number of subatomic particles in the universe. A single "NOT" operation on one of those qubits would, in constant time, swap half of those amplitudes with the other half. I.e., the quantum computer would do way more work than there are particles in the observable universe, for every single one of its operations.
I just don't see how the universe will let nerds on some tiny little out-of-the-way planet get away with that.
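The "one NOT touches everything" point is easy to see in a toy statevector simulation (hand-rolled, no library assumed): applying NOT to qubit k swaps every pair of amplitudes whose indices differ only in bit k, so all 2^n entries move.

```python
# NOT on qubit k of an n-qubit state: swap each pair of amplitudes whose
# indices differ only in bit k. Every one of the 2^n entries is touched,
# which is exactly why classical simulation blows up exponentially.
def apply_not(state, k):
    out = list(state)
    for i in range(len(state)):
        if not (i >> k) & 1:        # visit each index pair once
            j = i | (1 << k)        # partner index with bit k set
            out[i], out[j] = state[j], state[i]
    return out

n = 4
state = [0.0] * (2 ** n)
state[0b0000] = 1.0                 # start in |0000>
for k in range(n):                  # NOT every qubit in turn
    state = apply_not(state, k)
# All the amplitude now sits on |1111>, i.e. index 15.
```

At n = 4 that's 16 entries per gate; at n = 350 it would be 2^350, which is the bookkeeping the physical qubits get "for free" and a classical machine cannot.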
QC demonstrably works for small numbers of qubits. I just wonder if it will ever scale to the point that large problems can be tackled with QC in a cost-effective manner.
Haven't these stories had this same headline for at least 10 years now?
Also, quantum cryptography mostly provides superior key establishment/management capabilities, not superior confidentiality guarantees. It's a misnomer to state that we can achieve "stronger encryption" by "leveraging entanglement states"; most of quantum computing is useful for cryptanalytic attacks right now, not cryptographic construction.