"As opposed to traditional computer bits, which can encode distinctly either a one or a zero, qubits can encode a one and a zero at the same time. This property, called superposition, along with the ability of quantum states to "tunnel" through energy barriers, will some day allow quantum computers to perform optimization calculations much faster than traditional computers."
I get it, a qubit can store multiple states, but if someone could give a little more on how it does that, and why that means quantum computers will be faster someday, that'd be excellent.
Part of the answer is that "more than one state at a time" applies to the whole computer, not to individual bits. But even so, the "more than one state at a time" framing isn't really the key either.
To see this, instead of a quantum computer, let's consider a probabilistic computer. Rather than having one state, the computer at any time has a probability distribution over states -- it can "be in multiple states at once", each with some probability. Only at the end of a computation do you "collapse" it back into a single value. Except that such a computer is equivalent to an ordinary computer equipped with a TRNG, which is not much more powerful than an ordinary computer at all.
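The probabilistic computer described above can be sketched in a few lines. This is my own toy illustration (the function names are made up, not from any library): the machine's state is a probability distribution over classical states, transitions spread probability around, and "collapsing" is just sampling.

```python
import random

def step(dist, transition):
    """Apply a stochastic transition: each state spreads its
    probability over successor states."""
    out = {}
    for state, p in dist.items():
        for nxt, q in transition(state).items():
            out[nxt] = out.get(nxt, 0.0) + p * q
    return out

def collapse(dist):
    """Sample a single outcome -- this is exactly what an ordinary
    computer with a TRNG can do, which is why the probabilistic
    computer is no more powerful."""
    states, probs = zip(*dist.items())
    return random.choices(states, weights=probs)[0]

# A fair coin flip as a transition: any state -> {0, 1}, prob 1/2 each.
coin = lambda s: {0: 0.5, 1: 0.5}

dist = {0: 1.0}          # start in state 0 with certainty
dist = step(dist, coin)  # now "in both states at once"
print(dist)              # {0: 0.5, 1: 0.5}
print(collapse(dist))    # a single 0 or 1
```

Note that the distribution itself is never observed; a run of the machine only ever yields one sampled outcome.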
Quantum computers are often described as "trying everything at once", as if they were nondeterministic Turing machines, but they are not; perhaps, in some sense, they do try everything at once, but they cannot then automatically pick out whichever possibility worked. A quantum computer only "tries everything at once" in the same sense the probabilistic computer does. The probabilistic computer can only pick out the right answer if you somehow set things up so that the right answer has high probability and the wrong ones don't; it can't say "this answer is right, kill all the other possibilities now!".
The quantum computer is the same way. What makes quantum computers faster is that instead of probabilities, they use amplitudes. Each state -- well, what I was calling states above; really they would be "basis states" -- has an associated amplitude rather than a probability. Amplitudes yield probabilities -- the probability is the square of the absolute value of the amplitude -- but they do more; in particular, they can be negative. Indeed, they can be complex, but the fact that they can be negative is usually enough. This allows one to do many more tricks with them, but in particular it means it's easier to set things up so that the right answer comes out with high probability, because unlike probabilities, amplitudes can cancel out. All quantum algorithms, as I understand it, are about using appropriate tricks to get the amplitudes for the wrong answers to mostly cancel out.
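The cancellation point can be demonstrated concretely. Here is a minimal sketch of my own (not from the comment): a classical fair "coin" mixes probabilities and stays mixed, while the Hadamard gate mixes amplitudes, which can be negative and so cancel when applied twice.

```python
import numpy as np

# Probabilistic coin: stochastic matrix acting on a probability vector.
coin = np.array([[0.5, 0.5],
                 [0.5, 0.5]])

# Hadamard gate: unitary matrix acting on an amplitude vector.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

p = np.array([1.0, 0.0])  # classical: definitely state 0
a = np.array([1.0, 0.0])  # quantum: amplitude 1 on basis state |0>

# Apply the mixing step twice in each model.
p2 = coin @ coin @ p
a2 = H @ H @ a

print(p2)              # [0.5 0.5] -- still a 50/50 mixture
print(np.abs(a2)**2)   # [1. 0.]  -- the |1> contributions cancelled
```

After two coin flips you are still maximally uncertain, but after two Hadamards the negative amplitude exactly cancels the positive one and you get |0> back with certainty; that interference is what quantum algorithms engineer at scale.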
So you can do processing really quickly because you can explore a bucketload of possibilities at once.
The physical reality of the phenomenon is hard to grasp, and I'm still not totally sure I get/buy it. See http://en.wikipedia.org/wiki/Double-slit_experiment
I wonder whether computing, and other sciences, will become so advanced that fewer and fewer people will be able to comprehend the fundamentals, or whether there will come a time when things like quantum mechanics are a high-school subject.
Teaching angular momentum and Clebsch-Gordan coefficients seems pointless. There should be a standard class on probability theory and statistics, and quantum theory should be introduced as a generalization of probability theory. This is the right foundation for the field, not vague and basically meaningless statements like "particles are both particles and waves".
However, that probability background is useless if they never actually calculate any probabilities. Angular momentum and Clebsch-Gordan tables are the simplest quantum systems to calculate that I'm aware of. Doing an infinite square well takes too much calculus. We could always give them the wavefunction, but then they're not really learning the physics part of it and it's just an applied probability lesson. Clebsch-Gordan and angular momentum calculations require only basic algebra, but they still give real results for honest experiments.
Of course, I'm probably biased, since a large amount of my research involves particle spin. What would you have the students calculate?
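To make the "only basic algebra" claim concrete, here is a minimal sketch of my own of the kind of calculation meant above (a real exercise would use full Clebsch-Gordan tables): combine two spin-1/2 particles prepared as |up, down> and ask for the total spin.

```python
from math import sqrt

# Clebsch-Gordan decomposition of the product state:
#   |up, down> = (1/sqrt(2)) |s=1, m=0> + (1/sqrt(2)) |s=0, m=0>
amp_triplet = 1 / sqrt(2)
amp_singlet = 1 / sqrt(2)

# Probabilities are squared amplitudes -- no calculus required.
p_triplet = amp_triplet ** 2
p_singlet = amp_singlet ** 2

print(p_triplet, p_singlet)  # 0.5 and 0.5: measuring total spin on
                             # |up, down> gives s=1 or s=0 with
                             # equal probability
```

A student who can square 1/sqrt(2) can check this against a real spin-correlation experiment, which is the point being made.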
Imagine a biologist describing a creature which has human-butterfly duality. If you measure its number of legs, it turns into a human. If you measure its number of wings, it turns into a butterfly. If you measure whether it can ask for a cracker, it turns into a human. If you measure whether it's brightly colored, it turns into a butterfly. It's constantly shifting back and forth between these two completely different species. Of course, it might be simpler to just call it a parrot.
The problem with many explanations of quantum mechanics is that they present light or matter as shifting between being a particle and being a wave. A common symptom of this is that people say light has a wave-particle duality, despite the fact that the properties they're describing are universal and not just related to the nature of light. A better description is to say that everything follows the Schrödinger equation and that, in certain limits, we see a set of properties that we refer to as particles while, in a different set of limits, we see properties that we tend to call waves. It's not that light switches between being a wave and a particle; the terms wave and particle are simply not complete enough to describe the properties that light possesses.
It's just 5 minutes. Well worth watching.
The important characteristic of quantum computation is not that it's faster or more parallel than classical computation, but that it's a generalisation of classical computation. The same way that classical mechanics describes a subset of quantum mechanics rather than disjoint phenomena, all current classical CPUs are technically "quantum" CPUs (the best kind of quantum CPUs!).
So, what does this mean?
Well, for all the software we have today, and for almost everything we know how to code today, absolutely nothing; in fact, most of what we can do on classical computers will be slower simply because it will be quite some time before a quantum computer is anywhere near as well-engineered as, say, a Core i7 – especially when you take into account that the properties of quantum mechanics make this engineering not only different but also significantly more difficult.
What we do get from the computational paradigm shift, however, is a broader set of operations which allow for a broader set of algorithms.
Take Grover's algorithm (the one mentioned in the article): http://en.wikipedia.org/wiki/Grovers_algorithm#Algorithm_ste...
If you understand the notation, what you'll see is a series of linear operations (quantum gates) being applied to vectors (qubits; classical bits, by contrast, are scalars) in a quantum circuit. These can all be used to describe any operation which would be performed on, say, an x86 core (albeit more verbosely), but they can also take advantage of quantum mechanical properties to manipulate non-classical information (which gets complicated to discuss).
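The "gates are linear operators on vectors" picture can be sketched directly in numpy. This is my own illustration under the usual conventions (not code from the article): an n-qubit state is a length-2^n complex vector, and a gate is a unitary matrix applied by matrix multiplication.

```python
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])        # reversible XOR

def basis(n_qubits, index):
    """Basis state |index> as a length-2^n vector."""
    v = np.zeros(2 ** n_qubits, dtype=complex)
    v[index] = 1
    return v

# Classical behaviour embeds directly: CNOT on |10> flips the
# target qubit, giving |11>.
print(np.abs(CNOT @ basis(2, 0b10)) ** 2)  # [0. 0. 0. 1.]

# Non-classical behaviour: Hadamard on the first qubit, then CNOT,
# produces a Bell state -- correlations with no classical analogue.
state = CNOT @ np.kron(H, np.eye(2)) @ basis(2, 0b00)
print(np.abs(state) ** 2)                  # [0.5 0.  0.  0.5]
```

The first print shows a classical computation running unchanged inside the quantum formalism; the second shows an operation a classical bit vector simply cannot represent.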
For the canonical example of how this can significantly change computation as we know it, see Shor's algorithm, which speeds up integer factorisation from sub-exponential time to polynomial time, and thus could potentially break RSA encryption if we had a practical quantum computer: http://en.wikipedia.org/wiki/Shors_algorithm
Aside from that, I'm sure people much smarter than I am could go on in technical depth about how, despite not really being an advancement of mythical proportions in and of itself, this new class of algorithms would be able to revolutionise areas like search, machine learning, physical/biological/chemical simulation, and so on.
Don't know what you mean by "properly doped". The biggest problem with diamond is that it is a bitch to make n-type. In fact, the conduction band is above the vacuum level!
It is why people try to use diamond as a thermionic emitter: electrons in the conduction band naturally try to leave the material.
Maybe phosphorus could be added among the carbon molecules in the diamond, similar to how silicon is doped with phosphorus to make a better semiconductor.
Maybe nitrogen or oxygen would be a better choice, for the similarity in molecule size - they wouldn't disrupt the diamond's carbon lattice structure as much.
> Maybe nitrogen or oxygen would be a better choice, for the similarity in molecule size
[Typo: Dopants are atoms not molecules.]
Many Group V atoms have been tried. See e.g. http://phycomp.technion.ac.il/~david/thesis/node17.html