This article gets the paper totally wrong. Learning that the uncertainty principle gives rise to phenomena that popular science calls "wave-particle" duality was taught to me in quantum 101.
From the abstract: "This idea...upper bounds the sum of the interference visibility and the path distinguishability. Such wave-particle duality relations (WPDRs) are often thought to be conceptually inequivalent to Heisenberg's uncertainty principle, although this has been debated."
I'm quite surprised that anyone would think that those relationships wouldn't be direct consequences of uncertainty. They are directly analogous to position and momenta quantities.
"Here we show that WPDRs correspond precisely to a modern formulation of the uncertainty principle"
> Learning that the uncertainty principle gives rise to phenomena that popular science calls "wave-particle" duality was taught to me in quantum 101.
Agreed, just to expand on that: The hand-wavey relationship between wave-particle duality and the uncertainty principle was taught to me like this...
Imagine you have a sine wave. It has a well-defined momentum, p, given by its wavelength. However, since it's infinitely periodic in space, it has no well-defined position, x.
Now consider a wave-packet[1]. This has a very well-defined position, given by the centre of the packet, but no well-defined wavelength/momentum.
These are the limiting cases. The Heisenberg uncertainty principle, ΔpΔx >= ħ/2, just quantifies this relationship. If you try and measure x, the wavefunction ends up more 'packet-like' and you can no longer measure p as well, and if you try and measure p, the wavefunction ends up more 'wave-like' and you can no longer measure x as well. That's all there is to it.
It's not exactly rigorous, but I don't think this explanation is too misleading either (though I'd be happy to learn otherwise!).
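The hand-wavey picture above is just the Fourier uncertainty relation in disguise, and it's easy to check numerically. A minimal sketch (my own illustration, not from the paper; units with ħ = 1 so momentum equals wavenumber): build a Gaussian wave packet on a grid, FFT it, and verify that the product of the position and momentum spreads comes out at the minimum value 1/2.

```python
import numpy as np

# Gaussian wave packet on a grid; units with hbar = 1, so p = k
N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
sigma = 1.7
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)          # normalize

def spread(grid, prob, step):
    """Standard deviation of a sampled probability density."""
    mean = np.sum(grid * prob) * step
    return np.sqrt(np.sum((grid - mean)**2 * prob) * step)

delta_x = spread(x, np.abs(psi)**2, dx)

# momentum-space amplitudes via FFT (phases don't matter for |phi|^2)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dk = 2 * np.pi / L
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dk)          # normalize
delta_k = spread(k, np.abs(phi)**2, dk)

product = delta_x * delta_k   # minimum-uncertainty packet: = 1/2
```

Shrink `sigma` and `delta_x` shrinks while `delta_k` grows, which is exactly the 'packet-like' vs 'wave-like' tradeoff described above.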
> This article gets the paper totally wrong. Learning that the uncertainty principle gives rise to phenomena that popular science calls "wave-particle" duality was taught to me in quantum 101.
This is how it was taught to my peers and me as well. I do not remember ever reading about any debate or uncertainty on the issue either.
Yeah, they're not taking a stand so much as introducing the reader to the fact there is a disagreement. You'd have to chase down the original sources to get details.
My assertion is that the roughly one page of text contains no introduction to any difference or connection between wave-particle duality and the uncertainty principle. So there are no sources to track down.
I'm only a pop-physics fan (well, I did get a physics minor 20 years ago) but the fact that these were two sides of the same coin was always clear even to me.
I always wonder if these sorts of misinterpretations are intentional hyping on the part of the scientists for naive science writers, or whether the reporters are just that confused about things. Reminds me of the neverending stream of stories about how biologists have "overturned Darwin" by uncovering details that prove just how amazingly right Darwin was.
Perhaps I was mistaken, but I thought this was sort of a necessary revelation when learning about QM. It's been years since I took Atomic Physics, so perhaps this is something much ... deeper?
Was it a serious idea that upon measurement the particle literally lost its wave properties? Like, were there actually two separate approaches to the math? I know we made some efforts to simplify in class, approximating things as particles due to the drastically simplified math, but we all knew that was happening; the professor was reasonably explicit about it. EDIT: As in, I remember us going over the evolution of a waveform, how boundaries affect the solutions, and how uncertainty causes the particles to have field distributions that just happen to be the wave solution of the particle.
I'm completely serious about being confused here: QM is super easy to misunderstand, and I'd love to feel the eureka this article is trying to convey.
My quick read suggests that the authors are restating the relationship using their "Entropic View" notation which, as it turns out, has the same expression if you come at it from the uncertainty principle side or the wave equation side.
That is nice from the point of view of validating their notation but it doesn't give us anything we didn't already know.
Right. At least at Cornell (David Muller's quantum class) we were basically taught that wave-particle duality was the uncertainty principle from our earliest undergraduate experience.
Basically, the very meaning of de Broglie's law (a particle with momentum p has a wavelength L = h / p) specifies a "particle wavelength", but that wavelength can only exist within that particle's position-uncertainty. Shrink that uncertainty and you necessarily have a small L to fit within there, which means a very high p, which means that little fluctuations e.g. in the direction of p translate to large uncertainties about p. About a quarter-way into the course it is revealed that de Broglie's law essentially identifies p-space with k-space (that is, the Fourier conjugate to x-space), and you then prove that all Fourier conjugate pairs have an uncertainty principle, quite independent of quantum mechanics. So the earliest statement of wave-particle duality already implies the uncertainty principle and this has been known for a long time.
I haven't yet read the actual papers linked to by the article, but I'm confused about what the big breakthrough is here; the article is very light on specifics.
The generalized uncertainty principle is nothing but an application of the Schwarz inequality to two non-commuting operators. Specifically, if the commutator [A,B] = iC for operators A and B (C can be an operator or a scalar; the i = sqrt(-1) preserves Hermiticity), then the product of uncertainties satisfies Delta A x Delta B >= |<C>|/2, where Delta A = sqrt(<A^2> - <A>^2), <A> is the expectation value of A for a given wave function, and [A,B] = AB - BA.
The wave function itself is the fundamental driver behind wave-particle duality, and the generalized uncertainty principle just follows from the mathematics downstream of it. I don't see why the article (not the papers, though) is saying they're two unrelated things.
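The Robertson bound above is easy to check numerically. A small sketch (my own, not from the paper) using spin-1/2 operators, where [Sx, Sy] = iSz plays the role of [A, B] = iC:

```python
import numpy as np

# spin-1/2 operators with hbar = 1; [Sx, Sy] = i*Sz, so C = Sz here
Sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
Sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def expval(op, psi):
    return np.real(psi.conj() @ op @ psi)

def spread(op, psi):
    """Delta A = sqrt(<A^2> - <A>^2), clamped against rounding error."""
    return np.sqrt(max(expval(op @ op, psi) - expval(op, psi) ** 2, 0.0))

# Robertson bound: Delta Sx * Delta Sy >= |<Sz>| / 2 for every state
rng = np.random.default_rng(0)
for _ in range(500):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    assert spread(Sx, psi) * spread(Sy, psi) >= 0.5 * abs(expval(Sz, psi)) - 1e-12
```

For the spin-up state along z the bound is saturated: both sides equal 1/4.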
FYI, this brings back annoying memories from one of my midterm exams in grad level quantum mechanics. Five questions to solve in one hour. And one question alone was to derive the generalized uncertainty relation from first principles.
Ok, cool. That explains nicely the feeling I was getting.
(And your note reminds me of some of the QM and modern physics tests I had as well - I didn't take grad work, but I still remember thinking, "Man, we spent hours deriving some of the stuff, and I've got... less than 10 minutes. Whelp...")
In particular, they have a description of why they find the traditional connection between uncertainty and wave-particle duality to be unsatisfactory. Unfortunately the technical details are beyond me, but hopefully someone else can sort them out?
Edit: Okay, some more digging suggests the following story:
(i) We have wave-particle duality => Heisenberg uncertainty, but not the other way around
(ii) With entropic uncertainty [1] (stronger than Heisenberg uncertainty), the authors show that we can obtain wave-particle duality
Again, can somebody more knowledgeable clarify? :)
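For what it's worth, the entropic uncertainty relation in (ii) is usually stated as the Maassen-Uffink bound H(X) + H(Z) >= -2 log2 c, where c is the largest overlap between eigenstates of the two measurements. A toy check (my own sketch, not from the paper) for a qubit measured in the computational and Hadamard bases, where the bound works out to 1 bit:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 1e-15]
    return -np.sum(p * np.log2(p))

# measurement bases: computational (Z) and Hadamard (X); columns are eigenstates
Zbasis = np.eye(2, dtype=complex)
Xbasis = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

c = np.max(np.abs(Zbasis.conj().T @ Xbasis))  # max overlap = 1/sqrt(2)
bound = -2 * np.log2(c)                       # = 1 bit

rng = np.random.default_rng(1)
for _ in range(500):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    pZ = np.abs(Zbasis.conj().T @ psi) ** 2   # outcome distribution, Z measurement
    pX = np.abs(Xbasis.conj().T @ psi) ** 2   # outcome distribution, X measurement
    assert shannon(pZ) + shannon(pX) >= bound - 1e-9
```

The state |0> saturates the bound: it is perfectly predictable in Z (entropy 0) and maximally uncertain in X (entropy 1 bit).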
Yes, agree that log() wouldn't result from a linear operator.
The idea behind my question: does the Shannon entropic integral correspond to the L2 length of some projected state? Leading to a prescription to prepare a state with minimum uncertainty product of canonically conjugate physical quantities.
Afaik, in classical statistical physics the log() shows up when the N! in a binomial probability distribution limits to large N via the Stirling approximation. It would be interesting to find a different route for log() to enter the picture from a quantum standpoint.
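Just to illustrate that classical route: with the 0.5·ln(2πN) correction term included, Stirling's approximation to ln(N!) is already accurate at modest N, and the relative error falls off quickly (a quick numerical sketch, nothing deep):

```python
import math

# ln(N!) compared to Stirling: N ln N - N + 0.5 ln(2*pi*N)
def stirling_rel_err(N):
    exact = math.lgamma(N + 1)  # ln(N!) without overflow
    approx = N * math.log(N) - N + 0.5 * math.log(2 * math.pi * N)
    return abs(exact - approx) / exact

errs = [stirling_rel_err(N) for N in (10, 100, 1000)]
# absolute error shrinks roughly like 1/(12*N)
```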
That equivalence was explicitly stated in the book I learned QM from. The article reads too much like "uninformed journalist writing about physics", but the paper title suggests the journalist isn't misreporting it.
Anyway, I get an error when I click on the paper. I still expect it to contain something else, besides this article's contents.
The mythical "wave or particle" interpretation of wave-particle duality is oddly persistent; it is pervasive not only in popular science, but I've heard it repeated by serious physicists as well, even though it's unsupported by the very math they work on.
It's not clear to me what you think the "wave or particle" interpretation is, nor what you think is wrong with it, nor what you think you've heard "serious physicists" say.
I remember something similar in my qchem class, that uncertainty basically falls out of the Fourier transform (functional equivalent of vectorial commutativity problem), and so the particle (center of probability of the waveform) automatically inherited uncertainty from the wave equation.
> However, we never observe collapse to a single eigenstate of a continuous-spectrum operator (e.g. position, momentum, or a scattering Hamiltonian), because such eigenfunctions are non-normalizable. In these cases, the wave function will partially collapse to a linear combination of "close" eigenstates (necessarily involving a spread in eigenvalues) that embodies the imprecision of the measurement apparatus.
I suppose I just took away that the particle 'point' idea is likely just an asymptotic approximation of what the thing could be modeled as, but in reality - deep down at the atomic scales - it's really a 'linear combination of "close" eigenstates'.
The key word there is continuous-spectrum. A free electron can have any energy, and therefore the energy eigenstates are not physically realizable, due to the normalization requirements. But a bound electron, like in a quantum harmonic oscillator, has quantized energies and therefore can assume those eigenstates exactly. Think about electron orbitals in molecules: very precise energies.
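The quantized case is easy to see numerically: discretize the harmonic oscillator Hamiltonian H = -½ d²/dx² + ½ x² (a sketch of my own, in units ħ = m = ω = 1) and the lowest eigenvalues land right on the exact values n + ½:

```python
import numpy as np

# finite-difference quantum harmonic oscillator, hbar = m = omega = 1
# H = -0.5 d^2/dx^2 + 0.5 x^2  has exact eigenvalues E_n = n + 1/2
N, L = 1000, 20.0
x = np.linspace(-L / 2, L / 2, N)
h = x[1] - x[0]

# tridiagonal Hamiltonian: second-difference kinetic term + diagonal potential
diag = 1.0 / h**2 + 0.5 * x**2
off = np.full(N - 1, -0.5 / h**2)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

energies = np.linalg.eigvalsh(H)[:5]   # lowest five bound-state energies
```

Those discrete, evenly spaced levels are exactly the "very precise energies" of a bound electron; a free particle on the same grid would instead show a continuum as L grows.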
Speculation: the person who wrote that has never heard of the Dirac delta function? Regardless of what happens in reality, the Dirac delta is precisely a normalized point-like eigenstate (infinite height, zero width, unit area.)
Reminds me now of the wonder I felt when, after a few classes that had us doing integrals over various tedious geometries for EM fields, we came to QM - the really hard course - and just popped a Dirac delta in to trivialize some integrals.
It's like suddenly being able to say, "Enough! Just tell me the value right there!" Which turns out to reduce to your friendly neighborhood point particle model. (Pretty much by design, if I'm not forgetting my history correctly - it's sort of a made up tool that sort of abuses notation, like the Heaviside step function.)
But deep down I never trusted that a particle was actually a basically zero dimensional-like point, though. It was just useful to think of it as one.
I love reading about these things. It seems like information is slowly being raised to the same level as energy and matter (All being different ways to look at the same phenomenon)
The universe does care about preserving information, and there are physical limits on its propagation and consumption. This is similar to formalizing energy as a concept, which can manifest itself in many different ways (kinetic, potential, radiation, etc.). Once we wrapped it up in a concept we were able to reason about it in a more abstract way. The same process is now happening with the concept of information. This is leading to breakthroughs in computing (ML, pattern recognition, etc.) and physics. I am excited for the future of information theory.
> It seems like information is slowly being raised to the same level as energy and matter
Indeed, the as-yet unsolved black hole information paradox [1] deals with conservation of information. Since the discovery that Hawking radiation [2] carries no information, theoretical physicists have been looking for ways to reconcile it with the 'no-hair theorem' [3], which claims no discernible information is ever emitted from beyond the event horizon, yet information apparently is put into the black hole as more matter falls into it.
Totally with you. I'm currently fascinated by cryptography and the encoding of value. It seems to me that a lot of what we do is in some way encoding value, from a less scarce form to a more scarce form, and the highest encoding of value is to be found in money. Bitcoin, for example, is a conversion between a lot of energy input and a little bit of information output; I wonder what different ways exist to reason about this exchange.
I recently posted a writeup of an idea I had kicking around for a while, which also has some fun similarities re many worlds interpretation. Kind of makes one wish one was a quantum physicist :).
You very well may be one, if it gets that far. I built this: https://www.stackmonkey.com/. Check out the demos and then think about your stuff on top of that. That's what I've been thinking about recently.
Take a look at David Deutsch's work on quantum mechanics, information theory, quantum computing, etc. I especially like http://arxiv.org/abs/1405.5563
Cool stuff.
(Related, possibly, is the concept that QM is simply probability with ranges from [-1,1] instead of the traditional [0,1]: Negative probabilities imply things that happen that we cannot measure. Or something like that, it's been a while.)
(Now if only I could find that wicked cool graphic showing the hierarchy of mathematics (ZFC or something near the bottom) and physics (QM and GR, sitting atop currently irreconcilable branches of math)....)
I'm no theoretical physicist, but I thought wave-particle duality was solved a long time ago by quantum field theory. Feynman, for instance, was able to formulate electromagnetism fully in terms of particles, using his path integral formulation. Since this formulation computes probability amplitudes, it seems to fit well with the uncertainty principle. This approach gave rise to QED, the most accurate physical theory ever developed.
In other words, without more detail, and not being a physicist myself, I can't tell what's actually new here.
Before:

* The wave-particle duality had its own set of formulas describing the various relationships between the wave and particle.
* There were also separate existing formulas on information uncertainty.
Now:
* Wave-particle duality can be entirely expressed within the already existing framework of information uncertainty.
What this means:
Usually, the realization that two mathematical concepts are actually equivalent leads to a better understanding of both. Concepts that were known for one can now be immediately applied to the other.
EDIT: I make no claim as to the novelty of this. This is just what I understood from the article.
You are right, quantum field theory is perfectly capable of explaining both phenomena (though not in terms of particles but rather in terms of discrete field modes). The idea that it is connected to the uncertainty principle is even older.
Reading this article made me kind of angry, as it implies that wave-particle duality was not understood until now. Like here:
> "has proved that two peculiar features of the quantum world previously considered distinct are different manifestations of the same thing"
I frequently encounter this mysterious yet-to-be-understood duality in esoteric discussions (with all kinds of wrong conclusions, of course). I think that ultimately, whenever we teach someone about the paradoxical behavior of particles/waves, the very next sentence should be "but this is perfectly understood and not paradoxical at all". While we are at it, instead of saying
> "this is what happens until you sneak a look at which slit a particle goes through"
we should say, "this is what happens unless you bombard the particle with photons, which changes everything, of course".
I'm finding it pretty hard to parse. It seems to be a reformulation of what we already know about non-commuting observables in terms of entropy/ignorance. I might need to read some of the cited papers to understand exactly if that's significant and why.
I know that many people feel that you have to write like this to get anyone to pay attention (maybe they're right), but after enough years in physics, the breathlessness of this kind of reporting really starts to grate.
> international team of researchers has proved
> made the breakthrough
> discovered the 'Rosetta Stone'
I haven't read this particular article carefully, but I think it's safe to assume that words like "reformulated," "clarified," "extended," or "embedded in a new framework" would be more appropriate. Which is fine. New and better explanations are useful and important.
But despite what you may read, we don't find a new Rosetta Stone for quantum physics every year.
I get much more excited about science writing where the reader can come away with a better understanding of an actual concept, rather than just a sense that someone somewhere is doing something hard and smart.
To put it another way, I think you are much more likely to absorb powerful ways of thinking while being entertained if you read https://what-if.xkcd.com/ than if you read http://phys.org
What If shows by example that you can apply quantitative reasoning to the most outlandish situations and actually draw interesting distinctions and conclusions.
Phys.org style reporting urges you to accept by argument from authority that the universe is mostly made of dark energy, and gravity is made out of particles traveling backwards in time through 11 dimensions, or whatever it was I read last month - I can't quite remember. These ideas come from actual serious research, but they aren't powerful when presented this way, because you can't apply this style of reasoning successfully unless you have tons and tons of training in the specific areas they report on.
Feynman on the "work" of gravity theorists: "something correct that is obvious and self-evident, but worked out by a long and difficult analysis, and presented as an important discovery"
To be fair, this case is the fault of the journalist(s), not necessarily the research itself.
> The particles pile up behind the slits not in two heaps as classical objects would, but in a stripy pattern like you'd expect for waves interfering. At least this is what happens until you sneak a look at which slit a particle goes through - do that and the interference pattern vanishes.
Could any of the physicists here verify whether it is the case that detecting which slit the particle passes through destroys the interference pattern? I had the impression that the path of the photons could be determined without destroying the pattern.
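A toy model makes the tradeoff quantitative (my own sketch, not from the paper or article): entangle the path qubit with a which-path detector whose two states overlap by c. The fringe visibility V and path distinguishability D then satisfy V² + D² = 1 for this pure-state setup (Englert's duality relation gives <= 1 in general), so full which-path information (c = 0) leaves no fringes at all, while partial information only partially washes them out.

```python
import numpy as np

def duality(c):
    """Path qubit entangled with a detector whose states overlap by c (real, 0 <= c <= 1)."""
    # detector states |d1>, |d2> with <d1|d2> = c
    d1 = np.array([1.0, 0.0])
    d2 = np.array([c, np.sqrt(1 - c**2)])
    # |psi> = (|1>|d1> + |2>|d2>)/sqrt(2) on path (x) detector space
    psi = (np.kron([1, 0], d1) + np.kron([0, 1], d2)) / np.sqrt(2)
    rho = np.outer(psi, psi.conj())
    # reduced path state: trace out the detector
    rho_path = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
    V = 2 * abs(rho_path[0, 1])   # fringe visibility, set by the coherence
    D = np.sqrt(1 - c**2)         # optimal path distinguishability (Helstrom)
    return V, D

for c in (0.0, 0.25, 0.5, 0.75, 1.0):
    V, D = duality(c)
    assert abs(V**2 + D**2 - 1) < 1e-12
```

So "sneaking a look" is really just letting the detector states become orthogonal, which kills the off-diagonal coherence of the path state and with it the fringes.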
Tangential comment: if you are interested in quantum mechanics then you may be interested to read about "Bohmian Mechanics".
I became interested in Bohmian Mechanics after reading an email exchange between Sheldon Goldstein & Steven Weinberg [1]. It contains a few quite entertaining quotes, in particular:
> And since Bohm’s equations make exactly the same
> predictions as those of ordinary quantum mechanics,
> it is not clear what is accomplished by adding the
> complication of guiding waves, except to restore a
> sense of sanity to the whole affair.
Scott Aaronson has a brief introduction to Bohmian Mechanics at the end of his notes on decoherence and hidden variables [2]:
> Again, the amazing thing about this theory is that it's
> deterministic: specify the "actual" positions of all the
> particles in the universe at any one time, and you've specified
> their "actual" positions at all earlier and later times.
Scott's perspective is particularly interesting because he points out some limitations of Bohmian Mechanics. To paraphrase badly, the deterministic particle trajectories obtained from Bohmian Mechanics rely upon dealing with actual particles, position, momentum, in infinite dimensional spaces. It doesn't give you deterministic behaviour in the finite dimensional spaces that computer scientists prefer.
Some proponents of Bohmian Mechanics point out that the theoretical predictions of deterministic particle trajectories for the famous double-slit experiment agree with recent experimental results.
Theoretical predictions: see Figure 1, page 14 of [3], which is an adaptation of a figure from [4].
Experimental results: see Figure 3, page 1171 of [5].
Note that [5] carefully frame their experimental results as "Using weak measurements, however, it is possible to operationally define a set of trajectories for an ensemble of quantum particles".
For further reading, please see [6] for links to introductory writing about Bohmian Mechanics.
I don't find Bohmian mechanics any more satisfying than regular QM. In an effort to formulate a 'hidden state' that makes QM make.. sense.. you add a guiding wave equation that depends on the state of the universe? That's not better at all.
I'm all for trying to formulate QM in a way that doesn't offend basic sensibility, but I'm not convinced this is it. I don't think deterministic-but-nonlocal is better than nondeterministic-but-local.
Standard quantum mechanics only has the wave function. The thing you object to in Bohmian mechanics is the wave function. Both are nonlocal. You should object to both.
In particular, standard quantum mechanics gets results of experiments by collapsing the wave function. That is, it takes something which is universe-spanning and collapses it to something else, with the collapse based on what a physicist does in a laboratory. That, of course, should seem rather absurd. To get around that, you have to add something: particle positions, a matter density, flashes, etc. A theory needs to produce results of experiments; without something added (and collapse-upon-measurement means adding a collapse mechanism plus external experiments, whatever that means), you do not get results, and the theory is a complete failure.
Only many worlds is not non-local (it adds a matter density by integrating the wave function appropriately), but they escape it by not having results of experiments at the time that history records there were results. I think. That theory makes me a little dizzy at times.
Also, the hidden state of BM is not hidden. We have a basic sense that stuff has positions. That may be an illusion, but Bohmian mechanics says that we can choose a theory that does not have that be an illusion. The thing that is hidden, that is deduced from the behavior of stuff which we see, is the wave function.
What BM explains is why it seems that our world is particle-like and why we have singular outcomes to experiments. All done naturally. Sure, the nonlocal nature is a problem that we are still trying to understand, but so what? No one has got the full story yet and so BM is viable and gives a great intuition about QM (the standard QM (operators, collapse) is a derivable formalism, like thermodynamics from Newtonian Mechanics).