Every now and then people (on HN and elsewhere) like to rediscover pilot-wave theory and other similar ideas that posit that billiard-ball-like particles still describe the universe. In these interpretations the wavefunction, which is the probabilistic field that modern physics uses to model particles, is just an effective representation of our knowledge of the system. I tend to see these theories as paying the price of a drastic increase in complexity to retain classical intuition about the system.
The mainstream view in modern physics (for the past 30 or so years, I believe) is rather that the wavefunction is the fundamental description of reality, at least in the sense that classical effective knowledge theories are not plausible models.
My interpretation of this paper, at a quick glance, is that it attempts to experimentally demonstrate that only effective knowledge theories with very strange properties could explain observations, thus strengthening the "wavefunction is fundamental" viewpoint.
I'm an enthusiast at best, but I don't think that's quite right. The dominant interpretations that treat the wavefunction as made of "probability stuff" are surely the statistical/ensemble interpretations, or even "shut up and calculate." Billiard-ball theories like de Broglie-Bohm are definitely a minority viewpoint, and not representative of those that deny ontological status to the wavefunction.
We might separate interpretations according to their ontology:
1. de Broglie-Bohm: The particle has a real (though hidden) configuration.
2. Many Worlds: The wavefunction is real and fundamental.
3. Ensemble/Consistent histories/Copenhagen/"Shut up and calculate": Nature is fundamentally probabilistic. Measurements have ontological status, and talking about what happens between measurements is suspect (in various ways, depending on the theory).
From what I can tell most physicists fall into category #3.
To be clear, my statement about the wavefunction being probabilistic is intended as a Copenhagen reference. My impression is that this paper intends to refute (1). Personally I think (2) and (3) are pretty much equivalent, as we don't have any reason to think we could run an experiment that would distinguish between universe splitting and wavefunction collapse.
My point of disagreement is giving the mainstream view as "the wavefunction is the fundamental description of reality". I claim that the mainstream view, as seen in type 3 theories, treats the wavefunction as a mathematical object, not a fundamental (i.e. physical) one.
For example, Griffiths (who introduced consistent histories) describes the wave function as "a mere mathematical tool or pre-probability for calculating probabilities which can, if one wants, be obtained using different solutions to Schrödinger’s equation." He goes on to contrast this to MWI and Bohmian mechanics, which treat the wave function as fundamental, are both deterministic, and are minority viewpoints. [1]
I would tend to agree -- the way I always learned quantum mechanics was that the wavefunction was merely a representation that happens to give the right answers when operated on by the Hamiltonian.
In much the same way, I don't think particle physicists really believe that matter is made of tiny strings.
I might actually buy a copy of this paper, I'd be curious to see what new experiments they have done to more closely demonstrate the physicality of the wavefunction.
> From what I can tell most physicists fall into category #3.
Most 'media physicists' fall into category #3. My understanding is that most 'practicing physicists', by association, fall into category #1 (QFT).
What do you think is the best way to learn about these things, i.e. physics fundamentals and the modern state of knowledge?
I ask from the perspective of a B.S. Physics cum computer software person... which, I think, might not be too far removed from the perspectives of most HN peeps...
Understanding current thought in the field for me is based on skimming many academic papers. That's kind of different, though.
Getting a solid understanding of the fundamentals of the various fields is more of a coursework sort of thing. There are a bunch of quantum lectures on MIT's OpenCourseWare - anything by Allan Adams will be great. If you're a book kind of person, I believe I enjoyed Shankar, but it's been a while.
>> Every now and then people (on HN and elsewhere) like to rediscover pilot-wave theory and other similar ideas that posit that billiard-ball-like-particles still describe the universe.
Have you seen the experiments of Yves Couder?
Hypothetically, if a model like that produced an exact mathematical equivalent of quantum theory, wouldn't that be more satisfying? It would also destroy some of the notions that have been mathematically proven - like EPR and Bell's inequality. A lot of people are unsatisfied with QM and looking for something more tangible - including physicists.
Apparently pilot-wave theory was never soundly rejected; the crowd just went in a different direction.
> A lot of people are unsatisfied with QM and looking for something more tangible - including physicists.
While I would also prefer that quantum mechanics turns out to be quite classical at its heart - with objective reality, locality and all that - it seems a pretty bad idea to expect or even assume that nature works the way we prefer it to and that it will be easily understandable for human brains. Actually, the longer we struggle to make sense of all of that, the more likely it seems to me that everything is actually radically different from the way we have looked at it for the past century.
>Hypothetically, if a model like that produced an exact mathematical equivalent of quantum theory, wouldn't that be more satisfying? It would also destroy some of the notions that have been mathematically proven - like EPR and Bell's inequality.
Destroying Bell's inequality would be the key. It would be very satisfying - I'd say Nobel Prize and forever-in-the-history-books satisfying. Unfortunately (I'm on the EPR side myself, and came to something like pilot-wave theory on my own while trying to explain the double slit), there is no reasonable explanation for the observed Bell violations except the one given by QM (which to a "realist" like me sounds just like "will of God spreading FTL" :)
Note: I actually haven't been able to find a good (by my criteria) verification of Malus's law for single photons - the law is the basis of the Bell-violation claims in photon polarization experiments. A slight modification of the law on top of Einstein's proposed modification - with the resulting law being pretty reasonable for the single-photon situation while still preserving the classical law in the case of a classical light ray - actually produces the Bell violation in the classic "local realism" explanation. So I still hope a bit :)
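For concreteness, here's a minimal Monte Carlo sketch of the textbook single-photon Malus law - i.e. the standard claim I'd like to see verified, not the modified law I described above. The function name and sample size are just illustrative:

```python
# Sketch of the standard per-photon Malus law: each photon passes a
# polarizer offset by theta with probability cos^2(theta). Averaged over
# many photons this recovers the classical intensity law I = I0*cos^2(theta).
import math
import random

def transmitted_fraction(theta, n_photons=100_000):
    """Fraction of single photons passing a polarizer at relative angle theta."""
    p_pass = math.cos(theta) ** 2  # standard per-photon probability
    return sum(random.random() < p_pass for _ in range(n_photons)) / n_photons

for deg in (0, 30, 45, 60, 90):
    theta = math.radians(deg)
    print(f"{deg:2d} deg: simulated {transmitted_fraction(theta):.3f}, "
          f"cos^2 = {math.cos(theta) ** 2:.3f}")
```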
Bell's inequality is, as you say, mathematically proven - and its violation by quantum systems is experimentally verified. Destroying it is about as likely as finding out that 2+2=5.
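To make that concrete, here's a minimal sketch of the gap Bell's theorem opens up; the angles are the standard CHSH choices and the variable names are just illustrative. Any local assignment of ±1 outcomes caps the CHSH quantity at 2, while the QM correlation for polarization-entangled photons reaches 2√2:

```python
# Local-realist side: outcomes are preassigned +/-1 for every setting, so
# S = A(a)B(b) - A(a)B(b') + A(a')B(b) + A(a')B(b') can never exceed 2.
import itertools
import math

local_bound = max(
    a * b - a * bp + ap * b + ap * bp
    for a, ap, b, bp in itertools.product((-1, 1), repeat=4)
)

# Quantum side: for polarization-entangled photons, E(a, b) = cos(2(a - b)).
def E(a, b):
    return math.cos(2 * (a - b))

# Standard CHSH angles: 0, 45, 22.5, 67.5 degrees (in radians).
a, ap, b, bp = 0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8
S_qm = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

print("local-realist bound:", local_bound)  # 2
print(f"QM prediction: {S_qm:.4f}")         # 2.8284 = 2*sqrt(2)
```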
Pilot wave theory produces the same predictions (as do all interpretations of QM), so in some sense it's valid, but it's an ugly attempt to brush the fundamentals under the carpet; it makes some simple cases simpler, but it makes the really interesting QM harder. It's not wrong, in the same way that constructing Newton's laws in a rotating frame of reference is not wrong - but it's less pleasant to work with for most interesting problems, and obfuscates the physical reality.
Yves Couder has demonstrated that you can construct a physical system that obeys the mathematics of pilot wave theory. This has zero bearing on whether pilot wave theory is an accurate description of subatomic physics.
>All research papers from Nature will be made free to read in a proprietary screen-view format that can be annotated but not copied, printed or downloaded...
>Annette Thomas, chief executive of Macmillan Science and Education, says that under the policy, subscribers can share any paper they have access to through a link to a read-only version of the paper’s PDF that can be viewed through a web browser.
The person who linked the original article didn't use the special link to the "free, but proprietary" format.
I don't think that's quite right. arXiv is a third-party site where academics post papers to get informal peer review outside of journals. My guess is that any edits made based on feedback from Nature's author network as well as the actual paper in the Nature format are under Nature's copyright.
But yes, my understanding is that anybody with institutional access to Nature can now share a link to the Nature publication.
> My guess is that any edits made based on feedback from Nature's author network as well as the actual paper in the Nature format are under Nature's copyright.
Half correct. The authors retain copyright for changes due to feedback (peer review), as the authors themselves would have modified the paper to address any feedback. You are correct that the actual paper in the Nature formatting (and possibly the associated copy-editing) is under Nature's copyright.
arXiv shows that the most recent revision of their document was published on January 20, 2015 (less than three weeks ago).
I suppose it is possible that Nature's version is substantially different, but if that were the case it would cast a huge shadow of doubt over Nature's editorial process, as well as the claims made in the paper itself.
Peer review and getting an article published takes at least a month, if not half a year or longer. The preprint on arXiv is most likely the final version of the paper they submitted for publication, post peer review.
The epistemic view of the wavefunction explains nothing, because it attempts to put all the quantum weirdness into "our knowledge of reality" rather than "reality".
The whole argument seems silly to me, because the epistemic view requires that indistinguishable particles are in fact merely undistinguished particles. That is, the epistemic view requires that reality behave quantum mechanically when no one happens to be looking at it, whereas what we observe is that reality behaves quantum mechanically when no one can look at it.
These are completely different situations. In one case it is what we do know that matters--which is an epistemic state--in the other it is what we can know that matters, which is an ontic state. And it turns out that reality does limit what we can know, and this has dramatic consequences.
The clearest example of this is the heat capacity of solids. Heat capacity is a measure of how the temperature increases as you add heat energy to an object. The temperature is essentially the average energy per degree of freedom, so the heat capacity is a very direct measure of the number of states the collection of particles constituting a material can be in.
It is very simple to show that particles that cannot be distinguished have a different number of states from particles that can be distinguished, even if they don't happen to be distinguished at the moment. For example, consider two coins: if they can be distinguished they have four possible states, HH, HT, TH and TT. If they can't be distinguished then HT and TH are the same state (because there is no way, even in principle, even if you're god, to distinguish the left coin from the right coin). So there are only three states.
You can do something similar with particles in a crystal lattice and show that they are indistinguishable. So it isn't a question of what we do know, but what we can know, and that is determined by ontology, not epistemology.
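A minimal counting sketch of both cases; the lattice part uses the usual Einstein-solid toy model (oscillators sharing energy quanta) as my illustrative stand-in, and the numbers N and q are arbitrary:

```python
# Counting states for distinguishable vs indistinguishable objects.
from itertools import product
from math import comb

# Two coins, distinguishable: HH, HT, TH, TT are all different.
distinct = set(product("HT", repeat=2))
# Two coins, indistinguishable: HT and TH collapse into one state.
indistinct = {tuple(sorted(s)) for s in product("HT", repeat=2)}
print(len(distinct), len(indistinct))  # 4 3

# Same combinatorics for q energy quanta shared by N lattice oscillators
# (an Einstein solid): indistinguishable quanta give the stars-and-bars
# count C(q + N - 1, q); labeled quanta would give N**q arrangements.
N, q = 3, 4
print(comb(q + N - 1, q))  # 15 microstates
print(N ** q)              # 81 if the quanta were labeled
```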
The problem is that the ontology of the wavefunction is non-classical in the extreme. Because it is nonlocal it violates the law of non-contradiction from the perspective of locally causal beings (us). That makes it hard to understand, but it doesn't make it any less real. It just means reality is weird, which at this point in history shouldn't come as a huge surprise to anyone.
If you have not read Jaynes's treatment of the Gibbs paradox, you may enjoy it. He addresses exactly the question of what it means to say we cannot distinguish particles from each other, and how 'cannot' really means 'cannot with our current state of knowledge and tools (including known fields, forces, etc.).' See his analysis of two previously indistinguishable versions of argon which become distinguishable through new tools. I realize that this is in a discussion of classical gas mixing physics, but the principle behind it is the same. So, in short, epistemology does matter if our level of knowledge does not exhaust the ontology.