The article completely ignores quantum field theory, which forms the basis of the standard model of particle physics, and in which particles are emergent (rather than fundamental) features of the respective underlying field, which is described by wave equations.
There are no good analogies I know of. I wouldn't say it's like binary in any way, but I also wouldn't say that everything builds up from binary in CS. There is really no particular relation to (or benefit of having) computer expertise.
OK, well, that makes a little more sense. But the answer still seems to me to be a pretty obvious "no". ECS is an engineering methodology. QFT is a mathematical formalism that combines quantum mechanics and special relativity. It's really apples and oranges.
All of the complications and complexity in QM/QFT come from doing calculations that map to the extremely coarse-grained, boring, low-energy reality we spend all of our lives in. If you ignore that, the underlying math is exceedingly simple to formulate. It's like 2 powerpoint slides to show what CUDA arrays and ops are needed to underpin a quantum field theoretic calculation of quarks and gluons inside a proton, for example.
Unfortunately you're probably not going to be satisfied with that; at some point you need to connect these simulations to real measurable parameters, and this is exceedingly non-trivial :/
Starting from the other end, pure old-school undergrad quantum mechanics is really the easiest way to connect to measurable quantities and this is why it's used in all of engineering, quantum optics etc. It just provides the right amount of detail.
Only if you ignore calculations around quantum collapses, or whatever your favorite interpretation calls those. We don't yet know how to calculate when those will happen, so that is by far the most complex task since it hasn't been solved yet.
I suspect that solving those collapses is key to connecting quantum mechanics to relativity, seems like a much more reasonable place to look than to invent even more fields like string theory does.
Well, there are no experiments that have been done or even suggested that require this. Except maybe the Wigner's Friend thought-experiments? They seem pretty interesting and strike at the center of the evolution-discontinuity issues.
But for all normal practical purposes you don't need to solve that and can ignore it (which is both a blessing and a curse of course)
The article says that particles are neither waves nor particles, which is not wrong (depending on your definition of “wave”), but it omits to mention what the actual fundamental ontology is in modern particle physics.
In some mathematical models, a particle is an excitation of a field, e.g. a mountain is an excitation of a height field. It makes calculations much easier. In those models, waves can be represented in many different ways.
In the real world, a particle can be a lot of different things, but usually we use this word to describe small parts of larger objects. There is not much physical difference between a small grain of sand and a planet, but we use different words for them, while we use the same word for small things that can differ by many orders of magnitude.
In the real world, a wave is just a form of group behavior of particles, i.e. many particles doing the same thing at once because of an interaction between them and some energy in the system. For example, the merging of black holes makes waves in a galaxy, so we can watch that in super slow motion.
Hopfions are promising. The hopfion model explains how waves can create a particle, so we will have come full circle: particles -> waves -> particles -> waves...
To me, Wave/Particle means that we have observations consistent with waves and observations consistent with particles.
I don't think you really have to go beyond that - except that is not how things behave in the macro world. The problem is trying to map Wave/Particles to the macro scale. Just don't do it.
This is exactly what I learned in a philosophy of science course in college.
Not this specific topic, but that approach. Science is a model, and as soon as you're generalizing past experiments and observations, you're outside of where you should be.
I think this needs to be qualified a bit. You can make numerous observations of a single electron over a period of time and see a wave like pattern emerge, or you can take a single observation of numerous electrons and see a wave like pattern emerge, but you can never take a single observation of one electron and see a wave like pattern.
The two slit experiment still works if you emit a single particle at a time. Which means that the wave nature is not an emergent property of a group of particles, but observable in just one.
The two-slit experiment fundamentally requires more than one particle, as the slits themselves are made of matter. While one photon travels through the experiment at a time, there are many particles interacting with and influencing the fields that the photon travels through.
In information-theoretic treatments of physics there are thought experiments about deep space, where one particle may be essentially all alone; in such cases the particle's decoherence becomes quite dramatic.
You can take a single measurement of a single particle and discover it in a location forbidden by classical mechanics such as the shaded area between the two slits.
It's forbidden by classical mechanics when particles are modeled like steel balls, without medium, without electromagnetic properties of the particle, without noise.
For example, we can make sand behave like water by bubbling air through it, but that doesn't break classical mechanics.
> You can make numerous observations of a single electron over a period of time and see a wave like pattern emerge
I thought measurement along the path collapses the wave, removing the wave pattern over the full source to destination that would be displayed if there was no observation? Eg in dual slit if you observe at one of the slits, the pattern disappears. Wave-like behaviour only exists between observations.
All the observations are consistent with waves... that change in behavior when you poke them in any way.
Macroscopic waves also change in behavior when you poke them; this is not mysterious. But in quantum mechanics, the energy of the poke relative to the wave is huge by our macroscopic standards.
One unfortunate effect of presenting it as "wave-particle duality" to a lay audience is that it sounds as if physicists don't have a good understanding of what's going on and are puzzled by the behavior, whereas QFT is well-established.
> This doesn’t mean that the atoms themselves are smeared out like waves; rather, what spreads is the probability distribution of them being found subsequently in a given location
No, it really does mean they're smeared out like waves. Prior to the measurement they are in superposition, relative to you. When your experiment images the atoms, that's a measurement that entangles you with the atoms.
If the observable value takes values in (A,B) then when you get entangled you end up in a state (A, measured A) + (B, measured B), each of which perceives a definite value of the measurement. The whole system would still be (to an outside, non-entangled observer, if you could pull such a thing off) in a superposition which could continue to interfere with itself, but the observer who's inside the superposition will never be able to tell.
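The (A, measured A) + (B, measured B) branching described above can be sketched with two qubits. This is purely illustrative — the basis labels and the CNOT-style "measurement interaction" are my own choice of toy model, not a claim about real measurement dynamics:

```python
import numpy as np

# System S starts in the superposition (|A> + |B>)/sqrt(2); a CNOT-like
# interaction copies the outcome into the observer O, producing the
# entangled state (|A, saw A> + |B, saw B>)/sqrt(2).
ketA, ketB = np.array([1.0, 0.0]), np.array([0.0, 1.0])

system = (ketA + ketB) / np.sqrt(2)   # superposition over outcomes A, B
observer = ketA                        # observer's "ready" state

joint = np.kron(system, observer)      # unentangled joint state
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)  # measurement interaction
entangled = cnot @ joint
# entangled == (|A,A> + |B,B>)/sqrt(2): two branches, each containing a
# definite record, while the joint state remains a superposition.
```

Each branch sees one definite outcome, but no collapse has happened anywhere in the math — which is exactly the point being made above.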
Afaik this is the standard interpretation nowadays. Particles are waves (well, in the sense that what we call a particle is usually a momentum eigenstate that evolves in space like a wave), but the measurement process that entangles us with them causes them to come in quantized packets, which we call particles.
Maybe I'm missing the point of the article somehow though...
I feel like there isn't really, not anymore, and most of the semblance of a debate is people reporting on historical debates. Although hard to be sure without, like, a survey.
At least in what I read it seems like the remaining debate is over how literally to interpret the words "many worlds", rather than whether the basic idea is correct. Ideas like entangling with the experiment and the decoherence of quantum states are very well-established.
> Ideas like entangling with the experiment and the decoherence of quantum states are very well-established.
Yes, but neither of those explains why we observe measurements to have single results. The MWI "explains" that by saying that actually, measurements have all possible results (which is what a straightforward application of the Schrodinger equation says), but we don't experience that, we experience a single result in each branch of the overall entangled wave function.
This "explanation" is treated with skepticism by many because it is completely different from how we treat individual branches of an entangled superposition in any other context besides "measurement". Normally, when two quantum systems are in an entangled superposition, we say that neither of them individually has any well-defined state at all; only the total joint system containing both of them does. We don't say that each branch of the entangled superposition is a different "world" in which each individual system has a well-defined state. The MWI has to claim that somehow when a "measurement" involved we can say that.
> Yes, but neither of those explains why we observe measurements to have single results.
IMO Everett is in fact the only interpretation that explains this quite clearly.
"Why do we observe measurements to have single results?"
"Because when you entangle with (an observable in a superposition), you enter a superposition. I.e., a mapping from states of you to states of (observable)"
People who don't believe it's possible for themselves to enter a superposition should ask themselves whether they believe atoms may, and whether they (themselves) are made of atoms.
> "Why do we observe measurements to have single results?"
"Because when you entangle with (an observable in a superposition), you enter a superposition. I.e., a mapping from states of you to states of (observable)"
This doesn't explain why we observe single results. A single result is not an entangled state. Mathematically, it's just one term of the entangled state: so observing a single result would mean the entangled state would be replaced by just one of its terms. This is what "objective collapse" interpretations say happens. But it's not what the MWI says happens.
> People who don't believe it's possible for themselves to enter a superposition
"Superposition" is not the correct term here; "entanglement" is. "Superposition" is basis dependent. "Entanglement" is not.
The relevant question is not whether people can become entangled. A state in which a measurement has a single result is not entangled. At least, not if you interpret the math the way standard QM normally interprets the math.
Edit: I should add here that this whole discussion is assuming a "realist" interpretation, where the quantum state describes the actual physical state of an individual quantum system (which might consist of subsystems, such as a measured object and the measuring device and the brain of the person reading the result off the measuring device). Not all interpretations are like that; for example, ensemble interpretations or statistical interpretations. In those interpretations the issue we are discussing doesn't even arise and "many worlds" seems like a straightforward confusion of concepts, like thinking that (to use an example from Beyond the Fringe) Venezuela must be all blue because it's colored blue on a map.
There is no real resistance I think; it's just that the various interpretations are not very relevant in practice. All of the engineering that depends on quantum mechanics makes do with the standard treatment that has been taught for 100 years now.
The MWI doesn't offer any other way to calculate anything (as it's an interpretation, not a theory). It's a good mental image of what happens though.
I guess at some point there will be some experiment that can be performed that requires a more advanced theoretical method than what current QM provides and then maybe the MWI (or other interpretations) get a chance.
> There's still a surprising amount of resistance for some reason.
It's not surprising at all; it's a perfectly rational unwillingness to treat our current quantum mechanics (and not even relativistic quantum field theory, but non-relativistic QM, which is already known to be just an approximation) as though it were an exactly complete fundamental theory.
Personally, it leads me to "shut up and calculate". You can use QM to make physical predictions without adopting any interpretation at all. Interpretations are not tools of physics. They're ways for people to tell themselves comforting stories about "what really happens" that have nothing to do with actually using physics to make predictions.
It's because the Copenhagen interpretation comes with the standard method of actually calculating stuff. You might call it a cheap shortcut, but it works (so far).
Please explain. I've only ever seen handwaving in pop-sci books like Sean Carroll's suggesting this (something about the Born rule being the "only possible equation", which seemed somewhat tautological to me), and IIRC this wasn't very encouraging either, because it didn't come with any experiments that could be done to show a difference from the standard methodology — and then you tend to pick the simplest method, which is the one everybody else uses every day.
Would be cool if there were of course. I'm not saying it isn't interesting, I'm just saying I completely understand why most physicists don't bother as it doesn't bring anything new to their table - there are plenty of other practical quantum effects that require research as well.
> what we call a particle is usually a momentum eigenstate
Not really. In non-relativistic QM, it's a wave packet. True momentum eigenstates are not physically realizable.
In quantum field theory, "particle" is just a shorthand name for certain quantum field states. Quantum fields aren't waves or particles, they're quantum fields.
No, the superposition is a real physical thing and not just a probability distribution. The particle is in all parts of the superposition, not just randomly at one of the spots; we can see in experiments that it is in every spot. If you measure it, then the squared amplitude of the superposition gives the probability distribution of where you will find something, but it isn't just a probability.
This is what makes quantum mechanics hard: if it were just a probability distribution, nobody would find it vexing; it would just be normal probability.
To me 'probability distribution' could be interpreted as a bunch of classical particles whose emergent behaviour resembles a wave in some way. Or a series of observations of a classical system whose errors add up to some wave-like phenomenon.
But others in this thread have mentioned that you can interfere a single electron with itself (which to my mind rules out classical-phenomena-which-appear-quantum). And 'superposition' seems like a better word for that.
As an interested amateur, I recommend the book "Something Deeply Hidden: Quantum Worlds and the Emergence of Spacetime" by Sean Carroll as a good overview of quantum fundamentals. The book discusses several interpretations of the reality of matter at the quantum level. Dr. Carroll himself believes that everything is waves/fields at the lowest level, and a many-worlds interpretation of why matter appears to be particles when we observe it, but also discusses de Broglie–Bohm pilot wave theory and spontaneous collapse theory.
Technically the formula for force is F = dp/dt, or the derivative of momentum with respect to time.
For particles with mass the momentum of such a particle is p = mv, and so you can use that to yield F = ma. However for a massless particle like a photon, its momentum is p = E / c. If you use that momentum to describe a beam of photons being absorbed by a material, then you get F = n * E / c, where E is the average energy of the photons, n is the rate of photons per second, and c is the speed of light.
Things that move at the speed of light carry momentum, even though they have zero mass. For something moving at the speed of light, the momentum is the energy divided by c. (You can't have something with mass moving at the speed of light.)
Since the photon has nonzero energy E and is moving at c, it has mass by virtue of E=mc^2. And since E=hv, the E part is determined by the photon's frequency v, so the equivalent amount of mass is as well. It shows up as radiation pressure when the photon hits an object, just as if something tangible had collided with it.
On a sunny day, I'd guess that sunlight exerts about the same force on an acre on the Earth's surface as a postage stamp lying on the ground.
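For what it's worth, that guess is easy to sanity-check using the F = P/c relation for fully absorbed light from the comment above. The irradiance figure is an assumed round number (~1 kW/m² at ground level on a clear day):

```python
S = 1.0e3          # assumed solar irradiance at ground level, W/m^2
A = 4046.86        # one acre, m^2
c = 2.998e8        # speed of light, m/s
g = 9.81           # gravitational acceleration, m/s^2

F = S * A / c      # radiation force if the light is fully absorbed, N
m_equiv = F / g    # mass whose weight equals that force, kg

print(F, m_equiv)  # ~0.0135 N, the weight of roughly 1.4 grams
```

So full sunlight on an acre pushes with about a hundredth of a newton — the weight of a gram or two, i.e. the same ballpark as a stamp-to-paperclip-sized object lying on the ground.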
E=mc^2 doesn't mean that a photon has mass. Photons do produce gravitational effects due to their energy, but not in the same way as a particle with mass m. See, for example, https://physics.stackexchange.com/a/6222
Objects in motion have kinetic energy, and Einstein says mass and energy are equivalent. This means that in a very real sense objects in motion have additional "relativistic mass". When you annihilate that photon, its energy is transferred to whatever absorbed it.
Confused? You're not alone! Physicists are trying to move away from the terms "rest mass" and "relativistic mass" for reasons including one you've already identified: what does it even mean for a photon to be "at rest"?
Something that I've been wondering about is whether the original development of quantum mechanics involved a simple "mixup" due to the duality of the mathematics involved in wave mechanics.
Imagine implementing a QED simulator: some EM source emitting billions of photons, each with a vector clock rotating to indicate the wavelength. You could code this up as an array tracking each photon.
Alternatively, at very large numbers of photons, you'd notice that each pixel on your screen would have so many (maybe millions!) that you could just simulate the aggregate behaviour of each little square patch of space instead of individual particles.
Ta-da... it's a continuum. No particles.
You can simulate waves in space either using a Monte Carlo particle simulation or by subdividing the space into finite elements and tracking exchanges over their boundaries.
Superficially the maths looks different, but the result is the same, and the finite element method has locality and makes the speed of light limit manifest.
Why do we keep insisting on covering only the particle model in textbooks?
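The aggregate view described above is easy to sketch. Here is a toy two-slit version where each path contributes a unit phasor (the "vector clock") and the screen intensity is the squared magnitude of the phasor sum — all geometry numbers are made up for illustration:

```python
import numpy as np

# Toy two-slit phasor sum: each path from a slit to a screen position
# contributes exp(i * 2*pi * path_length / wavelength); the intensity
# at that position is |sum of phasors|^2.
wavelength = 500e-9                 # 500 nm light
slit_sep = 50e-6                    # slit separation, m
screen_dist = 1.0                   # slit-to-screen distance, m
x = np.linspace(-0.02, 0.02, 2001)  # positions along the screen, m

r1 = np.hypot(screen_dist, x - slit_sep / 2)  # path length via slit 1
r2 = np.hypot(screen_dist, x + slit_sep / 2)  # path length via slit 2
amp = (np.exp(2j * np.pi * r1 / wavelength)
       + np.exp(2j * np.pi * r2 / wavelength))
intensity = np.abs(amp) ** 2        # 0 at dark fringes, 4 at bright ones
```

The interference fringes fall straight out of summing phasors per screen position, with no individual particles tracked at all — which is exactly the "ta-da, it's a continuum" observation.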
The article clouds more than it illuminates. I don't think the author knows what he is trying to say. This is common in strawman "mythbusting" articles, but even more fraught when the topic is quantum mechanics.
Saying that an electron "isn't a wave" when it is in motion, because the wave is probability, not the electron, is equivalent to saying the electron doesn't exist between emission and absorption. This is a valid interpretation, but even more counterintuitive to novices, and it raises more questions. Ultimately, arguing over vocabulary as interpretation is a distraction. What the thing does is what the thing is. Interpretations are intuitive guides.
Tl;Dr: "Wave-particle duality" is not the notion that matter is "sometimes" a particle and "sometimes" a wave. It is, at all times, its own separate category of thing, for which "particle" and "wave" are just metaphors that approximate its behavior.
One metaphor usually comes closer than the other depending on what system you're looking at, but it's never changing back and forth between some "particle state" and "wave state".
We can't say, in any satisfying way. The mathematics is uncontroversial, but all of the simple natural-language explanations fail under scrutiny.
Where is the electron in the double slit experiment? Is it a particle or a wave?
Similarly, we can't say. We don't have a good way of talking about this by analogy, or using natural language. As with the bicycle, the mathematics is bulletproof and boring.
This is not to say that quantum mechanics is unmysterious - I think it is very mysterious. However, the bicycle example shows how this characteristic, frustrating elusiveness of good natural-language descriptions is not limited to exotic quantum systems.
I think a significant aspect of this specific problem is that the notion of a "field" is very difficult to translate into everyday language, no matter how easy it is for a physicist to conceptualize and model with math. It's similar to coming up with a good analogy for a hash function in the computer science world. These are largely very foreign concepts for everyday human experience.
The link doesn’t support the notion that we can’t describe why a bike is self-stable. On the contrary, it simply contradicts one popular explanation and offers another one. In fact the story is pretty easy to describe: there are oscillating forces acting on a bike that cause it to right itself. The oscillating forces just aren’t accurately described as gyroscopic, which was one of the previous proposals.
We don’t know that there is no “natural language” story and no analogies for either bikes or quanta, what we know is we have some analogies that don’t work. History is full of instances of things we had incorrect stories for, and then found better ones. We have no reason to believe that won’t happen for photons and electrons too.
Absolutely! This is on my bucket list of home robotics projects. A stretch goal is to get steering during a wheelie, but I think this will be extra difficult. I’m guessing a well tuned PID controller will handle counter-steering just fine, but we’ll see.
This whole article and discussion reminds me of the key idea that physics is a model, and the map is not the territory. Reality is just its own thing; to me it seems pointless to debate whether something is really a wave or a particle or a quubaz. The question should be what insights and predictions you can get. Conversely, just because you can model things with waves/particles/x doesn't mean that they are waves/particles/x.
If you fire a single particle between two slits repeatedly, the cumulative places it hits form a wave-like interference pattern, of course.
If you fire a single particle between two slits just once and it still forms a wave-like interference pattern, then surely what is being observed is more than just a probability distribution?
> just once and it still forms a wave-like interference pattern
That's not a thing. There's no pattern to observe from a single particle.
The situation you're describing doesn't exist.
Wave behavior and interference patterns can only be observed in statistically significant collections of particle interactions. Not single interactions.
Somewhat reductive. Waves behave differently than classical particles, and we can measure this on the level of individual particles even though we can't image the full pattern with a single measurement. For example: a single particle can end up in the middle of a shadow, but this shouldn't be possible classically.
You do have to assume at some point that if you fire individual electrons through the dual-slits and record their individual results, that the results are independent. If you can assume this, you're drawn to the current model which is that the electron is in a superposition (i.e. "interferes with itself") that when resolved gives the observed statistical outcome of your independent results. This seems also to be the simplest model (there has been 100 years of trying to find simpler models that reproduce these results).
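That picture — independent single hits whose accumulated statistics reproduce the interference pattern — is easy to simulate. The fringe shape below is a made-up toy |ψ|², not a real two-slit calculation:

```python
import numpy as np

# Fire "electrons" one at a time: each hit is an independent sample from
# the |psi|^2 distribution of a toy two-slit pattern. Any individual hit
# looks random; only the accumulated histogram shows the fringes.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 400)
prob = np.cos(5 * np.pi * x) ** 2            # toy fringe pattern as |psi|^2
prob /= prob.sum()                           # normalize to a distribution

hits = rng.choice(x, size=100_000, p=prob)   # 100k independent single hits
counts, _ = np.histogram(hits, bins=40, range=(-1.0, 1.0))
# Bins near dark fringes collect far fewer hits than bins near bright ones,
# even though every hit was recorded in isolation.
```

After a handful of samples the screen looks like noise; after many thousands, the fringes are unmistakable — each hit independent, the pattern purely statistical.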
What's the difference between that, and firing single particles periodically? Shouldn't that just be a collection of 'single impacts'? Yet it isn't.
Where did you read about a single particle forming an interference pattern?
“The atoms themselves are only ever observed in a given experimental run as particles – just as quantum mechanics says they should be. The wavelike behaviour – which is to say, the smeared-out probability distribution – is reconstructed from many particle-like observations.”
A question for any physicists on here: is Wave/Particle duality analogous to Lisp/Binary duality? In other words, 2 different languages/models but 1 underlying reality that neither perfectly represents?
> there is no reason to say that quantum entities are ever really waves
I'm on the completely opposite end of the spectrum. I see no reason to say that quantum entities are ever really pointlike particles.
I'd rather see them as smeared wave-like entities that occasionally rapidly reshape while exchanging energy and momentum through fields, as if they were two small billiard balls bouncing.
There's really nothing particle-like about those quantum objects apart from these momentum and energy exchanges (and even that is weird, because it's quantized) and the fact that the evolution of their center of mass over time looks like a thing flying through space according to Newtonian dynamics.
We draw our simplest intuitions from macroscopic objects that are built of a huge number of actual elemental material objects so tightly bound with one another that they are barely smeared.
It's not an accident that macroscopic objects obey the same equations that tightly bound quantum objects obey.
But it's a huge mistake to think that the equations we wrote for this very bizarre state of matter that macroscopic objects are is anything primary, just because the math describing them is as simple as it gets.
Take a look at ideal gas equations about pressure volume and temperature. They are childishly simple when compared to the math you'd need to accurately describe what actually happens in a gas.
Framing quantum mechanics in terms of "observation" instead of "instance of momentum and energy exchange" might be a very computationally convenient interpretation of what happens but I don't think it's real in any sense of the word.
In a broader context, we have very many interpretations in physics that are the simplest possible interpretations of the mathematical abstractions of our models, with complete disregard for how sensible they seem — even though there are completely reasonable alternative interpretations of the same math available.
Physics educators seem to delight in the quirkiness of the interpretations that theoretical physicists love, because they are just their equations narrated, nothing more, nothing less, instead of exploring more reasonable interpretations or even mentioning that they exist.
In absence of new math, bringing new insights to our fundamental knowledge, one of the goals of physics should be to get real. New, or old but rekindled, more plausible interpretations might inspire new generations of young physicists to visit avenues less explored. Because abstract narratives we globally adopted failed to do that for many decades already.
My understanding is that what we consider as particles can be described as waves of smaller particles.
For example, atoms are fluctuating electrons, neutrons and protons, all of which are fluctuating subatomic particles and so on. And what we describe as particles are essentially the maxima of these fluctuations.
Not really. Electrons are as far as we know not composed of smaller particles, and there's good reason to think they are elementary. Basically because smashing them with things reveals no smaller structure. Whereas when you smash things into a proton it is very clear that there are smaller localized objects inside it (there is a great blog post with pictures of what a proton looks like at different energies that I can't seem to find now...)