If you want to "bust" a myth, at least understand where it comes from instead of building up strawmen.
> but it is sure that Carl Sagan, in his classic TV series Cosmos (1980), was crucial in popularising it
I would say it was popularized by teaching the gold foil experiment in middle school physics.
(That said, it seems like a minor point of the article.)
The important detail here, which drives home how virtual they are, is that they only exist when the regime is perturbative: for QED, at large spatial scales, and for QCD, at small spatial scales, hence why you can talk about them in hadrons like protons and neutrons, as the author does. In nonperturbative regimes, for example strong-field QED, you cannot use perturbation theory and there are no "virtual photons" to work with. Take the most basic strong-field QED system, the hydrogen atom: there are no virtual photons in the Coulomb field; you just solve the Hamiltonian with the q^2/r potential given to you a priori. As the Fourier transform of ~1/r shows, you would need an infinite number of photons to make a Coulomb field. This problem was already known from classical physics, where the total energy of a Coulomb field, including the divergence at the point charge, is infinite.
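The divergence mentioned at the end is easy to see numerically. A minimal sketch (in assumed units where q^2/(8*pi*eps0) = 1): the classical field energy outside a cutoff radius r_inner scales like 1/r_inner, so it blows up as the cutoff shrinks toward the point charge.

```python
import math

def coulomb_field_energy(r_inner, r_outer):
    """Energy stored in a point charge's Coulomb field between two radii,
    in units where q^2/(8*pi*eps0) = 1. The energy density goes as 1/r^4,
    so E = integral over shells of (1/r^2)^2 * 4*pi*r^2 dr."""
    # Analytic value of that integral:
    return 4 * math.pi * (1 / r_inner - 1 / r_outer)

# Shrinking the inner cutoff toward the point charge makes the energy diverge:
for r0 in (1e-1, 1e-3, 1e-6):
    print(f"r_inner = {r0:g}: field energy ~ {coulomb_field_energy(r0, 1.0):.3g}")
```

Each factor-of-ten reduction in the cutoff adds another factor of ten to the energy, so there is no finite answer at r_inner -> 0; that is the classical self-energy problem the comment refers to.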
For example, our solar system is mostly “empty” with the mass mainly concentrated in the Sun (and Jupiter). The galaxies are mostly empty as well, so that two galaxies can “collide” without a single star hitting each other.
So there is a certain intellectual pull to try to extend the analogy (wrongly?) down to the smallest level as well.
True, but the way in which they affect each other is that they try their best to collide.
From that perspective, it's kind of weird if collisions don't happen.
Interestingly enough, when galaxies collide, the supermassive black holes at their centres do typically merge as their gravity is strong enough to attract each other.
It's, by the way, not particularly unlikely for stars to pass through other solar systems. Apparently, about 70,000 years ago a binary star system flew through the Oort Cloud.
That’s hand-wavy, because if you’re imagining grabbing the two galaxies and smashing them together… that’s many orders of magnitude faster dynamics than the speed of light allows, ha!
You’d have to put the galaxies on top of each other and leave them there, probably for years, for gravity to get everything moving quickly enough (and then for the speed of light to even bridge the distance!) before anything much would happen. I think.
Engineer but not a physicist.
So with two galaxies colliding, I would not expect the stars to hit each other as a direct result of the collision, but as a result of gravitational interactions as speeds increase.
Why do we say it is mostly empty just because 2/3 of these things don't interact? Disproving the plum-pudding model was significant, but saying the atom is 99% empty space is misleading. We know there are billions of virtual particles appearing and disappearing all the time within the atom and in space; they just cancel out.
Unless you think Sagan would have been surprised by the presence of an electric field (mediated by virtual particles…) between electrons and protons in an atom, it’s quite likely you’re choosing an obtuse understanding of the term.
A solid wood table is not mostly empty by any common contextual definition. Photons do not pass through it freely, your hand won't pass through it freely, and the electron clouds of the carbon atoms in the wood are physical in almost every common sense of the term. They very much push back on any other electron cloud that comes near enough and are generally the only part of the atom that does the interacting.
While it might be true that almost all the mass is concentrated in the center of the atom, that's not what people mean by empty. Houses aren't considered empty just because most of the mass is in the foundation.
Anyway, for atoms we mostly care about collisions with charged particles. I'd say that they are almost empty.
Carl Sagan was just trying to make some of these observations intelligible to children.
I can't take this article seriously if it doesn't engage with the discovery of the structure of the atom, and disregards the experiments where they shoot stuff through it and it passes mostly unperturbed, because, well, it's actually mostly empty.
`The association between this mass concentration and the idea that atoms are empty stems from a flawed view that mass is the property of matter that fills a space.`
What's really happening is that for certain kinds of interactions electrical forces were important, and for others nuclear forces were important. We understand the nuclear forces by proxy of mass concentrations, and reach reasonable conclusions like "the atom is mostly empty".
Although neither is technically true, it helps convey the idea that everything, atomically, is in a state of flux.
So if you could stop time and glance inside an atom, you might observe something closer to the empty-space idea. However, this situation is impossible, and the representation is almost like a tesseract represented in 3D: impossible to depict in a way we consciously operate in.
But we live in a state where things are constantly in motion, and thus the existing state of an atom can be thought of as a cloud, or maybe even better as an energy field with predictable states.
Maybe it's a pipe dream, but my hope is that if enough people close browser tabs in reaction to rude behaviour, it'll start to show up on analytics.
alternative one https://web.archive.org/web/20230825073946/https://aeon.co/e...
A pair of strawmen to knock down in the process, most notably in the title and opening paragraphs; the second halfway through, re mass concentration. Clouds of moving fluff as a lay description of electrons isn't too bad. I guess the key part is to emphasize charge in addition to mass.
Tangent: I'm surprised that, at least according to [this paper by Sebens](https://arxiv.org/abs/2105.11988), Schrodinger's model of the electron wave function corresponding to a charge density (probability aside) is controversial among physicists. I suspect most chemists doing DFT agree with it.
It's more than "controversial"; it was shown not to work way back in the 1920s. That's why Born's probability interpretation was adopted--it was the only one left standing.
> I suspect most chemists doing DFT agree with it.
If all you ever have to deal with is atoms and molecules, the Schrodinger charge density interpretation sort of works--it's at least a good enough heuristic for that domain.
But quantum mechanics gets applied to lots of other domains besides atoms and molecules. The charge density interpretation breaks down in those other domains, whereas the Born probability interpretation does not. And physicists who had to deal with those other domains figured that out, as noted above, back in the 1920s, which is why Schrodinger's charge density interpretation was discarded, at least as any kind of fundamental aspect of quantum theory.
Or in other words, you must sum many rules to calculate the energy, and one of them is easy to remember because it's the result you'd get from a cloud of charge density.
If only my college chemistry class started here. I really enjoyed physics but for some reason never grokked chem. This might’ve changed all that.
"Write a 4,000 to 5,000-word essay on the follies of VSEPR (valence shell electron pair repulsion) theory. Highest-scoring work will be published on aeon.co."
Debunking one myth by perpetuating another, it seems. Collapsing waves via observation doesn't mean you need to look at it as a human.
I found the idea of visualizing particles as "clouds" somewhat helpful in that regard.
Smeared matter, thanks to QM
Cool description. How about an animation, or better yet, an interactive app?
During molecule formation, you need more than quantum mechanics: you need quantum field theory to create an accurate picture of the dynamics. Quantum field theory has a lot more dimensions, and there is stuff like virtual particle creation and whatnot. It is not at all easy to show this on a 2D screen.
This aligns with the goal of the article:
"We build new mental images of the quantum world one step at a time, even under the risk of tripping up here and there."
Is it? Isn't the "empty atom myth" basically the Bohr model of the atom, which was created by Niels Bohr?
The reason it persists is the same reason Newton's theory of gravity persists: it is relatively simple to understand, and carries a lot of explanatory power.
The author of the article is welcome to compute the mass density of the atom's periphery to the nucleus, and return to speculating whether it's fair to argue that the atom is mostly empty space or not.
For myself, I'm happy to refer to ethereal probabilistic cobwebs as "mostly" empty space.
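For what it's worth, the computation the parent asks for is a one-liner. A back-of-envelope sketch, using assumed order-of-magnitude radii (a few femtometres for a nucleus, about an angstrom for an atom), not exact values for any particular element:

```python
# Rough, order-of-magnitude radii (assumed typical values, not exact):
r_nucleus = 5e-15   # metres, a few femtometres
r_atom    = 1e-10   # metres, roughly one angstrom

# Fraction of the atom's volume taken up by the nucleus:
volume_fraction = (r_nucleus / r_atom) ** 3
print(f"nucleus fills ~{volume_fraction:.0e} of the atom's volume")

# Since the overwhelming majority of the mass sits in that speck, the
# nuclear mass density is roughly 1/volume_fraction times the bulk density.
```

On these numbers the nucleus occupies around 10^-13 of the atom's volume, which is the sense in which "mostly empty" gets justified by mass density.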
As a thought experiment: If you have a peach, would you declare the fleshy part non-existent once the pit achieves a certain mass density?
The actual scenario is this: "If you have a peach, would you declare the fleshy part non-existent if it had a certain probability of reallocating itself to a single spot once the pit achieves a certain mass density and if you touch it, there is only a small probability that you can even feel that the flesh is there?"
The entire electron cloud produces an electro-repulsive force that is very real. That is what you feel, there's no small probability about it. This idea that there's some small probability that your hand will go straight through the table is ridiculous.
When you touch a peach, some of your electrons/atoms actually do become part of the peach and vice-versa. If we somehow turned up the tunneling probability, your hand would not go through things, it would simply dissolve into things more readily.
When interpreting electrons as particles, the atom is mostly empty. The position of the electron is not known with certainty, but the number and mass of the electrons is and it is very small compared to the size of the entire atom. Volumetrically electrons are point particles and take up no space at all.
When interpreting electrons as quantum waves the term "empty" is largely meaningless.
The Bohr planetary model, which I assume is what you're referring to, is equivalent to Newtonian mechanics. It's simple to understand and a good starting point for high schoolers. It's even still usable for certain types of calculations.
However, like Newtonian mechanics, at a deeper level and according to modern understanding, it's fundamentally wrong.
Instead we use the Schrodinger electron-cloud model. According to which the ~nucleus~ (EDIT I mean atom not nucleus) is not mostly empty.
It's not pedantic to discuss this, any more than it's pedantic to discuss why Newtonian mechanics is less accurate than relativity.
The electron cloud and the nucleus are different things. The electron cloud on its own is an empty volume except where the electrons are at any given moment.
The article specifically rejects this way of thinking and makes the point that the electrons are not in the cloud. The electrons are the cloud. Thinking of them as little balls whizzing about so fast that it seems cloud-like is wrong.
It’s common to say they behave as a particle and a wave, but it’s equally valid to say they don’t behave as a particle or a wave. It’s a distinct phenomenon, and thinking in terms of large-scale things you’re familiar with is simply misleading.
Quantized fields is a much better term. Also collapse of the wave function never happens, only increased entanglement.
The concept of a single particle as a point charge that isn't really interacting with everything around it is similarly flawed. We learn from a very young age to conceptualize that as the building block of matter, and entanglement as something weird. Instead it's the other way around: a single particle with a pure (unentangled) wavefunction is pretty near statistically impossible.
This article is trying to help laypeople have better intuition about things from the quantum point of view. It's full of evocative prose. Arguing about the definition of the word "full" is completely tangential to the point.
The point of physics is to have a unified system where the same description applies to the widest possible number of phenomena. Calling atoms empty or full isn’t a great description, but the behavior of neutrinos, for example, lends itself to calling them empty.
That isn't entirely a thing.
Fields may have high or low energy densities, and we might compare that to the height and frequency of waves on the ocean in explanation. But just as the ocean is violent or becalmed, not "full" or "empty", so are fields neither full or empty.
wouldn't it still be pretty meaningfully rephrased as 'stochastic density' or similar?
I would presume that a nucleus, as a superposition of its respective waves, has a far higher stochastic density than the rest of an atom. Or rephrased, an atom is mostly [near] empty space.
 of energy, or mass
This is fair. But the point is that a model which is fundamentally stochastic is more valid than one where particles are continuously moving through some subset of mostly empty space.
edit: Through their respective field - I'm guessing, and does each force have its own field, or do they also interact with each other?
Spooky distance at an action.
Interaction events create the space-time interval between them.
Or very improbable.
Yes, it's full of transient particle-antiparticle pairs which I hear are what make black holes evaporate.
Nah bro. Marie Curie gave up her life showing this isn't true. Maybe the author is confusing "static" with "stable"?
I think what he means is that whatever vibrational or rotational internal states the molecule has, the molecule is almost always in a superposition of all of its possible states (a "smudge" of all possible vibrations, so to say), and because the states are cyclic/vibrational, there is no time evolution.
It's not even required that the dependencies be linear in time or discrete, except in the narrow case of Turing-computability.
I'm asking these questions rhetorically, but they're serious questions that need to be answered (or at least attempted) to maintain even a pretense of intellectual coherency.
You certainly can make the constructivist argument that only the rationals exist and real numbers and everything that builds on them is some kind of fever dream, but personally I've never seen any even remotely compelling exposition of that position. Maybe that's just a "me problem", though? I really don't know; I find that this kind of metaphysics pushes up to and sometimes past my cognitive ability.
Check out the works and interviews of Joscha Bach if you haven’t already, he’s influenced my thinking on this quite a bit.
*obviously not in the dismissive sense, but in the sense as “we hold these truths to be self-evident”
I’m not a computer scientist so I may have a gap here, but demonstrating that two quantities are incommensurable (showing that no unit makes up the two quantities m and n times, m and n being integers) does not seem like something possible to approximate empirically or computationally in many cases. The precision required may be one step beyond your current capacity.
Constructivism may be right. But I don’t have a good argument for why finished computation or empirical approximation (there’s always limits to measurement) is the be all end all. Unless we take them to be the final adjudicators, why shouldn’t there be incommensurable quantities? We need very strong arguments they provide the final say, but we know they have limits, their capacity/memory.
It is also worth pointing out that the basic mathematics of deep learning are quite old and relatively simple. It was actually the technological advancement of being able to economically perform trillions of grade-school arithmetical operations per second that unlocked it, not some "mathematical discovery".
Everything can be described (read: approximated, modeled) in mathematical terms; that's the whole point! That doesn't mean mathematical objects and processes must exist independently of those descriptions.
The basic, most fundamental principles on which mathematics is based are surely natural and discoverable.
Other types of mathematics are born out of these principles.
> one widget
And what's a widget? This implies there are "objects" and that an object has a "boundary". Counting is completely human-defined. You can count Earth as one and Mars as two, or you can count the solar system as one and an atom in another galaxy as two. It's a man-made system that helps us think.
Or you can count particle by particle... well never mind, we're in a thread of an article about why particles are actually probability clouds :)
Why is my mug one object, instead of two, or three, or 10^26 objects? Counting is very arbitrary. Seeing a mug as a whole instead of a bunch of sub-atomic waveforms is a choice (that our brain hardware made for us).
: Even this plural is very questionable
I'm not sure that I agree, but for the sake of argument, even if I accept that principle, you can still count how many of that quantity of stuff you have.
If I decide that a mug is made up of one part, then if I get a second mug, I have 2 mugs. If I instead say that a mug is made up of 10^26 objects, and I get another 10^26 objects, I'll have 2×10^26 objects.
It's easier to count 1 + 1 than 10^26 + 10^26, but there's no change in the fundamental principle of counting just because we don't agree on the number we have to count up to.
All of these are predictable and create natural mathematics through addition and subtraction (or multiplication/division, which are basically just repeated addition/subtraction).
> Seeing a mug as a whole instead of a bunch of sub-atom waveforms is a choice [...]
Our brain is doing that because that bunch of subatomic waveforms has useful properties when considered together. It can hold quantities of other bunches of subatomic waveforms, for example, whereas a different collection of subatomic waveforms like my desk would not hold my coffee.
Your contention that there's no such thing as an object seems a bit solipsistic, and more of a philosophical question than a relevant or useful way of thinking about the universe as we experience it.
If axioms are natural, why do you have to assume them to be true? Why haven't they been proven to be true?
That human beings are as good as we are at finding axioms that appear to correspond pretty well to reality is amazing to me. It's a really interesting philosophical question to ask why it is that we are.
There is also a certain type that leans into that mysticism for personal gain, which IMHO is irresponsible and promotes the myth that mathematics is inaccessible.
Really, though, the problem here is that molecules are so complex that exact solutions fail, and this brings us into the realm of numerical solutions, group theory symmetry arguments, and so on. If you find this subject interesting I suggest getting your hands on a copy of PW Atkins / RS Friedman "Molecular Quantum Mechanics" and perhaps also PCW Davies' semi-pop-sci-but-with-way-too-many-equations (1979) "The Forces of Nature" (doesn't cover molecules, though it's great on background).
This article could be trying to cover too much ground, but perhaps that's OK? It's hard enough to apply QM to electrons and their interactions without trying to bring the nuclear interactions in at the same time (beta decay defies pop-sci demos, for example; IIRC it's two 4D matrices interacting with each other and somehow out pops a neutrino, I guess Pauli understood it).
For example, molecular orbital QM approaches are all based on something called the Born-Oppenheimer approximation (disappointingly not included in the recent film), which treats the electron masses as negligible in comparison to the nuclei (which are much more massive). So if we ask a chatbot something like (1):
> "system: explain the Born-Oppenheimer approximation, which takes note of the great difference in the masses of electrons and nuclei. Because of this difference, the electrons can respond almost instantaneously to the displacement of nuclei. Provide examples of real-world technologically-relevant situations in which the Born-Oppenheimer approximation breaks down. User: a graduate student in a solid-state physics program. Assistant: This complicated topic is worth careful analysis..."
So the chatbot tells us, probably better than I could summarize it:
> "Because electrons are much lighter than nuclei, they can respond rapidly to changes in the positions of nuclei. In other words, when the nuclei move, the electronic configuration of the system adjusts almost instantaneously to accommodate these movements. This allows us to consider the electronic structure as adapting instantaneously to changes in nuclear positions. This separation of timescales makes it possible to treat the electronic and nuclear motions as somewhat independent, greatly simplifying the quantum mechanical calculations."
But since I once worked down the hall from these quantum dot enthusiasts, I know to ask a follow-up:
> "Titanium nanodots are a system where the B-O approximation breaks down due to relativistic effects, is this correct?"
chatbot: "In heavy elements like titanium, which has a relatively high atomic number, the velocities of its electrons can approach relativistic speeds, particularly in systems with high electron density or high atomic confinement, such as nanodots. When this happens, the usual non-relativistic treatment of electron motion (as assumed in the Born-Oppenheimer approximation) becomes inadequate, and relativistic corrections need to be taken into account. Relativistic effects can lead to several interesting phenomena..."
That said the overall thesis (of the source article) that there's a ton of stuff apparently going on inside of even simple atoms like carbon is certainly true. I think Feynman talked about this a few times, he said something like he couldn't understand how all that activity could be taking place in such a tiny space or words to that effect.
Anyone trying to understand this stuff... there are rabbit holes inside the rabbit holes, and it's all a very active field of current scientific investigation.