
Entropy as Disorder: History of a Misconception - Anon84
https://aapt.scitation.org/doi/10.1119/1.5126822
======
panic
I was looking for more information on the oil/vinegar situation and found this
explanation at
[https://chem.libretexts.org/Bookshelves/General_Chemistry/Bo...](https://chem.libretexts.org/Bookshelves/General_Chemistry/Book%3A_CLUE_\(Cooper_and_Klymkowsky\)/6%3A_Solutions/6.3%3A_Hydrogen_Bonding_Interactions_and_Solubility/6.3.1%3A_Entropy_and_Solubility%3A_Why_Don’t_Oil_and_Water_Mix%3F):

 _When hydrocarbon molecules are dispersed in water, the water molecules
rearrange to maximize the number of H-bonds they make with one another. They
form a cage-like structure around each hydrocarbon molecule. This cage of
water molecules around each hydrocarbon molecule is a more ordered arrangement
than that found in pure water, particularly when we count up and add together
all of the individual cages! It is rather like the arrangement of water
molecules in ice, although restricted to regions around the hydrocarbon
molecule. This more ordered arrangement results in a decrease in entropy. The
more oil molecules disperse in the water, the larger the decrease in entropy.
On the other hand, when the oil molecules clump together, the area of “ordered
water” is reduced; fewer water molecules are affected. Therefore, there is an
increase in entropy associated with the clumping of oil molecules —a totally
counterintuitive idea!_

~~~
cma
What about with heavy gases and light ones like helium? Do they eventually mix
completely in large atmospheres under a big gravity gradient?

I would think a mixture should gain kinetic energy as the heavier one settles
more on the bottom, releasing photons from the heat, which increases total
entropy. But in a closed system where the photons are reflected back, it would
hit some equilibrium with heavier stuff at the bottom but more kinetic and
photonic energy, which together maybe give more degrees of freedom than a
more evenly mixed mixture with less kinetic energy and higher gravitational
potential energy.

~~~
kgwgk
> I would think a mixture should gain kinetic energy as the heavier one
> settles more on the bottom,

But this is not what happens, or there would only be CO2 (and a small amount
of the heavier atoms and molecules) in the lower part of our atmosphere.


Edit: Maybe you know this already, but in equilibrium the mixture does indeed
have more kinetic energy (per unit of volume) as you go down because even
though the temperature remains constant the pressure increases.

~~~
cma
This is not the enclosed-system case, but light gases do escape more easily and
heavier gases do accumulate predominantly towards the ground:

[https://en.m.wikipedia.org/wiki/Atmosphere_of_Earth#Stratifi...](https://en.m.wikipedia.org/wiki/Atmosphere_of_Earth#Stratification)

At extreme elevations you find much higher concentrations of H and He.

It doesn’t fully separate like you might see with liquids, but there are
tendencies.

~~~
kgwgk
You’re right, the atmosphere is much more complex and dynamic than in these
idealized models. But assuming the system is closed and in thermal equilibrium
and that these are ideal non-reacting gases then gravity has an effect on the
density and pressure but not on the composition.

~~~
cma
I disagree. Assume it is a closed box reflecting all photons back inwards and
perfectly bouncing the gas particles. Now assume there are only two gas
particles in the box, a heavy atom and a light one. Assume the box is tall
enough that in the presence of gravity there is not enough total energy in the
system for either particle to reach the top of the box.

The light one will move faster than the heavier one on average when they come
into contact, and in the presence of gravity it will have a higher average
height. The same will hold as you add more particles, but there is a curve to
it.

It is true that the atmosphere is more complex, and has things like the ozone
layer causing temperature inversions due to different absorption
characteristics, etc., but the general reason that H and He are so much more
prevalent in the upper layers is largely this kind of explanation using gravity.

~~~
kgwgk
It’s true, I don’t know what I was thinking. The barometric formula that gives
the density gradient for each (ideal) gas depends on the molecular mass, so the
profiles will differ and the composition of the mixture will vary with height.
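As a rough numeric sketch of that point (my own illustration, not from the thread; it assumes an isothermal ideal gas, so each species obeys n(h) = n(0)·exp(-mgh/kT) with its own scale height kT/(mg)):

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
G = 9.81              # gravitational acceleration, m/s^2
AMU = 1.66053907e-27  # atomic mass unit, kg

def scale_height(molar_mass_amu, temperature=288.0):
    """Altitude over which the number density of one species drops by a factor of e."""
    m = molar_mass_amu * AMU
    return K_B * temperature / (m * G)

def relative_density(molar_mass_amu, h, temperature=288.0):
    """n(h)/n(0) for one species at height h (meters), isothermal equilibrium."""
    return math.exp(-h / scale_height(molar_mass_amu, temperature))

# Helium (4 amu) thins out far more slowly with altitude than CO2 (44 amu),
# so the equilibrium composition shifts toward light gases as you go up.
for name, mass in [("He", 4.0), ("CO2", 44.0)]:
    print(f"{name}: scale height ~{scale_height(mass) / 1000:.1f} km, "
          f"{relative_density(mass, 10_000):.3f} of ground density at 10 km")
```

The scale height is inversely proportional to molecular mass, which is the mechanism behind the different density profiles mentioned above.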

------
ScottBurson
Slightly OT, but I just saw this last night and thought it was hilarious:

 _You should call it "entropy", for two reasons. In the first place your
uncertainty function has been used in statistical mechanics under that name,
so it already has a name. In the second place, and more important, no one
really knows what entropy really is, so in a debate you will always have the
advantage._ — John von Neumann, suggesting to Claude Shannon a name for his
new uncertainty function, as quoted in Scientific American Vol. 225 No. 3,
(1971), p. 180.

~~~
hcs
Schrödinger used the term "negentropy".

And there's a story that Norbert Wiener used to wander around MIT telling
anyone who would listen that information is negative entropy, including
Shannon. (I'm having trouble finding it, will edit if I track it down)

Edit: ah, it was Fano who told that story (quoted in "The Cybernetics
Moment"):

> Electrical engineer Robert Fano at MIT, recalled that sometime in 1947,
> Wiener "walked into my office ... announced 'information is entropy' and
> then walks out again. No explanation, no elaboration."

More about Wiener's approach and another recounting of the story here:
[http://news.mit.edu/2011/timeline-wiener-0119](http://news.mit.edu/2011/timeline-wiener-0119)

------
foxes
The definition of entropy from a statistical mechanics perspective is roughly
a count of the number of different (microscopic) configurations, \Omega, of a
system. When all the configurations are equally probable, you get the formula

S ~ log(\Omega)

I don't think you see this definition in a physics education until you do a
course on statistical mechanics? I feel like I certainly had teachers say
"disorder". I think the connection is that if there are more configurations
it's more "disordered", which isn't really a precise thing.

Looking back, I think rambling about ideal gases/pressure/volumes is a boring
way to teach this subject (at least at late high school/first year uni). You
should probably introduce it by talking about configurations and information.
You can think of some very clear examples (configurations of a list of objects
for example, it doesn't necessarily have to be "physical"). Then maybe you
would go on to show that this is connected to macroscopic physical properties.
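The configurations-first approach can be made concrete with a toy model (my own example, assuming equally probable microstates): take N coins, call the number of heads k the macrostate, and count the microstates C(N, k) behind each one.

```python
import math

# Toy model: N coins, macrostate = number of heads k.
# Number of microstates: Omega(k) = C(N, k).
# Dimensionless Boltzmann entropy: S(k) = ln Omega(k).

def entropy(n, k):
    """S = ln(number of microstates) for the macrostate 'k heads out of n'."""
    return math.log(math.comb(n, k))

N = 100
print(f"S(all tails)  = {entropy(N, 0):.2f}")   # exactly one microstate -> S = 0
print(f"S(half heads) = {entropy(N, N // 2):.2f}")

# The most probable macrostate is simply the one with the most microstates:
best = max(range(N + 1), key=lambda k: math.comb(N, k))
print("entropy-maximizing macrostate: k =", best)
```

The half-and-half macrostate dominates overwhelmingly, which is the precise counting fact hiding behind the loose word "disordered".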

~~~
rstuart4133
> I don't think you see this definition in a physics education until you do a
> course on statistical mechanics?

I'm over 60, and I only recently discovered the actual definition of entropy,
thanks to Leonard Susskind's YouTube videos. For 60 bloody years I only got the
dumbed-down version about "disorder" and it made absolutely no sense to me
whatsoever. Now I see the same confusion in others, like here on HN where two
people were duking it out over whether one arrangement of a pack of cards
(all individually identifiable of course) had more entropy than the other.

This dumbing down does nobody any favors whatsoever. It certainly doesn't
promote understanding of the basic principles.

------
dvt
So it's an article on entropy (as disorder) without one mention of Shannon[1]?
This seems like a pretty big oversight.

In the context of information theory, entropy is _absolutely_ a decay of
information. I understand that this is physics (not math -- although the
formulae are virtually identical), but there's a very clear, accurate, and
formal use of "entropy" to mean exactly what the author purports it _doesn't_
mean: chaos.

[1] [https://heliosphan.org/shannon-entropy.html](https://heliosphan.org/shannon-entropy.html)

~~~
dr_dshiv
I don't think it's an oversight. Shannon entropy, Von Neumann entropy and
other information theory entropies are useful, but not physically fundamental
in the sense of the second law of thermodynamics.

That said, I don't understand whether Gibbs entropy or Boltzmann entropy is
the fundamental one?

~~~
dvt
> Shannon entropy, Von Neumann entropy and other information theory entropies
> are useful, but not physically fundamental...

I have to disagree, especially given that one of the great mysteries of our
generation (the Black Hole Information Paradox) hinges on what happens to
_information_ -- sure, it's information about particles, whatever that might
mean. But Jacob Bekenstein (among others) argues that thermodynamic entropy is
analogous to information (Shannon) entropy[1].

[1]
[http://webhome.phy.duke.edu/~mkruse/Phy255/255S06_Joyce_Copp...](http://webhome.phy.duke.edu/~mkruse/Phy255/255S06_Joyce_Coppock_bh.pdf)

~~~
kgwgk
[http://backreaction.blogspot.com/2019/08/how-do-black-holes-destroy-information.html](http://backreaction.blogspot.com/2019/08/how-do-black-holes-destroy-information.html)

“As you have probably noticed, I didn’t say anything about information. That’s
because really the reference to information in “black hole information loss”
is entirely unnecessary and just causes confusion. The problem of black hole
“information loss” really has nothing to do with just exactly what you mean by
information. It’s just a term that loosely speaking says you can’t tell from
the final state what was the exact initial state.”

~~~
abdullahkhalids
This is not the consensus opinion in the physics community. Many physicists
consider information as fundamental to this "paradox".

~~~
kgwgk
Right. The other view isn’t the consensus either. I was just presenting an
alternative exposition.

------
dr_dshiv
I liked this article. Really clear. I found some of my own misconceptions. For
instance, that uniformity and gradients are ways of characterising entropy.

The example of oil and water is great. While they are separated with a
gradient, there is no latent tendency to flow together and mix.

But, it's only gravity that makes the oil and water issue hard to understand.
With no gravity, they would not be evenly divided.

We still don't understand gravity, as it relates to entropy. Usually entropy
makes things spread out, but gravity makes things come together. Weird.

~~~
Florin_Andrei
I dunno. I always thought "entropy = disorder" was a big simplification,
almost in the realm of lies to children. Entropy is what it is; the pattern
detection algorithms in our heads are not always very good at judging global
properties such as order.

I do have a degree in Physics, so perhaps that's the source of this belief
I've held - probably from countless impromptu discussions with fellow physics
geeks. You wanna know the entropy of this system? Calculate it, don't eyeball
it.

~~~
novaRom
How do you define disorder? I always struggle to understand that. A less
compressible state?

~~~
dr_dshiv
More potential for free energy flow across a gradient. That's what I still
associate with entropy.

------
tullatulla
The author seems to mix up Gibbs free energy with entropy. The remark that
thermal equilibrium equals maximum entropy is simply wrong. All examples can
be explained by minimization of the free energy:

dG = dH - TdS

where dH = enthalpy change (internal energy from chemical bonds + pV),
dS = entropy change, and T = temperature.

Here the entropy measures the number of different states the system can be in
for a given internal energy. A thermodynamic system in equilibrium will always
tend to the state of lowest free energy.

Example: a perfect crystal would be a totally ordered state (and one with the
lowest entropy), with the internal energy minimized due to the highest number
of chemical bonds. For T -> 0, this would be the thermodynamically most stable
configuration (however, not always kinetically accessible, e.g. supercooled
water). As temperature goes up, the Gibbs free energy of a more disordered
phase (e.g. crystal defects or a liquid/gas phase) becomes lower and will
eventually lead to the melting of the crystal.

For the oil/water example, the separated state minimizes the free energy since
polar water molecules can form hydrogen bonds with one another (lowering the
internal energy) and push out the oil molecules, even though the entropy is
lower than for a totally mixed state. At high enough temperatures this changes,
and the mixed phase becomes thermodynamically favorable.

From this point of view, seeing higher entropy as more disorder seems
absolutely fine, where more disorder is "lower chance of guessing the exact
configuration of all molecules in the system".
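A rough numeric sketch of that dG competition, using approximate textbook values for the fusion of water (my numbers for illustration, not from the comment):

```python
# dG = dH - T*dS competition for ice <-> water, per mole.
# Approximate textbook values for fusion of water:
DH = 6010.0  # J/mol, enthalpy of fusion
DS = 22.0    # J/(mol*K), entropy of fusion

def delta_g(temperature):
    """Gibbs free energy change of melting at the given temperature (K)."""
    return DH - temperature * DS

# dG = 0 defines the crossover (melting point): T = dH/dS
crossover = DH / DS
print(f"predicted melting point: {crossover:.0f} K")

# Below the crossover dG > 0 (ordered crystal wins); above it the -T*dS
# term dominates and dG < 0 (disordered liquid wins):
for t in (250.0, 273.0, 300.0):
    phase = "crystal favored" if delta_g(t) > 0 else "liquid favored"
    print(f"T = {t:.0f} K: dG = {delta_g(t):+.0f} J/mol ({phase})")
```

The crossover comes out near 273 K, illustrating how the entropy term flips the balance as temperature rises.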

~~~
kgwgk
Entropy is maximized for an isolated system (constant energy); free energy is
minimized for a system in thermal equilibrium with a heat bath (one that can
exchange energy with the environment). Gibbs free energy is minimized at
constant pressure and temperature, Helmholtz free energy at constant volume
and temperature.

If you now put the oil/water system into an isolated container, the equilibrium
state won't change substantially. It remains separated into two phases, and
that's now the maximum entropy state.

------
trhway
A typical fallacy in entropy considerations is to treat the system as isolated
in the model while it is strongly coupled to the environment in reality. The
oil/vinegar system (similar to the pennies and dimes mentioned in the article,
which when shaken in a jar "magically" separate with pennies on top) has
exported entropy to the environment through the work of the gravitational
force, in full accordance with the 2nd law: the system transitions down the
gradient and thus increases the total entropy of system plus environment. A
similar mistake is when "life" is said to be "anti-entropic" in any sense,
while it is really a form of matter organization which generates even more
entropy than the same amount of dead matter would.

------
szemet
Isn't this too narrow a definition of entropy? OK, pure physical entropy was
the first and original usage, but now informational entropy can be used at a
higher level, and it correlates well with disorder at that level.

If I sort an array its entropy will be lower, while due to thermodynamics the
total entropy of the world will have increased.

So there is only the high-level component I'm interested in: I see fewer
macrostates in a sorted array (one) than in an unordered one (n factorial),
even if the microstates in the world have increased (I don't care).

Same goes for the article's examples. For me, subjectively, an ice statue can
have many different forms (macrostates); if it melts it will have one form (a
puddle), so I see the entropy as decreasing. Of course the puddle has more
microstates than the ice statue; I just don't differentiate between them...

True: in the above examples things became more ordered at the high level
(entropy has decreased) while physical entropy has increased, but that
increase was global, and at our local point of interest it can still be said
(according to modern usage) that entropy has decreased!

Of course the total sum has to be positive; that's the basis of how, for
example, the Landauer principle is calculated, which connects the above
high-level informational view with physical entropy...
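The sorted-array/Landauer connection can be sketched numerically (my own illustration: sorting n distinct items selects one ordering out of n!, i.e. erases log2(n!) bits, and Landauer's principle bounds the heat at k_B·T·ln 2 per erased bit):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def sorting_entropy_bits(n):
    """Information erased by fully sorting n distinct items: log2(n!) bits."""
    return math.lgamma(n + 1) / math.log(2)  # lgamma(n+1) = ln(n!)

def landauer_min_heat(bits, temperature=300.0):
    """Landauer bound: erasing one bit dissipates at least k_B * T * ln 2 joules."""
    return bits * K_B * temperature * math.log(2)

n = 1_000_000
bits = sorting_entropy_bits(n)
print(f"sorting {n} distinct items erases ~{bits:.3g} bits of ordering information")
print(f"Landauer minimum heat at 300 K: ~{landauer_min_heat(bits):.3g} J")
```

The resulting heat is minuscule, which is exactly why the global thermodynamic increase is invisible at the "high level" the comment cares about.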

~~~
kgwgk
Note that this is an article for physics teachers and the conclusion is “we
can warn our students that this is not the meaning of the word “entropy” in
physics.”

------
kgwgk
A couple of related articles from Lambert
([https://en.m.wikipedia.org/wiki/Frank_L._Lambert](https://en.m.wikipedia.org/wiki/Frank_L._Lambert)):

Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms — Examples of Entropy
Increase? Nonsense!

[http://entropysite.oxy.edu/shuffled_cards.html](http://entropysite.oxy.edu/shuffled_cards.html)

Disorder — A Cracked Crutch For Supporting Entropy Discussions

[http://entropysite.oxy.edu/cracked_crutch.html](http://entropysite.oxy.edu/cracked_crutch.html)

------
bonoboTP
I would say that, in itself, the oil being separated from and sitting above
the Italian salad is something that decreases the entropy (for now I assume we
can think of this as two gases of different density). However! It is
overcompensated by other factors, and in total the entropy is highest in this
case.

What factors? Imagine you could place the particles as you wished, and,
encouraged by my first sentence, you decide to mix it all up to achieve higher
entropy. To make it all mixed up, you'd have to move the heavier, denser
material towards the top, increasing the total potential energy. This would
need the introduction of external energy into the system. We cannot do that
with a closed system. We could only take the energy as heat from the salad,
making it all colder. However, in making it colder, the motion of the
particles becomes less uncertain, leading to a _decrease_ in entropy.
Ultimately the method of mixing it up fails to increase the entropy, showing
that the separated configuration is indeed the higher-entropy one.

However, it is not because of the separation, but in a certain sense despite
it. The separation helps bring the particles into a higher thermal-energy
state, because the potential energy released as the denser material moves down
ends up as heat. Without paying attention to the micro situation, the effect
wouldn't make sense.

~~~
kgwgk
> (for now I assume we can think of this as two gases of different density)

What does that mean? If they were gases they would mix (as in the atmosphere,
where you don’t see layers of gases of different densities but an essentially
homogeneous mixture).

~~~
bonoboTP
That could be wrong.

I think the point still stands that the separation contributes to a decrease
in entropy; however, it enables another effect which increases entropy even
more.

If we were allowed to mix it up evenly by adding energy from the outside to
lift the heavier material up (constant temperature), then we'd get an even
higher entropy, agree?

~~~
kgwgk
I don’t understand what point you are trying to make. If you assume that’s the
equilibrium state and therefore the highest entropy state given the
constraints on the system then of course the entropy cannot spontaneously
increase. And if it’s the highest entropy state despite looking “low entropy”
under the flawed criteria of “looks ordered” then it’s obvious there is
something else we’re missing that explains why the entropy is indeed maximal.

Anyway, to increase the entropy you don’t just need to add energy, it has to
be in the form of heat (something you can’t extract work from later). If you
move the “heavy” layer on top of the “light” layer the (potential) energy has
increased but if the temperature remains constant the entropy doesn’t change.

~~~
bonoboTP
I'm sorry, it is difficult to express my thought, although it is a simple
thought that may be trivial to you.

I mean that ceteris paribus the layering (separation of materials) causes a
decrease in entropy. However, the system still does this, because it gains
more entropy through a different "channel".

If we could destroy the layering, thereby evenly mixing things up, but not
changing anything else that affects entropy, we could increase the entropy. So
it shows that the layered version is lower entropy as long as all else is the
same. The catch is that this would require adding extra energy into the
system, because the mixed version has more potential energy. We could also
imagine a mixed-up configuration that has the same total energy, but then the
temperature would be lower since more of the energy would be tied up as
potential energy, leaving less for heat. In that case the entropy would be
lower.

Main point: the layering, just the fact that it is layered, this property, if
we know nothing else, is still a low-entropy thing. So our intuition that
order corresponds to low entropy is not broken.

~~~
kgwgk
> I mean that ceteris paribus the layering (separation of materials) causes a
> decrease in entropy.

But it doesn’t, which is the reason why it happens. Because that intuition of
“oh, there are two phases, this means order and entropy is disorder so the
entropy is lower” doesn’t correspond to the reality of the physical concept of
entropy (the thing that is maximized for a thermodynamic system in
equilibrium). That’s the point of the article.

Note that the discussion about potential energy is a red herring, phase
separation does also happen in absence of gravity (you get droplets which
don’t coalesce as easily, but it happens nevertheless).

~~~
bonoboTP
That's why I tried to shift focus to density-based layering rather than the
more complicated oil/water interaction.

And the reason it happens is that it doesn't magically get the corresponding
external energy. However, if we artificially, like gods, arranged everything
to be non-layered but at the same temperature, the entropy would be higher
than if layered at the same temperature.

~~~
kgwgk
No, the entropy would be lower and the phases would separate spontaneously
because that would increase the entropy. The only reason you get layers in the
first place is that oil and water don’t mix! Water and alcohol also have
different densities.

------
codesushi42
The best book on the subject:

[https://www.amazon.com/Entropy-Demystified-Second-Reduced-Revised/dp/9812832254/](https://www.amazon.com/Entropy-Demystified-Second-Reduced-Revised/dp/9812832254/)

