
The Idea of Entropy Has Led Us Astray - dnetesn
http://nautil.us/issue/86/energy/the-idea-of-entropy-has-led-us-astray
======
dr_dshiv
Entropy is a difficult concept to understand. This was a well-written article
that explained nothing and further muddied the waters.

What is perhaps not appreciated is how the idea of entropy completely
transformed our society by unleashing massive amounts of mechanical
efficiency. This free energy is derived from _contrast_ -- power comes from
flow across a gradient.

I wonder where the analogies work or where they fail. For instance, the
importance of contrasts seems to illustrate the value of cultural differences
(diversity) and raises concerns about a homogeneous world culture.

~~~
contravariant
The jump to diversity is a bit iffy. Even taking for granted that cultural
difference provides a similar gradient, all you can really guarantee is that
it will cause _something_. That could be good or bad. And if it works in the
same way, it will only work at the cost of diversity; the second law of
thermodynamics makes sure of that.

So yeah, I'm not sure _if_ your analogy works, and if it did I'm not sure if
it would be useful.

~~~
dr_dshiv
See below comment on structural holes. I completely agree with your assessment
of the uncertainty here. But knowing that

1. Monoculture is dangerous

2. Diversity will be celebrated for good political reasons and

3. Unity in diversity is the classical notion of harmony and beauty

...this makes me hypothesize about a theoretical reason why cultural contrasts
could be something to value. It suggests that energy can be "harvested" from
flows across those cultural contrasts and that cultural contrasts require
energy to maintain. I think it would produce net value because... Idk.

(Note, I might be biased on this entire topic simply because I hate Facebook
and miss the highly diverse 90s web)

~~~
akiselev
_> ...this makes me hypothesize about a theoretical reason why cultural
contrasts could be something to value. It suggests that energy can be
"harvested" from flows across those cultural contrasts and that cultural
contrasts require energy to maintain. I think it would produce net value
because... Idk._

There is a reason to value diversity but it's as far from theoretical as you
can get: billions of years of evolution have demonstrated that diversity is
the only way for complex ecosystems (civilization, in our case) to _exist_ on
any significant timescale. Monocultures are dangerous because the second the
environment changes enough, everything dies pretty much all at once instead of
transitioning between a mix of cultures best suited to survive the change. I
think there's a world of research to be done on how those principles apply to
societies, which are driven more by artificial selection, but there's
definitely a connection.

~~~
Supermancho
> There is a reason to value diversity but it's as far from theoretical as you
> can get:

I'm not sure that's the takeaway. Diversity ensures change and optimization
(in a roundabout way). Stability in cultures is about applying that
optimization and, more often, about culling. The case in point depends on
your time frame of reference and on what you mean by stability. Rome and the
Aztecs wiped out almost as much as they assimilated (and assimilation was
generally via enslavement, which is not quite the same).

------
apocalypstyx
Really, by the end, the author has reversed the situation to do the very thing
complained of: science used to assuage the mores and anxieties of a given
time. The issue, however, is that the underlying anxiety (whether in the
Victorian Era or now) is the same: the desire that the future be eternal and
the same as now, if not more so. So we don't even need to buy a new car or
leave our homes. If we do nothing, surely the world won't move beneath our
feet. Vast as it is, surely we can pretend the universe is the opposite of a
closed system.

------
evdev
I was going to rant, but this:

> The energy-driven reduction of entropy is easy to demonstrate in simple
> laboratory experiments, but more to the point, stars, biological
> populations, organisms, and societies are all systems in which energy is
> routinely harnessed to generate orderly structures that have lower entropy
> than the constituents from which they were built. There is nothing
> physically inevitable about increasing entropy in any of these systems.

is so straightforwardly incorrect that it frankly just should not have been
published.

~~~
mtgp1000
I would argue that what the author said is somewhat true, in that life can be
roughly separated from nonlife by the observation that organisms do appear to
tend toward less entropy in a universe where everything else seemingly does
the opposite.

Genetically coded beings are orderly structures with far fewer microstates
than their molecular constituents would likely contain otherwise. And we tend
to make copies of ourselves, giving order to otherwise chaotic matter.

I think there's a profoundness in there, somewhere. Such a definition also
solves the virus conundrum!

~~~
macspoofing
>in that life can be roughly separated from nonlife by the observation that
organisms do appear to tend toward less entropy in a universe where everything
else seemingly does the opposite.

In a superficial way, I suppose, but it's still wrong. Another way to look at
it is that life actually tends to increase entropy more efficiently than
would otherwise be expected (compared to non-living processes). For example,
humans are a complex chemical reaction that has reached the point where it
can release energy by splitting atoms -- which raises entropy far higher than
would have been possible otherwise, and is completely impossible via
non-living chemical reactions.

~~~
trgn
A squirrel will spend energy to collect nuts and bury them together in the
ground, rather than having them scatter and roll and blow willy-nilly. A
person is constantly sweeping up the dust inside their house, painting and
repainting the trim of their windows, and organizing the cables in their desk
drawers.

Life can most certainly be viewed as a counterforce to entropy. Certainly at
the philosophical level, but why not at the genetic level too, reproduction
being the repeated organization and duplication of chemical bonds from
smaller constituents.

I certainly see my life as a constant battle against entropy, an adult's life
consists pretty much 80% of putting things in things.

~~~
macspoofing
>Life can most certainly be viewed as a counterforce to entropy.

Sure, as long as we qualify the terms correctly. That is, you need to decrease
the resolution of what you mean by 'entropy', because each one of your
examples actually increased entropy more than inaction would have.

Regardless, this goes against the author's point, because in each case 'work'
needs to be done to reverse the entropy of a local system (e.g. scattered
nuts) at the expense of the larger system (squirrel heat emitted into the
universe)

>I certainly see my life as a constant battle against entropy, an adult's life
consists pretty much 80% of putting things in things.

Sure, with proper qualification that is one way to look at things. This works
because of the resolution that we care about. Namely, we don't care about heat
generated from our bodies, or smart phones, or nuclear reactors, accelerating
global entropy, but we certainly care about dusty rooms.

Again, it seems like the author disagrees with this view.

~~~
noobermin
>Regardless, this goes against the author's point, because in each case 'work'
needs to be done to reverse the entropy of a local system (e.g. scattered
nuts) at the expense of the larger system (squirrel heat emitted into the
universe)

I submit that you did not get the point the article is trying to make,
because it was exactly this. When considering the animal expending work as
the system, its entropy can decrease because it isn't a closed system. You
can then retort that the 2nd law concerns a larger closed system, but you can
keep playing that game until the 2nd law essentially becomes a tautology, and
becomes useless in understanding the system at hand.

I don't know if the author made this point explicitly (he hinted at it at the
end), but one needs to actually know the details of the system under
consideration, and very general laws can bring some level of context but will
be limited in terms of the actual relevant or useful insight one can glean.

------
brummm
A non-physicist writing about entropy most often leads to an incorrect
article full of slightly incorrect statements and misunderstandings. Case in
point: this article.

------
im3w1l
So one philosophical issue I've had with entropy is that it is based on
probability. But it is applied even to non-quantum systems that evolve in a
deterministic fashion. How can we talk about the uncertainty of a system that
is in a single, well-known, predictable state? And with the many-worlds
interpretation, even the quantum wavefunction evolves in a deterministic
manner.

How can we reconcile this? We can say that the system is not probabilistic,
but it is _approximately_ probabilistic. If it is only approximately true,
does that mean we can find a loophole in the laws of thermodynamics?

~~~
Majromax
> How can we reconcile this? We can say that the system is not probabilistic,
> but it is approximately probabilistic.

It is indeed probabilistic _based on your existing knowledge of the system_.
The idea that (say) a roulette wheel evolves according to deterministic laws
of physics is not useful to you since you do not know the current micro-state
necessary to make the prediction.

This is similar to the difference between frequentist and Bayesian
interpretations of statistics. Bayesian statistics is more comfortable taking
probability distributions over notional facts (such as 'is this coin
unfair/weighted?') because it treats the experimenter's confidence as the
fount of probability.

If you treat entropy in the same way, it makes more sense. The entropy of a
system is related to the number of microstates it can occupy _consistently
with our known macrostate_ (e.g., measurement of temperature and pressure --
inherently averaged quantities).
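
As a toy illustration of that counting (my own sketch, not from the article;
the function name is made up): take a two-level system of N particles whose
macrostate is just "how many are excited". The Boltzmann entropy, in units of
Boltzmann's constant, is the log of the number of microstates consistent with
that measurement.

```python
import math

def entropy_of_macrostate(n_particles: int, n_excited: int) -> float:
    """Boltzmann entropy S = ln(omega), in units of Boltzmann's constant,
    for a toy two-level system. The macrostate is "n_excited of
    n_particles are excited"; omega counts the microstates consistent
    with that measurement."""
    omega = math.comb(n_particles, n_excited)  # microstates matching the macrostate
    return math.log(omega)

# Knowing only a coarse macrostate leaves many compatible microstates:
print(entropy_of_macrostate(100, 50))  # ~66.8: an enormous number of fits
# A macrostate that pins down every particle has omega = 1, so S = 0:
print(entropy_of_macrostate(100, 0))   # 0.0
```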

~~~
im3w1l
The issue I have with this is that the system is in a given microstate. We may
not know which one, but there is one and exactly one. And it will evolve
deterministically from that one. And if we look at how it evolves it may do
things that seem very unlikely from a random macro point of view.

Assume for instance there is a macrostate with a huge amount of energetically
favourable microstates. But our little microstate could be destined never to
enter any of them.

Typically there is an assumption, right, that any state can transition into
any state with a probability given only by the energies -- but that isn't
really true.

~~~
jbay808
Grandparent is right.

Entropy is best thought of as a two-parameter function.

Not entropy(system), but rather entropy(system, your knowledge of system).

The system is in a microstate, but you don't know which one. If you did, it
would have zero entropy. Since you only have a few general measurements like
"one litre of gas at room temperature and one atmosphere of pressure", the
entropy is much higher, because many microstates meet that description.

If you think of entropy as an objective property of the system regardless of a
given observer's state of knowledge, you'll likely get confused.
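
The two-argument view can be made concrete with the Gibbs/Shannon formula.
A minimal sketch (the function name is my own, not a standard API): entropy
is computed from the observer's probability distribution over candidate
microstates, so sharpening the knowledge drives it to zero while the
physical system is untouched.

```python
import math

def knowledge_entropy(p: list[float]) -> float:
    """Gibbs/Shannon entropy (in nats) of an observer's probability
    distribution over candidate microstates. It is a property of the
    (system, knowledge) pair, not of the system alone."""
    return sum(-pi * math.log(pi) for pi in p if pi > 0)

# Coarse measurements: 8 microstates remain consistent, all equally likely.
coarse = [1 / 8] * 8
# Perfect knowledge: the exact microstate is known.
exact = [1.0]

print(knowledge_entropy(coarse))  # ln(8) ~ 2.079 nats
print(knowledge_entropy(exact))   # 0.0 -- same physical system, zero entropy
```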

~~~
im3w1l
How can we connect this subjective entropy back to things like Gibbs/Helmholtz
free energy, or temperature (du/ds)?

~~~
jbay808
This way!

[https://bayes.wustl.edu/etj/articles/theory.1.pdf](https://bayes.wustl.edu/etj/articles/theory.1.pdf)

------
s9w
This article is the same handwaving anti-science hogwash routinely found in
philosophy departments that talk about quantum mechanics. They twist words and
force connections where there are none because they want to talk about things
they don't understand.

No, the laws of thermodynamics are not based on sociology.

------
Causality1
I don't think this article does nearly enough work to support its claim that a
misunderstood belief in entropy is responsible for human greed, a claim which
I think is dead wrong. Most people don't have a misunderstanding of entropy
because most people have no knowledge of the concept whatsoever.

You don't need the second law of thermodynamics to make human beings
shortsighted fools. We've been doing that for millennia.

------
kilpikaarna
Lol. Lots of comments getting all huffy at someone daring to look at physics
from a humanities perspective.

The basic idea is that the theory was worked out in a certain time and space
where the social conditions were suitable to it. And that they form a feedback
loop where the theory will reinforce those social conditions. It doesn't mean
that the theory is somehow "wrong", or that the equations don't make useful
predictions about observable systems. It's really not that esoteric. Nor a
threat to any of the science itself.

Really, anyone throwing up their arms in disgust at this article would do well
to read up on some basic philosophy of science. Pick up an entry-level college
textbook. Hell, just look up the definition of "theory" and really think about
it for a few minutes.

------
at_a_remove
This bit "Thermodynamic theories served a society committed to laissez-faire
competition." is just ... I don't know where to begin. Of course it inevitably
leads into the idea that if we had just thought differently about the "sense
of energy" and whatnot, things would be different. It has the ominous hint of
Irigaray's bit about E=mc^2 being sexist because it "privileges" the speed of
light ...

No, physics is always there, and it doesn't matter if the original science was
done by cutthroat capitalists or gulag-exiling communists. If you want a
decent sociological perspective on the realities of physics, I would instead
suggest building on just how uncomfortable a given reality seems to make some
people; everything else is window dressing and justifications.

------
seemslegit
Nautilus writers are the first to lose their jobs to GPT-3 and not have the
readership notice.

------
noobermin
A very good article, although I think the deeper point that isn't really made
explicit is how scientists (physicists in particular) are often too fond of
their models and forget that they are idealizations of the real world.
Whether a system really is isolated (they never are) is a serious contingency
scientists often forget, but we can extend this to forgetting the limits of
applicability of all sorts of accepted theory. This point of view has been
called reductionism by some philosophers; it has value in some cases but
tends to underestimate the complexity of the real world.

It's important to remember that all models are approximations.

------
golemotron
I won't claim the author is mathematically illiterate. I don't know. But this
is where math shines. Show the equation and how it models the phenomenon.
Match it against measurements. If it doesn't match, make another model.
Without that, talking about entropy as if it isn't a useful way of
conceptualizing phenomena is just word soup.

------
hummo56
I am still confused by the notion of entropy being disorder. I think there was
an article about this being a common misunderstanding.

~~~
sarah180
Here's the best way I know to explain it, paraphrasing Stephen Hawking:

Imagine taking a picture puzzle that you'd completed, putting it in your
clothes dryer, and turning on the tumbler. I think you and I can both safely
say that you'd get out a jumbled mess of pieces with little to no resemblance
to the completed puzzle.

Now imagine taking a disassembled picture puzzle and putting it in your
tumbler. You and I would both be very surprised if what came out was a
correctly completed puzzle.

Mathematically, there is only one way that the pieces of the puzzle can all
position themselves correctly to solve that puzzle. There are a huge number of
ways those pieces can position themselves that _don't_ solve the puzzle.

This is, mathematically, what is meant by "order" and "disorder." It means
the system is in a more- or less-constrained configuration.

A more classical version of this is to say that if you have a divided box with
oxygen on one side and nitrogen on the other, this has a lower entropy than
the same box where the oxygen and nitrogen are intermixed. Removing the
divider from a box where they're separate will lead to them mixing, but
removing the divider from a box where they're mixed will not lead to them
separating. There are more configurations where the gases are mixed than where
they are separate.
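
A quick count makes that asymmetry concrete (a toy sketch of my own, with n
molecules of each gas distributed over 2n sites, ignoring positions within
each half):

```python
import math

def mixed_vs_separated(n: int) -> tuple[int, int]:
    """Toy count for the divided-box example: n oxygen and n nitrogen
    molecules over 2n sites. Returns (arrangements of the species labels
    overall, arrangements with all oxygen in the left half)."""
    total = math.comb(2 * n, n)  # choose which n sites hold the oxygen
    separated = 1                # only one choice puts them all on the left
    return total, separated

total, separated = mixed_vs_separated(10)
print(total, separated)  # 184756 vs 1: removing the divider never un-mixes
```

Even for a "gas" of ten molecules per species, the mixed arrangements
outnumber the separated one by five orders of magnitude; for real particle
counts the ratio is astronomically larger.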

There are some subtleties when you look deeper at what it really means to be
in a constrained state and how this interacts with quantum mechanics, but this
is a reasonably accurate picture of how entropy works.

------
macspoofing
I'm scratching my head trying to understand what the author is trying to say.

It feels like he doesn't like the idea of entropy for political and
ideological reasons. Reminds me of Soviet support for Lamarckianism.

I take his point about misapplying scientific theories to socio-economic
policies, and entropic thinking certainly can be misapplied in domains where
it doesn't make sense ... but is that our problem today? Entropy is not part
of the cultural and mainstream consciousness like Darwinism is. If anything,
we don't take the idea of entropy seriously enough.

~~~
aeternum
It seems like he is arguing for more environmentalism because the less energy
we use, the longer it will take to reach the heat death of the universe?

It's a pretty ridiculous way of looking at things since any human activity is
still an exceedingly insignificant entropy gain, even if looking only at our
solar system.

~~~
fsckboy
Not to mention, speaking as a human, human activity is the only justifiable
reason to support entropy gain. Like, what else is entropy worth to us? So
the universe can be here longer without us? (assuming we had any say in the
matter)

------
dehrmann
I've always found entropy to be one of the most depressing things in science.

~~~
noobermin
You should read the article then because his point is that things aren't
closed systems to begin with.

------
m12k
I'd say thermodynamics and entropy describe the woes of our time quite well:
Ecologically, we've seen that releasing sequestered carbon into the atmosphere
in order to extract its stored energy is a form of entropy that leads to a low
energy state which it takes a lot of energy to undo. It's like we've rolled a
rock down from the top of a mountain, only to realize how difficult it is to
get it back up, how long it had taken for nature to build up all that
potential energy over aeons. And economically, we're seeing that capitalism
itself seems to be a process of entropy, moving us toward low-energy states
with wealth concentrated in the hands of fewer and fewer, much like stars
forming from cosmic dust creating gravity wells into which other material
becomes more and more likely to fall. Not to mention between tax havens and
transfer pricing, we seem to be waking to the fact that our economic system is
not a closed system, but is slowly getting the fuel siphoned out of it.

------
epx
Disorder is the order we don't want :)

------
syrrim
A very boring article. The thesis is potentially interesting, but fails to get
any real attention from the author. The author proposes to investigate the
negative social and political consequences of the framing of the 2nd law of
thermodynamics, but puts no actual effort towards such an investigation.

Kelvin and Darwin are presented as equal consequences of a Victorian
worldview, whereas they seem to propose opposing tendencies. Darwin describes
a world of growth, and creation ex nihilo, whereas Kelvin describes a world of
cooling and decay. The social implications of the Darwinian thesis seem to
involve preferencing the creators and inventors of the world, while accepting
the loss of those who stagnate as not only inevitable, but necessary towards
producing growth. Kelvin's theory states that there is a limited amount of
free energy in the world, which we might take to imply that those with a
disproportionate share of resources are hoarding them and keeping them away
from others. This seems to promote a world of redistribution and
socialization.

An agrarian worldview would seem to preference a hotter world, since plants
rely on a high outside temperature to grow. We also know that photosynthesis
can be performed more efficiently at high CO2 concentrations. An entropic
worldview prefers colder external temperature, since efficiency of a heat
engine increases with temperature difference. I've seen it proposed, as a
resolution to the Fermi paradox, that certain civilizations are waiting for
the universe to cool down in order to perform computations more efficiently.
Such a proposal could only arise out of a worldview informed by entropy, of
waiting in order to more efficiently use the finite resources we have
available. The evolutionary worldview, by way of contrast, suggests that
waiting entails stagnation, and ultimately destruction, and that success goes
to whoever is most nimble.
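
The heat-engine claim is just the Carnot bound, which is easy to check
numerically (a sketch; the function name is my own):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat a reversible engine can convert to work
    between reservoirs at t_hot and t_cold (kelvin): 1 - t_cold/t_hot."""
    if not 0 < t_cold < t_hot:
        raise ValueError("need 0 < t_cold < t_hot (temperatures in kelvin)")
    return 1.0 - t_cold / t_hot

# The same hot source yields more work against a colder environment:
print(carnot_efficiency(600.0, 300.0))  # 0.5
print(carnot_efficiency(600.0, 100.0))  # ~0.833
```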

I would perceive a congruency between a worldview informed by entropy and
modern liberalism, which seems to heavily favour the principles of waiting and
of redistribution. Conservatism seems more in favour of an evolutionary
worldview, wherever it believes that someone well off deserves what they have,
or wherever it perceives a competition between groups.

It should be clear that both worldviews have some relation with the real
world, that both decay and growth are happening around us constantly. I,
personally, would highlight the evolutionary thesis as more hopeful, in that
it imagines still greater things to come, rather than the slow decay
characteristic of the entropic model. I would also point out it likely has
stronger relevance for the foreseeable future, since the death of the sun,
much less of the universe, remains an inordinately long way off still. I would
therefore tend to agree with the thesis of the author, though for none of the
reasons they set out to defend.

