
Battling Entropy - galfarragem
https://fs.blog/2018/11/entropy/
======
saagarjha
This article begins with an analogy to the thermodynamic concept of entropy,
and then attempts to relate it to other kinds of "entropy" by making the
simplification that entropy is equivalent to disorder. That's all well and
good, but then I think it goes too far: you really can't try to "reduce
entropy" in your business by looking at a physical process, nor is coughing
"the transfer of energy as heat". It gets dangerously close to the kind of
pseudoscientific platitudes that produced things like this marketing
description of the Pepsi logo, which "uses" general relativity:
[https://www.goldennumber.net/wp-content/uploads/pepsi-arnell-021109.pdf](https://www.goldennumber.net/wp-content/uploads/pepsi-arnell-021109.pdf)

~~~
esalman
Interestingly, biological beings seem to defy the second law by working to
reduce disorder, or surprise, in the sensory system. This is the meat of Karl
Friston's Bayesian brain hypothesis.

~~~
infinite8s
You forgot to include the sun in your analysis.

~~~
carapace
And the rest of the sky. ;-)

------
ForHackernews
> Let’s imagine that we start a company by sticking 20 people in an office
> with an ill-defined but ambitious goal and no further leadership. We tell
> them we’ll pay them as long as they’re there, working. We come back two
> months later to find that five of them have quit, five are sleeping with
> each other, and the other ten have no idea how to solve the litany of
> problems that have arisen. The employees are certainly not much closer to
> the goal laid out for them. The whole enterprise just sort of falls apart.

Is that true? I mean, it sounds intuitively appealing (especially if you fancy
yourself a boss-type), but has anyone actually done this experiment?

Maybe some of those 20 people are ambitious and take it upon themselves to
lead the project, maybe they all form a self-organizing collective and make
sensible decisions by consensus, maybe they hold a vote to elect a _de facto_
CEO...

~~~
gdhbcc
There was a German entrepreneur who did something like that. He picked a bunch
of young software developers and gave them an office and a budget to do what
they liked with it. It didn't end well: they spent the budget on gaming chairs
and spent most of their time playing video games.

------
wiredfool
Sonnet against Entropy

    
    
      The worm drives helically through the wood
      And does not know the dust left in the bore
      Once made the table integral and good;
      And suddenly the crystal hits the floor.
      Electrons find their paths in subtle ways,
      A massless eddy in a trail of smoke;
      The names of lovers, light of other days
      Perhaps you will not miss them. That's the joke.
      The universe winds down. That's how it's made.
      But memory is everything to lose;
      Although some of the colors have to fade,
      Do not believe you'll get the chance to choose.
      Regret, by definition, comes too late;
      Say what you mean. Bear witness. Iterate.
    

John M Ford (originally here:
[http://nielsenhayden.com/electrolite/archives/003789.html](http://nielsenhayden.com/electrolite/archives/003789.html))

------
crimsonalucard
People think that increasing entropy basically means atoms going from a state
of organization to a state of random distribution.

All you need to do is look at the universe to see how off this definition is.

The universe started with the big bang: a soup of randomly distributed
particles.

Then the atoms proceeded to self-organize into near-perfect spheres called
planets and stars, which in turn organized themselves into flat, ordered
spiral structures called galaxies.

If entropy is always increasing, how does this happen? It happens because your
understanding of entropy is off.

~~~
dougk16
I've always considered the "increasing disorder" definition of entropy to be a
simple analogy that most people can relate to in their everyday lives, not
necessarily a formal definition. Indeed, I would be interested to know if you
can even formally define order vs. disorder, since it seems to require a
subjective observer. For example, the universe doesn't consider a glass
sitting on the table to be "more ordered" than one that has shattered on the
floor. It's just like, our _opinion_, man.

But even if you use the order vs. disorder analogy, it's arguable that the
universe was close to perfectly ordered right after the big bang. After all it
was a pretty smooth and uniform distribution of matter/energy at some point,
and that seems more ordered to me than what we have now with all these random
clumps of star junk everywhere :)

Anyway, I like to think of entropy in terms of the amount of usable energy in
a system. A whole glass has more energy to release (e.g. by shattering) than a
shattered one does. Maybe this _is_ the formal definition of order vs.
disorder?

~~~
crimsonalucard
"Useable energy" is vague word and thus not a good description.

There is a quantitative definition of entropy but it is based in a way off of
"opinion". Given a system and its laws look at all possible final states. Then
group the states according to an arbitrary "rule"

For example: the state of white marbles and black marbles in a jar.

How many possible arrangements have all the black marbles on the left side of
the jar and all the white marbles on the right? This is an arbitrary "rule."
It's a big number, but it is much lower than the number of all other possible
arrangements.

From these quantities you can derive an entropy value.

However, you will note that it depends on the "rule" you define. I could point
to the state of the marbles in the jar after I shake it and call that my
"rule", and that makes that arbitrary state one of low entropy. I could point
to any arbitrary state and do this, and thus any specific state is one of low
entropy. It's a complex definition, and it encompasses "opinion" and "choice."
The definition of entropy allows a numerical value to be derived based on your
"opinion."

In Boltzmann's definition, entropy is a measure of the number of possible
microscopic states (or microstates) of a system in thermodynamic equilibrium,
consistent with its macroscopic thermodynamic properties (or macrostate).

The macrostate is basically what I defined above as "rules" and what you
called "opinion."

------
Rainymood
> The second law of thermodynamics states that “as one goes forward in time,
> the net entropy (degree of disorder) of any isolated or closed system will
> always increase (or at least stay the same).”[1] That is a long way of saying
> that all things tend towards disorder. This is one of the basic laws of the
> universe and is something we can observe in our lives. Entropy is simply a
> measure of disorder. You can think of it as nature’s tax[2].

I don't understand this. Please note that I've never had any physics training,
so I'd love for someone with a more formal physics education to help me out
here. Why is this a measure of "disorder"? In my eyes, a perfectly uniform
distribution of zero-temperature (or maybe absolute-zero, not sure) grey
energy is perfect order?

The fact that stars are forming and creating heat -- that -- is the disorder
in my eyes. We are all disturbances in the energy of the universe. I would
call a perfectly uniformly smoothed-out heat death of the universe perfect
order (i.e. the opposite of chaos).

Or am I completely wrong here?

~~~
SEMW
You're making the (very common) mistake of taking a poetic English-language
interpretation of a physical law -- which is necessarily lossy/imprecise,
since the actual law is mathematical -- and over-interpreting the words
without referring back to the actual law.

You can see the law (in the context of thermal & statistical physics) at
[https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)](https://en.wikipedia.org/wiki/Entropy_\(statistical_thermodynamics\)).
You can of course argue that 'disorder' isn't a good way of translating the
maths into English, but shrug, it's the one people seem to have settled on.
And of course, that's orthogonal to whether the actual law is true or not.

~~~
eru
'Information' is another way to translate. See
[https://en.wikipedia.org/wiki/Entropy_(information_theory)](https://en.wikipedia.org/wiki/Entropy_\(information_theory\))
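
To make the information-theoretic reading concrete, a small sketch (the
distributions below are illustrative choices of mine):

    from math import log2

    # Shannon entropy H(X) = -sum p(x) * log2 p(x), measured in bits.
    def shannon_entropy(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: fair coin
    print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: nearly certain ("ordered")
    print(shannon_entropy([0.25] * 4))    # 2.0 bits: uniform ("maximal disorder")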

------
stevedekorte
" Too little autonomy for employees results in disinterest, while too much
leads to poor decisions."

The author appears to confuse autonomy with a lack of feedback mechanisms.
Economies are based on a vast number of autonomous agents (independent
organizations) who are selected for survival via market feedback (prices) and
the long term trend in these has been towards ever greater agent granularity
(vertical disintegration).

------
bocklund
The definition of the second law of thermodynamics at the top is not right. A
closed system can have energy put in from the surroundings to decrease its
entropy. Only isolated systems, which cannot exchange energy with the
surroundings, follow the second law. Only the universe is truly isolated (see
the common definition of the second law).

You decrease the entropy (disorder if you like, or not) of things by putting
energy in.

~~~
rocqua
> A closed system can have energy put in from the surroundings to decrease
> entropy.

Then it wouldn't be a closed system anymore.

What you call an isolated system is the same as a closed system, at least in
the common parlance of thermodynamics.

~~~
bocklund
Closed means mass cannot go in or out. Heat and work can still be exchanged.

See
[https://en.m.wikipedia.org/wiki/Isolated_system](https://en.m.wikipedia.org/wiki/Isolated_system)

------
dr_dshiv
Entropy has been increasing since the big bang, when it was at its minimum
value. While the total energy hasn't changed since then, there has been a vast
increase in the number of potential locations for that energy and in the
number of different interactions possible within it. This increases entropy
because the possibility space has increased.

Boltzmann entropy is defined by the number of potential microscopic
possibilities (microstates) that could equivalently produce the measurable
macrostate which is observed. (No wonder people just call entropy disorder --
it's complicated.) So, the entropy of a glass of water consists of all the
possible combinations of position and momentum that the water molecules could
be in. When the water is frozen, the possibility space is smaller than when
the water is hot.

[https://en.m.wikipedia.org/wiki/Entropy](https://en.m.wikipedia.org/wiki/Entropy)
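
A toy illustration of that shrinking and growing possibility space (a
two-level system rather than water; the model and k = 1 are my own choices):

    from math import comb, log

    # N two-level particles; the macrostate is the number of excited
    # particles E. Multiplicity W(E) = C(N, E); entropy S = k ln W.
    N = 100
    for excited in (0, 5, 25, 50):   # "frozen" -> "hot"
        W = comb(N, excited)
        print(f"E = {excited:2d}  W = {W:.3e}  S = {log(W):6.2f}")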

How does this relate to order? I think of it in terms of the likelihood that a
random transformation of the system affects its functional interaction with
other systems. So, if you randomly transform the position or momentum of part
of a house, it is likely to change the functional interactions of the people
in that house. However, when the same material is just sitting in a pile of
rubble, it has more entropy, because a random transformation of the rubble
won't really affect its function -- it is still, for all intents and purposes,
the same pile of rubble (even though it would take just as many bits of
information to describe with precision).
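
A throwaway sketch of that intuition (the "orderliness" measure below is an
arbitrary stand-in I picked, not anything from the article):

    import random

    # "Function" here is just the fraction of adjacent pairs in
    # increasing order -- a crude proxy for functional organization.
    def orderliness(xs):
        return sum(a <= b for a, b in zip(xs, xs[1:])) / (len(xs) - 1)

    def perturb(xs, swaps=100):
        xs = list(xs)
        for _ in range(swaps):
            i, j = random.randrange(len(xs)), random.randrange(len(xs))
            xs[i], xs[j] = xs[j], xs[i]
        return xs

    house = list(range(1000))                  # highly organized
    rubble = random.sample(house, len(house))  # same parts, disorganized

    print(orderliness(house), orderliness(perturb(house)))    # 1.00 -> noticeably lower
    print(orderliness(rubble), orderliness(perturb(rubble)))  # ~0.50 -> still ~0.50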

I'm not a physicist, but a humanist who thinks it is important to understand
physics. So open to correction here.

~~~
kgwgk
Thermodynamic entropy is not a physical property of a system. It’s a property
of our description of the system as a macroscopic state.

Quantum (von Neumann) entropy is a related but different concept. It’s worth
noting that it is constant for a closed system.

Cosmological entropy can be defined in different ways. In summary, entropy
means many things and not all “entropies” behave in the same way.

~~~
eru
Interestingly enough, even though entropy is in some sense a property of a
description of a system, there are things like 'entropic forces', and they
play a big role in physics.

[https://en.wikipedia.org/wiki/Entropic_force](https://en.wikipedia.org/wiki/Entropic_force)

~~~
kgwgk
Entropic forces are not more real than centrifugal forces or the Coriolis
force. They appear when we use a thermodynamic description of a system.

That Wikipedia page doesn’t make much sense, but note that the “mathematical
formulation” is about macrostates and canonical ensembles.

See also
[https://johncarlosbaez.wordpress.com/2012/02/01/entropic-forces/](https://johncarlosbaez.wordpress.com/2012/02/01/entropic-forces/)
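
For a concrete sense of how a force can fall out of pure state-counting, here
is a sketch of the standard ideal-chain ("entropic spring") toy, with
parameters of my own choosing:

    from math import comb, log

    # 1-D freely jointed chain of N steps of +-1. End-to-end distance x
    # has multiplicity W(x) = C(N, (N + x) / 2), so entropy S = ln W
    # falls as |x| grows, and -T dS/dx acts as a restoring tension.
    N, T = 100, 1.0

    def S(x):
        return log(comb(N, (N + x) // 2))

    for x in (0, 20, 40, 60):
        tension = -T * (S(x + 2) - S(x)) / 2   # finite-difference -T dS/dx
        print(f"x = {x:2d}  S = {S(x):6.2f}  tension = {tension:+.3f}")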

~~~
eru
Yes, no more and no less.

------
rubenhak
This was a good read.

