Entropy is a phenomenon of probability. It works because probability works.
People think entropy is some fundamental natural phenomenon.
No. Entropy is a consequence of probability. The math works because probability happens to apply to nature. Below is the intuition behind why entropy occurs... once you realize this, it will make sense.
Disordered states tend to be more numerous than ordered states, which is why any random configuration of gas particles in a system is more likely to be disordered.
Also, when you perturb a system of random particles from state 1 to state 2, state 2 will by probability be disordered, because there are far more possible disordered states than ordered states... even when the initial state is ordered. Hence entropy increases.
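Here's a toy model in Python that makes the counting argument concrete (the particle count and the left/right split are illustrative, not physical):

```python
from math import comb

# Toy model: N gas particles, each independently in the left or right
# half of a box. A microstate is the full left/right assignment; a
# macrostate is just "how many particles are in the left half".
N = 100
total = 2 ** N  # every microstate is equally likely

for k in (0, 25, 50):  # "all on the right", "lopsided", "evenly mixed"
    micro = comb(N, k)  # microstates with exactly k particles on the left
    print(f"{k:3d} on the left: {micro:.3e} microstates, "
          f"probability {micro / total:.3e}")
```

The perfectly "ordered" macrostate (all particles on one side) corresponds to a single microstate, while the "disordered" 50/50 macrostate corresponds to roughly 10^29 of them, so a random perturbation almost always lands somewhere disordered.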
The philosophical thing people should be examining is the nature of probability and reality, which we already know is intrinsically tied to the nature of fundamental particles.
Probability is an axiom; entropy is a theorem derived from that axiom. Calling it a "law" makes it seem fundamental, but it's not: entropy can be derived assuming probability holds.
This seems like a good angle from which to approach the question. The way I conceptualize it, entropy is something like the inverse of data. What makes one state less ordered than another? The more ordered state requires less data to be fully described.
In that sense there is an underlying reality to entropy, but it's impossible to be sure: an ordered state could seem disordered if we lack the data that would describe it. I think we understand entropy to the same degree we understand data, and as data science develops I hope we'll develop a better understanding of entropy as a consequence.
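To make "requires less data to be fully described" concrete, here's a quick Shannon-entropy sketch in Python (the toy strings just stand in for states; this is a sketch, not a formal treatment):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Average bits per symbol needed to describe seq's symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: fully ordered, trivially described
print(shannon_entropy("abababab"))  # 1.0 bit per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits: maximally mixed
```

The more ordered the sequence, the fewer bits per symbol you need, which is the sense in which entropy is the inverse of data.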
Your intuition gets you to the right place, but I think it's a bit flawed.
Entropy must be described relative to a system and an arbitrary definition of macrostates and microstates. It actually has nothing to do with disorder or order; I used the term earlier because it helps with intuition, but it is categorically wrong.
For example, take a system of 5 loaded dice that each roll a six 99% of the time, with a microstate defined as the value of each die after a roll and a macrostate defined as the number of 6s.
With this system, entropy goes up as you roll more sixes. Order also goes up with entropy. Entropy is an arbitrary concept that is defined relative to your choice of a "system", "macrostates", and "microstates". You can choose systems that have higher probabilities of being ordered, say magnetic cubes in a box versus regular cubes. The magnetic cubes are more likely to be stacked perfectly, and that is defined as a higher-entropy state even though there is "less" information needed to "describe" it.
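A quick sketch of that dice example in Python (the five dice and 99% loading come straight from the setup above; the rest is illustrative):

```python
from math import comb

def macrostate_probs(p_six, n=5):
    """P(k sixes out of n dice) when each die shows a 6 with probability p_six."""
    return {k: comb(n, k) * p_six**k * (1 - p_six)**(n - k)
            for k in range(n + 1)}

for label, p in (("fair dice", 1 / 6), ("loaded dice", 0.99)):
    probs = macrostate_probs(p)
    likeliest = max(probs, key=probs.get)
    print(f"{label}: most probable macrostate = {likeliest} sixes "
          f"(P = {probs[likeliest]:.3f})")
```

With fair dice the likeliest macrostate has zero sixes; with the loaded dice it's the perfectly "ordered" all-sixes roll, which is the sense in which order and entropy can rise together.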
I've never read about information theory, but I'm assuming that the choice of "system" in information theory is usually pretty simple: you can have numbers like the dice, but you usually don't work with "loaded" dice. So consider the case where each microstate of each entity has an equal probability of occurring. In this case there is a direct relationship between whether data describing something can be compressed and the entropy of that something: the higher the entropy, the less it can be compressed.
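You can see that relationship directly with an off-the-shelf compressor (a toy demo, not a claim about any particular physical system):

```python
import os
import zlib

# Low-entropy data: one repeated byte. High-entropy data: uniform random bytes.
samples = {"ordered": b"A" * 10_000, "random": os.urandom(10_000)}

for name, data in samples.items():
    compressed = zlib.compress(data, level=9)
    print(f"{name}: {len(data):,} bytes -> {len(compressed):,} bytes")
```

The repeated bytes collapse to a few dozen bytes, while the random bytes barely shrink at all.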
Information theory places idealized restrictions on microstates... However, this is not the case in nature. Suppose the microstate of our universe of atoms is defined as the Cartesian coordinates of each atom. Atoms have a tendency to coalesce into spheres (planets, stars, black holes) due to gravity, and as a result each microstate does not have an equal probability of occurring.
Planets are in fact a higher-entropy state than the cloud of dust the planet initially started out as. I can now describe all those atoms in compressed form (a macrostate): "planet." That leads to the opposite relationship between compressibility and entropy: in this case, the higher the entropy of a system, the less information is needed to describe it.
Interesting. So when you say that entropy is an arbitrary concept, is it simply determined by the likeliest macrostates? So in the case of planets vs dust clouds, would we say that planets have higher entropy than dust clouds because entropy is defined as increasing with the likeliest states and we know that planets are likelier?
To put it another way, could someone use their understanding of entropy to make predictions about a system's probability distribution beyond what they already know of it? Or is the entropy of a state purely defined by its proximity to a basin of attraction?
I'm reminded of a debate in physics about what to do when the time light takes to travel between fixed points changes: a minority prefers to change the definition of c in m/s, while most prefer to change the value of m in the portion of the universe we're in at the time. To ask the same question in yet another way: is entropy like c to the majority? If it appears to decrease, is it our understanding of the probabilities for the system that needs to be reworked?
>To ask the same question in yet another way: is entropy like c to the majority? If it appears to decrease, is it our understanding of the probabilities for the system that needs to be reworked?
You will see it is in line with what I'm talking about. Entropy is an arbitrary concept defined relative to your choice of a system (entities, axioms, and theorems) and arbitrary macrostates and microstates defined for that system.
Nobody has this down; they all talk about entropy without defining what kind of entropy they're talking about. Usually it's a half-baked definition involving temperature and conservation of energy.
In general, when you add conservation of energy into your system, microstates, and macrostates, it becomes more in line with the popular notion of entropy. For atoms to self-organize into planets they must stop moving, but because of conservation of energy that motion must be transferred into something else. So in general, self-organization in one part of the universe must mean another part of the universe gets hotter.
If I define a macrostate and microstate to account for conservation of energy, meaning the microstate must ensure that if a particle stops moving another one starts moving, then things usually fit the notion of getting more disordered over time.
If I define the microstate to be just the positions of atoms, then things appear to become ordered over time as atoms coalesce into spheres.
So there are different perspectives on it, and all of them are true. It's just that one perspective examines the phenomenon of conservation of energy and the other does not.