The Second Law of Thermodynamics (2011) (franklambert.net)
145 points by luu 9 months ago | 67 comments



As another comment mentioned, this website does look like Time Cube at first sight.

However, the explanations of the second law of thermodynamics on the second page are quite decent and written in a humorous way. Of course it is not fully accurate, because it does not use any math, but I think it does a good enough job of explaining it to the lay person.

The explanations about human life on the third page are analogies at best. The situations that the author describes are similar to the workings of the second law but are not a first-principles outcome of it.


> Don't put me down. I could have snowed you with differential equations and diagrams instead of what you see everyday. We're being practical and visual rather than going the math route, essential as that is in chemistry.

> The big deal is that all types of energy spread out like the energy in that hot pan does (unless somehow they're hindered from doing so). They don't tend to stay concentrated in a small space.

I am trying to loosely connect big ideas here, so I might be wrong. If there is a fundamental feature of a universal law, then that feature must manifest itself at all scales, as the above statements try to put forward visually. Maybe this idea of flow spreading out is very general, and some summarization of finer-grained flow into coarser flow, in the form of Green's theorem or Stokes' theorem, lies behind it.

Kinematic Flow and the Emergence of Time

https://arxiv.org/abs/2312.05300


This is what confuses people. There is this universal law, but you already know about it.

It's probability. Increasing Entropy is a result of probability. That's all it is.

When you have a bunch of particles and you jostle them, it is MORE probable for the particles to become spread out than it is for them to become concentrated in one corner. That probability is what is behind this mysterious force called entropy.

Why is it more probable? You just count the number of possible states. There are MORE possible "spread out" states than there are "concentrated" states. In most systems there are more disorganized states than there are organized states.

Think of it in terms of dice. If you roll 6 dice, how likely are you to get some random spread of numbers vs. all the numbers concentrated on 6? Or all numbers concentrated on 1?

It's more probable to get a random spread of numbers because there are astronomically more possibilities there. For all numbers concentrated on 1, 2, 3, 4, 5, or 6 you only have a total of 6 possible states: all ones, all twos, all threes... all sixes. That's six states in total.

A random spread occupies 46650 possible states (6^6 - 6). Hence, by probability, things are more likely to become disordered and spread out, simply because there are more possible disordered states.
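A quick brute-force check of that counting, in Python (just a sketch; the 6 dice stand in for particles):

    from itertools import product

    # Enumerate every microstate of 6 six-sided dice and sort each into
    # a "concentrated" (all equal) or "spread out" macrostate.
    all_same = 0
    spread_out = 0
    for roll in product(range(1, 7), repeat=6):
        if len(set(roll)) == 1:
            all_same += 1
        else:
            spread_out += 1

    print(all_same)    # 6
    print(spread_out)  # 46650 = 6**6 - 6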

Entropy is a phenomenon of probability. People mistake it for some other fundamental law that mysteriously occurs. No it's not, it makes sense once you understand probability.

The real question is, what is probability? Why does it happen to work? Why does probability seem to follow an arrow of time, it doesn't seem symmetrical like the rest of physics.


It makes even more sense when you take the law of large numbers into account. The scale we're experiencing most things on is /so/ far removed from the scale on which these probabilities are being expressed.

There are more molecules in a cup of water (on the order of 10^24) than there are cups of water in the ocean. If you have a cup of water's worth of matter, you aren't just rolling 10 dice (or even 1000 dice) and looking for mostly 6s. You're rolling a few septillion dice and hoping for a significantly non-normal distribution. It just isn't feasible.
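A rough back-of-envelope check of that claim (assumed figures: a 250 mL cup and ~1.3e9 km^3 of ocean):

    AVOGADRO = 6.022e23                      # molecules per mole
    molecules_per_cup = 250 / 18 * AVOGADRO  # water is 18 g/mol, ~1 g/mL: ~8.4e24
    cups_in_ocean = 1.3e9 * 1e15 / 250       # 1 km^3 = 1e15 mL: ~5.2e21
    print(f"{molecules_per_cup:.1e} molecules vs {cups_in_ocean:.1e} cups")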


I don't buy it. You can't say entropy is probability and then say we don't know what probability really is. It's foundational to both physics and computer science. I could say probability is really just entropy just as easily. I would go further and say that time is entropy, as we measure time by observing entropy.


you don't buy it? This is foundational. This isn't something I'm making up. It's the formal definition of entropy.

https://www.labxchange.org/library/items/lb:LabXchange:ac117....


Thanks for linking that. My point really was that entropy is many things and probability is among them.


I think his point is that it is not clear why every microstate is equally probable.


It's an assumption we make due to imperfect knowledge of a system. We assume all microstates have equal probability.

Just like rolling 6 dice. We assume all configurations of the 6 dice have equal probability.

What's the probability of rolling exactly 1,1,2,6,3,5? It's the exact same probability as rolling 6,6,6,6,6,6. We instinctively assign these probabilities based on the assumption that each number has a 1/6 chance, so rolling exactly a certain number for each of the 6 dice yields (1/6)^6.

It's the macrostates that subjectively group these microstates in various ways. If I pick the macrostate where all dice are the same, there are only 6 microstates for that. If I make up the macrostate where at least one die is different, that's every possible microstate except the 6 where they are all the same.
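To make the grouping concrete, here is a sketch with exact fractions (same 6-dice setup as above):

    from fractions import Fraction

    p_micro = Fraction(1, 6) ** 6          # any one specific sequence: 1/46656
    p_all_same = 6 * p_micro               # macrostate "all equal": 1/7776
    p_not_all_same = (6**6 - 6) * p_micro  # macrostate "not all equal": 7775/7776
    print(p_micro, p_all_same, p_not_all_same)

Every microstate is equally improbable; the macrostates differ only in how many microstates they lump together.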


this is great and it does make perfect sense to me, someone with no stat thermo background. but it makes me wonder

how is it then, that things are becoming more spread out over time?

what is the property about the past, that it seems to have a lot of uncommon states and not a lot of the common ones?

if common states are mathematically more likely to be common, why is it that the future has them and the past does not, in general?

like, why hasn't heat death happened? the probability thing seems to be almost proof-like, you cannot really argue against it. but clearly, the universe did exactly that, to a great degree, at some point in the past. why?


> how is it then, that things are becoming more spread out over time?

This may help: https://www.researchgate.net/figure/Classical-evolution-in-p...

If we knew the present we could predict the future - at least in classical physics. We have only a very approximate knowledge of the present though, based on a macroscopic description. We can still predict a range of outcomes and see what it means in macroscopic terms. The initial set of states consistent with what we know “spreads out” as time goes by. We “lose” information by keeping only a “coarse” macroscopic description.


> what is the property about the past, that it seems have a lot of uncommon states and not a lot of the common ones?

This is called the Past hypothesis.

https://en.wikipedia.org/wiki/Past_hypothesis


>how is it then, that things are becoming more spread out over time?

It's more likely for things to spread out than to concentrate in one corner when you randomly move all particles in a box. If all particles moved to one corner of a box you would assume there's an intelligence at work moving the particles, because such movement is too low-probability to happen without intelligent intervention.
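A toy simulation of that point (numbers assumed for illustration; a 1-D box where "concentrated" means all particles in the left half):

    import random

    N, TRIALS = 20, 100_000
    hits = sum(all(random.random() < 0.5 for _ in range(N))
               for _ in range(TRIALS))
    print(hits / TRIALS)  # observed frequency, usually 0
    print(0.5 ** N)       # exact probability, ~9.5e-7 even for just 20 particles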

>if common states are mathematically more likely to be common, why is it that the future has them and the past does not, in general?

Common states are called "common" because there are more of them in general. Think in terms of things with a few states, like rolling dice. You have a machine that rolls dice continuously and checks the result. Use this as an analogy for particles of gas moving around in a box, and then use that as an analogy for the universe.

Nobody knows why probability works this way. Probability is what differentiates the arrow of time.

>like, why hasn't heat death happened? the probability thing seems to be almost proof-like, you cannot really argue against it. but clearly, the universe did, to a great degree, at some point in the past. why?

Entropy is just a phenomenon of probability. Don't let the concept of entropy rule your brain and override what's going on. Heat death hasn't happened because entropy can be frozen: freeze water into ice and the particles stop moving.

Additionally, all of what I said above doesn't apply to things with gravity like black holes. With gravity, things begin to automatically self-organize into circular orbits or spherical planets. Why? It's because entropy isn't measuring disorder. That's a mistake. Entropy is just saying that systems drift towards high-probability macrostates. In systems with gravity, spherical shapes and circular orbits ARE a higher-probability macrostate than one where the particles are disordered. In such a system a higher-entropy macrostate is actually MORE ordered than a lower-entropy macrostate.

I don't know if there's going to be a "heat" death, but for sure we are moving towards higher entropy as the law says. But this does not necessarily mean more disorder or things getting spread out.

That's it. I think the word entropy just confuses everyone. Someone observed these weird phenomena with heat and called it entropy. Then we realized it's just a bunch of particles moving into high-probability patterns.


But that's a semantics game. Sub probability for entropy. Why do we live in a world where low probability states were in the past and high probability ones are in the future? What intrinsic property of the universe causes this asymmetry? One can imagine a symmetric k-negative universe where high probability macrostates trend towards low probability macrostates. Or a k-zero universe where the dice never rolls.

None of such questions follow definitionally from the second law ^H^H^H probability.


Yes. It is a semantics game. I feel people understand probability but they don't understand entropy hence it's easier to just use the term probability state.

And yes the questions you pose don't follow from the 2nd law. But they are the big question.


> Why does probability seem to follow an arrow of time, it doesn't seem symmetrical like the rest of physics.

One cannot really oppose probability to "the rest of physics". Probability is not "a part of" physics. Probability is what we use to describe imperfect knowledge of a physical system - and we know more about the past than we do about the future.


No, probability is a mathematical game with axioms and theorems.

Why the rules of this game happen to describe systems where we have imperfect knowledge... nobody knows.

Another thing is that all of these systems have probability travel in a single direction. Things with high probability are more likely to happen. If time were to go backwards, low-probability events would start to spontaneously occur.


> Another thing is that all of these systems have probability travel in a single direction. Things with high probability are more likely to happen. If time were to go backwards, low-probability events would start to spontaneously occur.

If I shoot a pool ball to strike a heavier second ball which is at rest, they will end up moving in opposite directions. If "time were to go backwards" - whatever that means - they would approach each other, and at the end one would be at rest. That seems indeed unlikely with "time going forwards", because we wouldn't be able to do that if we tried (at least not systematically).

I don't think there is a conceptual problem or anything surprising there: we know how balls move given their initial conditions but we cannot control those initial conditions with the precision required to obtain a precise outcome.

There are also cases where we can prepare systems in the right configuration and produce "low probability" events even with "time going forwards": https://en.wikipedia.org/wiki/Spin_echo


>If I shoot a pool ball to strike a heavier second ball which is at rest, they will end up moving in opposite directions. If "time were to go backwards" - whatever that means - they would approach each other, and at the end one would be at rest. That seems indeed unlikely with "time going forwards", because we wouldn't be able to do that if we tried (at least not systematically).

In a vacuum no ball will ever come to rest if it's moving. It will move forever because there is no resistance.

What's happening on the pool table is that the ball is losing its kinetic energy to the table. The table is absorbing it, the air is resisting it, and slowly that vibrational energy becomes more and more spread out until it's basically imperceptible heat (which is also technically atoms vibrating).

What happens when time goes backwards is that a bunch of tiny low-probability events start happening. Heat from the background vibrating atoms, by pure random luck, happens to align and produce motion that's noticeable. This happens in several places, and by pure luck all of this vibrational motion concentrates in one place: the pool table and the ball. The kinetic vibrations just happen to push the ball in one direction more and more, with every vibration by pure luck speeding it up. The ball keeps picking up speed until you, with the tip of the pool cue, catch it and ease it into a perfect stop, absorbing all the kinetic energy into the stick and your body.

All within the laws of physics but all extremely low probability events.

>There are also cases were we can prepare systems in the right configuration and produce "low probability" events even with "time going forwards": https://en.wikipedia.org/wiki/Spin_echo

When you put an intelligence in the system it's sort of cheating, as you can manipulate random events to be non-random and thus violate the laws of probability by intelligent choice. There's some computational theory that states that the act of thinking itself produces entropy, so it's sort of conserved in a way, but that's some other theory stuff, another rabbit hole to dive into.


> In a vacuum no ball will ever come to rest if it's moving.

When a moving ball hits another massive ball that was in its way it’s pretty safe to assume that it was not moving in a vacuum.


>they would approach each other, and at the end one would be at rest.

I'm referring to this. This doesn't happen in space. It's on a table on earth.


If in “time going forwards” a ball hits another one at rest and you “reverse time” right after the collision, won't the “time going backwards” get that ball to rest again? (You're the one who mentioned physics being symmetrical.)

The point is that with just two things interacting with “time going backwards” you “predict” unlikely things to “happen” and we know that it’s because the “initial” conditions are the exact ones that would make such things happen. It doesn’t seem a big mystery.

In the “many, many, many, we-don't-even-know-how-many” things interacting case we would encounter something similar. The “initial” conditions if we had “time going backwards” are much more unlikely and the “outcome” much more unexpected, because in reality we know almost nothing about the state of the system. But we know that those “initial” conditions are “special” - it's not more mysterious than the simpler case.


Ok, so we agree that probability is not a part of physics. I also agree that the question of how to apply probability in physics is interesting.

I’m not sure about the “nobody knows” though. I would say that statistical mechanics has been quite successful in “knowing”: https://arxiv.org/pdf/cond-mat/0501322


Haven't you heard of quantum mechanics? The probability wave makes probability a foundational part of physics, not just a macro phenomenon.

Statistical mechanics is just the study of the application of probability to those macro phenomena. We still don't know why all of it works.


> Haven't you heard of quantum mechanics.

Haven't you looked at the article I sent, which says "quantum" on every other page? Haven't you heard of its author Roger Balian and his (two-volume) book From Microphysics to Macrophysics: Methods and Applications of Statistical Physics?

> Statistical mechanics is just a study of the application of probability to that macro phenomena. We still don't know why all of it works.

"We" may "know" different things - and have a different view on the relative importance of what is known and what it isn't.


A neat little corollary to this is to look a little more closely at what temperature actually is. "Temperature" doesn't appear too often in the main explanation here, but it's all over the "student's explanation". So... what is it?

The most useful definition of temperature at the microscopic scale is probably this one: 1/T = dS / dU, which I've simplified because math notation is hard, and because we're not going to need the full baggage here. (The whole thing with the curly-d's and the proper conditions imposed is around if you want it.) Okay, so what does that mean? (Let's not even think about where I dug it up from.)
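(For reference, the curly-d version, holding volume and particle number fixed - this is the standard textbook form, not something specific to this article:

    1/T = (∂S/∂U) at constant V, N

)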

It's actually pretty simple: it says that the inverse of temperature is equal to the change in entropy over the change in energy. That means that temperature is measuring how much the entropy changes when we add or remove energy. And now we start to see why temperature is everywhere in these energy-entropy equations: it's the link between them! And we see why two things having the same temperature is so important: no entropy will change if energy flows. Or, in the language of the article, energy would not actually spread out any more if it would flow between objects at the same temperature. So there's no flow!

The whole 1/T bit, aside from being inconvenient to calculate with, also suggests a few opportunities to fuzz-test Nature. What happens at T=0, absolute zero? 1/T blows up, so dS/dU should blow up too. And indeed it does: at absolute zero, any amount of energy will cause a massive increase in entropy. So we're good. What about if T -> infinity, so 1/T -> zero? So any additional energy induces no more entropy? Well, that's real too: you see this in certain highly-constrained solid-state systems (probably among others), when certain bands fill. And you do indeed observe the weird behavior of "infinite temperature" when dS/dU is zero. Can you push further? Yes: dS/dU can go negative in those systems, making them "infinitely hot", so hot they overflow temperature itself and reach "negative temperature" (dS/dU < 0 implies absolute T < 0). Entropy actually decreases when you pump energy into these systems!

These sorts of systems usually involve population inversions (which might, correctly, make you think of lasers). For a 2-band system, the "absolute zero" state would have the lower band full and the upper band empty. Adding energy lifts some atoms to the upper band. When the upper and lower band are equally full, that's maximum entropy: infinite temperature. Add a little more energy and the upper band is now more full than the lower: this is the negative temperature regime. And, finally, when everything's in the upper band, that is the exact opposite of absolute zero: the system can absorb no more energy. Its temperature is maximum. What temperature is that? Well, if you look at how we got here and our governing equation, we started at 0, went through normal temperatures +T, reached +infinity, crossed over to -infinity, went through negative temperatures -T, and finally reached... -0. Minus absolute zero!
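A small numerical sketch of that two-band picture (a toy model I'm adding, not from the article): N two-level systems with n excited, entropy S = ln(microstate count) in units of k_B, and energy U = n with the level spacing set to 1.

    import math

    N = 1000
    def S(n):                     # entropy of the macrostate "n of N excited"
        return math.log(math.comb(N, n))

    for n in (100, 500, 900):
        inv_T = S(n + 1) - S(n)   # finite-difference dS/dU
        print(n, round(inv_T, 3))
    # n=100: 1/T > 0  (ordinary positive temperature)
    # n=500: 1/T ~ 0  (half-full bands: "infinite temperature")
    # n=900: 1/T < 0  (population inversion: negative temperature)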

(Suck on that, IEEE-754 signed zero critics?)

And all that from our definition of temperature: how much entropy will we get by adding a little energy here?

Thermodynamics: it'll hurt your head even more than IEEE-754 debugging.


I like the following related explanation (https://www.reddit.com/r/thermodynamics/comments/owhkiv/comm...) :

> Many people focus on the statistical definition of entropy and the fact that entropy increases for any spontaneous process. Fewer people are familiar with thinking about entropy as the conjugate thermodynamic variable to temperature. Just as volumes shift to equalize pressure, areas shift to equalize surface tension, and charges shift to equalize voltage, entropy is the "stuff" that shifts to equalize temperature. (Entropy is of course also unique in that it's generated in all four processes.) Entropy is thus in some ways the modern version of the debunked theory of caloric.
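In symbols, each of those conjugate pairs enters the energy differential the same way (one common sign convention, with σ for surface tension and φ for voltage):

    dU = T dS - P dV + σ dA + φ dq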


> Just as volumes shift to equalize pressure, areas shift to equalize surface tension, and charges shift to equalize voltage, entropy is the "stuff" that shifts to equalize temperature.

I remember watching videos of Leonard Susskind in which he talked about a similar phenomenon, where circuit complexity itself increases until it maximizes. It behaves similarly to entropy.

Complexity and Gravity - Leonard Susskind

https://youtu.be/6OXdhV5BOcY?t=3046

https://www.quantamagazine.org/in-new-paradox-black-holes-ap...


I like this explanation, but I feel it builds on a good understanding of entropy


If you want an independent definition of temperature without reference to entropy, you might be interested in the Zeroth Law of Thermodynamics (https://en.wikipedia.org/wiki/Zeroth_law_of_thermodynamics).

Here is an intuitive explanation for it from [1]:

“Temperature stems from the observation that if you bring physical objects (and liquids, gases, etc.) in contact with each other, heat (i.e., molecular kinetic energy) can flow between them. You can order all objects such that:

- If Object A is ordered higher than Object B, heat will flow from A to B.

- If Object A is ordered the same as Object B, they are in thermal equilibrium: No heat flows between them.

Now, the position in such an order can be naturally quantified with a number, i.e., you can assign numbers to objects such that:

- If Object A is ordered higher than Object B, i.e., heat will flow from A to B, then the number assigned to A is higher than the number assigned to B.

- If Object A is ordered the same as Object B, i.e., they are in thermal equilibrium, then they will have the same number.

This number is temperature.”

[1] https://physics.stackexchange.com/a/727798/36360


From later in [1]

> Mind that all of this does not impose how we actually scale temperature.

> How we scale temperature comes from practical applications such as thermal expansion being linear with temperature on small scales.

An absolute scale for temperature is determined (up to proportionality) by the maximal efficiency of a heat engine operating between two reservoirs: e = 1 - T2/T1.

This might seem like a practical application, but intellectually, it’s an important abstraction away from the properties of any particular system to a constraint on all possible physical systems. This was an important step on the historical path to a modern conception of entropy and the second law of thermodynamics [2].
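A quick worked example (illustrative numbers): between reservoirs at T1 = 600 K and T2 = 300 K, no engine whatsoever can beat e = 1 - 300/600 = 50%, independent of its working substance or construction.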

[1] https://physics.stackexchange.com/a/727798/36360

[2] https://bayes.wustl.edu/etj/articles/ccarnot.pdf


Yes, but this still allows infinitely many "temperature" scales. I.e. take the current definition of temperature, and apply any strictly increasing function to it.


More intuitively: that TdS has the same "units" as -PdV suggests that temperature [difference] is a "pressure" (thermodynamic potential) that drives entropy increase.


It's also precisely what will show up if you use Lagrange multipliers to maximize entropy given a fixed energy. (though for that to make sense you're no longer looking at a single state, you're optimizing the probability distribution itself)
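A sketch of that calculation (toy numbers, using scipy's constrained optimizer): maximize S = -Σ p ln p for a three-level system with the mean energy pinned, and the Boltzmann distribution p_i ∝ exp(-β E_i) falls out, with β = 1/kT playing the role of the energy constraint's Lagrange multiplier.

    import numpy as np
    from scipy.optimize import minimize

    E, U = np.array([0.0, 1.0, 2.0]), 0.7       # energy levels, target mean energy
    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0},
            {"type": "eq", "fun": lambda p: p @ E - U}]
    res = minimize(lambda p: np.sum(p * np.log(p)),  # minimizing -S maximizes S
                   np.ones(3) / 3, constraints=cons,
                   bounds=[(1e-9, 1.0)] * 3)
    p = res.x
    beta = np.log(p[0] / p[1]) / (E[1] - E[0])  # read off the multiplier
    print(p, beta)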


Yeah I have been ruminating on the strange coincidence in the naming of Lagrange multipliers, Lagrangian, Lagrangian duals..

(See below about my comment on convex conjugates and delimited continuations)


It has the same “units” (if you mean “energy”) as mc^2 as well, and that doesn't suggest anything to me… Your intuition is much better than mine - or it's informed by what you know about temperature.


Sorry! I meant they have the same form as used in the energy differential (1-form), but I had thought "units" would make more sense. In fact, this comparison was how I came to the intuition, although, as you coyly suggested, I did do a check with my earlier intuitions..

https://physics.stackexchange.com/questions/415943/why-does-...


I agree that thermodynamic relations - and Legendre transformations - are fascinating. I don’t think I ever fully understood them though - at least not to the point where they became “intuitive” :-)


Erm, sorry again to have implied they were intuitive; all I meant was that it was relatively intuitive --maybe I should have said "retrievable in a high-pressure concept-doodling game"-- compared to a wall of text..


No need to apologize! I was joking, I think I get what you mean.


If you let me flash you my (still indecent) state of intuition..

"convex conjugates" (more precisely but limited sense "momentum maps") are delimited continuations in a optimization algorithm.

https://en.wikipedia.org/wiki/Convex_conjugate

https://en.wikipedia.org/wiki/Delimited_continuation


Delimited continuations are to exponentials as convex conjugates are to implications?


I'm pretty sure I don't understand the possible meanings of what you said there either so let's try :)

<layman-ish op-research lingo>

I meant that the tangent to the convex conjugate ("momentum") provides bounds on what the values returned by the dual step in a primal-dual algo should be. I don't know which meaning of "exponential" I should focus on here (the action perhaps? A power set? A probability distribution?), but "implications" seem to refer to a constraint on outputs contingent on the inputs so I will go with that. Delimited continuations seem to be the closest thing I found in the PL lit, aka wikipedia, feel free to suggest something less kooky :)

</lol>


That makes much more sense than my flash, which had been following a spark in the other direction:

Delimited continuations are functions, and as such (in a world of algebraic types where we can take sums and products of types) exponentials of types, ran^dom.

[in particular, with the substitution of isomorphism for equality they follow the normal K-12 rules: C^(A+B) ~= C^A * C^B, etc.]

I'd just been glancing at https://en.wikipedia.org/wiki/Convex_conjugate#Examples and the pattern by which a single-branched f(x) seems to often become a multibranch f*(x) reminded me of how logic-reversing functions in general and logical implication in particular "add branching": if we wish to establish x'<=5 then if x is already <= 5 we may `skip` but otherwise we must calculate x-5 (and then subtract it off); similarly an implication x->y may be satisfied on one branch by not x but on the other requires y.

[and on the general topic: I like to think of temperature as tying together energy and entropy, where positive temperatures yield the familiar relationships but negative temperatures "unintuitive" ones]


:)

I'm guessing you might have wanted to sloganize that as: "DC's are algebraic" https://cstheory.stackexchange.com/questions/42431/how-to-te...

This reverse flash might be what could motivate me to make the connection useful.. an exercise in geometric vengeance (and intuition building) for me to use backtracking DCs in optimization problems (engineering => SDP/IPs)? now to find a plug-and-chug example..

A bunch of timely puns here, including "thermometer continuations" https://arxiv.org/pdf/1710.10385

Besides negative T occurring in situations where the arrow of t appears reversed..., PG13 "exponentials turn sums into products"

There is also the pun where S stands for both "action" and "entropy" so that's another direction in which to hunt for the Lagrange multiplier/Lagrangian-Hamiltonian connecting unicorn e.g. picking the "most representative", not necessarily the most optimal path.


> picking the "most representative", not necessarily the most optimal path.

My impression was that stationary/coherent action picks out inflection points, which happen to be optimal in many cases, but are not necessarily?


I don't know if this counts as an indecent flash, because.. well it is a half-formed opinion (malformed intuition?) born of recent experience..

It's hard to describe this personal experience succinctly, nevertheless I can relate it to wizened physicists often marvelling at having terms miraculously cancel when they engage in the voudou popularly known as path integration


Does the temperature actually change discontinuously in a physical system from -infty to +infty, or is it a theoretical artifact that does not show up experimentally?


Depending on what you mean by “discontinuously” it always does: the microscopic world is “discrete”.

Instead of thinking of “temperature” you may think of “inverse of temperature” and then there is no issue with that number going “continuously” from very negative to very positive.


Is the special case of 1/T = 0 also known as the Big Bang?


Interesting - you're a great writer!


There is also an interesting relation between the second law of thermodynamics and the cosmological principle (which says "the distribution of matter is homogeneous and isotropic on large scales"):

The second law of thermodynamics says that the universe has an entropy gradient in the time dimension, while the cosmological principle says that the universe has no matter gradient in the spatial dimensions.

So together they describe how the universe (space-time) is structured, i.e. on the temporal dimension and the spatial dimensions.

It's also noteworthy that one enjoys the honorific "law" while the other is merely called a "principle". I wonder whether this is just an historical artifact or whether there is some theoretical justification for this distinction. (My intuition is that both are more "principles" [approximate tendencies?] than fundamental laws, since they don't say what's possible/impossible but rather what's statistically likely/unlikely.)


>merely called a "principle"

Merely a principle? In science, principles are what mathematicians call axioms: not proven, but taken as true because you have to start somewhere and it is the only thing that makes sense.

The cosmological principle is the philosophical position that physics works the same everywhere. We haven't done physics experiments across the universe, so we can't call it a law because there is not enough experimental evidence.


You are confusing the cosmological principle with the uniformitarian principle [1]. The cosmological principle concerns the large-scale distribution of matter (not of laws) in the universe, and that is something we arrived at empirically, through observations with large telescopes (galaxies are approximately equally distributed in all directions) and the measurement of the cosmic microwave background. It doesn't get more empirical than that. It's not something like an axiom at all, just an approximate generalization.

> We haven't done physics experiments across the universe, so we can't call it a law because there is not enough experimental evidence.

That's equally true for the second law of thermodynamics: we haven't done physics experiments in the distant past or in the future direction, so strictly speaking we can't be certain that entropy doesn't decrease in the future. But no law is perfectly confirmed (not tested in every conceivable circumstance), so that can't be a criterion for lawhood anyway.

But I already proposed a better criterion: A (fundamental) law says what is possible and impossible, and neither the cosmological principle nor the second law ("law") of thermodynamics do that.

[1] https://en.wikipedia.org/wiki/Uniformitarianism


Considering I quoted the Wikipedia page on the Cosmological principle, I disagree but IDK maybe Wikipedia is wrong.

https://en.wikipedia.org/wiki/Cosmological_principle

As far as I am aware Uniformitarianism is basically the same thing but covers a bit more since it includes "natural processes" like the results of erosion and is used by Earth science types. Cosmological principle is used at the cosmological scale so they don't bother including planetary natural processes in their axioms.


The classic Flanders and Swann explanation: https://www.youtube.com/watch?v=VnbiVw_1FNs

Excerpts: No one can consider themselves educated who doesn't understand the basic language of science - Boyle's law: the greater the external pressure, the greater the volume of hot air. I was somewhat shocked to learn my partner not only doesn't understand the 2nd law of thermodynamics, he doesn't even understand the first!

: Heat won't pass from the cooler to hotter! You can try it if you like but you'd far better notta!

M: Heat is work and work's a curse
M: And all the heat in the universe
M: Is gonna cool down,
M: 'Cos it can't increase
M: Then there'll be no more work
M: And there'll be perfect peace
D: Really?
M: Yeah, that's entropy, Man.


Thank you for posting this! I'd never heard it, and it's great.


thanks, good one


I could never wrap my head around the abstract concepts used in these explanations because they don't connect to what is actually happening at the atomic level. As far as I could tell the actual particles are undergoing a constant process of reducing the potential energy induced by force fields between them, which means everything is just jiggling all the time and spreading further and further apart. Heat is just some metric describing the aggregate behavior.


People overcomplicate the matter. The second law of thermodynamics is fundamentally very simple: things flow towards (or seek) a state of equilibrium. That's it. Entropy is just a measure of equilibrium. All the concepts are like you said, just some behavior exhibiting that.


> the actual particles are undergoing a constant process of reducing the potential energy induced by force fields between them

Not really. They are in the process of spreading that energy as equally as possible through as many fields as they can.

That is the Second Law of Thermodynamics.


This all applies at the quantum level. Ask your quantum computing friends why we don't have quantum computers yet.


The 2nd law only states a direction; it does not determine the rate of change of things. It is also related to the spontaneity of reactions. What is the role of activation energy (or other weak/strong nuclear force potential barriers due to state)?

What prevents everything from happening all at once (just by obeying the 2nd law, is there a reason)? And if there is, is there a consistent formulation of the 2nd law plus some other law that gets this problem at least macroscopically correct?


Repeal the Second Law! Free Energy for Everyone!


Is anyone else hearing the word thermodynamics pronounced by Homer J?

It's tainting my ability to read this.



[flagged]


No, this is an actual explanation and exploration of the consequences of the tendency for energy to spread out.



