> So it is very likely for a tornado to reconfigure things into debris and extremely extremely unlikely for the tornado to reconfigure everything into a brand new car.
Is this not a linguistic sleight of hand? There are billions of trillions of states we label with "debris" but only a few thousand we would call "car". So a specific state of debris, then, is equal in complexity to a car?
The technical term for this is macrostate. It is not a linguistic sleight of hand.
It is literally part of the formal definition of entropy. Debris is a high entropy macrostate, while a car occupies a much lower entropy macrostate. There are far fewer possible atomic configurations for cars than there are for debris. Each individual configuration of atoms is called a microstate.
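To make that counting quantitative, Boltzmann's relation says entropy is just the logarithm of the number of microstates a macrostate lumps together:

$$S = k_B \ln \Omega$$

where $\Omega$ is the microstate count and $k_B$ is Boltzmann's constant. More configurations means higher entropy, which is exactly why "debris" sits above "car".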
A macrostate is a collection of microstates that you define. Whichever definition you choose has an associated probability: if you define the macrostate "car", you are choosing a collection of microstates that has a very low probability of occurring.
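Here's a toy way to see the counting. This is not physics, just a coin-flip sketch with made-up labels: the rigid macrostate "all heads" contains a single microstate, while the loose macrostate "exactly half heads" contains hundreds of thousands.

```rust
// Toy model: 20 coin flips. A "macrostate" is "how many heads total";
// each specific sequence of flips is one microstate.

/// Number of sequences of `n` flips containing exactly `k` heads: C(n, k).
fn microstates(n: u64, k: u64) -> u128 {
    let mut count: u128 = 1;
    for i in 0..k {
        // exact at every step: `count` is always a binomial coefficient
        count = count * (n - i) as u128 / (i + 1) as u128;
    }
    count
}

fn main() {
    // A rigid, "car-like" macrostate: every single flip must be heads.
    println!("microstates for 20/20 heads: {}", microstates(20, 20)); // 1
    // A loose, "debris-like" macrostate: exactly 10 heads, in any order.
    println!("microstates for 10/20 heads: {}", microstates(20, 10)); // 184756
}
```

Any one specific half-heads sequence is exactly as improbable as the all-heads sequence; it's the label "half heads" that covers enormously more ground. That's the sense in which a specific pile of debris is just as "special" as a car, while the macrostate "debris" is vastly more probable than the macrostate "car".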
The second law of thermodynamics says that systems gain entropy over time, meaning they naturally progress toward high probability macrostates. In other words, complexity is destroyed over time as a direct consequence of the second law.
The reason this occurs is straightforward. As a system evolves and randomly jiggles over time, it trends toward high probability configurations like "debris" simply because those configurations are overwhelmingly more likely to occur. Generally, the more microstates a macrostate contains (often billions upon billions), the higher its entropy.
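A minimal simulation sketch of that jiggling (assuming the `rand` crate with its 0.8-style API; the two-sided box is a made-up toy, not real dynamics): start with every particle on one side, an ordered macrostate with very few microstates, let single particles hop at random, and watch the count drift toward the many-microstate 50/50 region and then just fluctuate around it.

```rust
use rand::Rng;

// Toy "jiggling" model: 100 particles in two halves of a box.
// Each step we pick one particle at random and move it to the other half.
fn main() {
    let mut rng = rand::thread_rng();
    let n = 100;
    let mut left = n; // start fully ordered: everything on the left

    for step in 0..=10_000 {
        if step % 2_000 == 0 {
            println!("step {:5}: {} particles on the left", step, left);
        }
        // pick a random particle; it is on the left with probability left/n
        if rng.gen_range(0..n) < left {
            left -= 1; // it hops to the right
        } else {
            left += 1; // it hops back to the left
        }
    }
}
```

The ordered starting point isn't forbidden, it's just one macrostate with so few microstates that random motion almost never wanders back into it.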
Through logic, probability, and the second law of thermodynamics we get a formal definition of complexity, and we see that complexity naturally destroys itself, or at least degrades, over time.
This is the thing that confuses people about entropy. Its definition is a generality, built on microstates and macrostates you choose to define yourself. It's similar to calling a generic function in programming, where you choose the concrete type at the call site.
But even within this generic world there are laws (traits in Rust, concepts in C++) that tell us how the generic must behave, just like entropy tells us that whatever macrostates we define, the system will trend toward the high probability ones, losing complexity along the way.
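To make the analogy concrete, here is a small Rust sketch; the trait, the types, and the microstate counts are all invented for illustration. The caller picks the macrostate like a generic type parameter, and the trait bound plays the role of the law every choice has to satisfy.

```rust
// The "law" every macrostate must obey: it lumps together some number
// of microstates.
trait Macrostate {
    fn microstate_count(&self) -> f64;
}

struct Car;
struct Debris;

impl Macrostate for Car {
    fn microstate_count(&self) -> f64 { 1e3 }  // made-up number: very few
}
impl Macrostate for Debris {
    fn microstate_count(&self) -> f64 { 1e21 } // made-up number: enormous
}

// Generic over any macrostate you care to define; the trait bound guarantees
// we can always count microstates, and entropy is just the log of that count
// (in units where Boltzmann's constant is 1).
fn entropy<M: Macrostate>(m: &M) -> f64 {
    m.microstate_count().ln()
}

fn main() {
    println!("entropy(Car)    ~ {:.1}", entropy(&Car));    // low
    println!("entropy(Debris) ~ {:.1}", entropy(&Debris)); // high
}
```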
The heat death of the universe is the predicted end state in which all complexity is inevitably lost forever. You can define that macrostate as the collection of all microstates that contain no form of organization.
You're mistaken, and your intuition is off. It lines up absolutely; there's no trick of labels here.
Debris is almost any configuration of atoms that we would consider trash or unusable.
There are vastly more ways to configure atoms into useless trash than into a car. Case in point: you can manufacture "debris" or "trash" just by throwing something into a trash can. Simple.
When's the last time you manufactured a car? Never.