To connect them, the entropy of a system (such as an ideal gas) can be computed as the logarithm of the number of equally probable microstates in your probability distribution for that system.
(You need to define a sensible measure because the state variables are continuous, but that's not important here.)
You can express this in terms of the message length or number of bits of information you would need to gain about that container of ideal gas in order to narrow it down to a particular microstate.
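That counting picture is easy to sketch numerically. Here's a toy system of my own choosing (not from the discussion above): a handful of distinguishable particles distributed over a few cells, with every arrangement equally probable.

```python
from math import log, log2

# Toy system (my own choice, purely illustrative): 10 distinguishable
# particles, each in one of 4 cells, so all 4**10 microstates are
# equally probable.
omega = 4 ** 10          # number of equally likely microstates
bits = log2(omega)       # message length needed to pin down one microstate
k_B = 1.380649e-23       # Boltzmann constant, J/K
S = k_B * log(omega)     # Boltzmann entropy, S = k_B * ln(Omega)

print(bits)  # 20.0 bits
print(S)     # ~1.9e-22 J/K
```

Same quantity, two unit systems: log base 2 gives bits, and multiplying the natural log by k_B gives thermodynamic entropy in J/K.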
It's no different than the entropy of, say, a password.
> There is no way to reduce the entropy of a totally random system by knowing more about it
There absolutely is! For example, Maxwell's demon. Maxwell's demon will fail to reduce system entropy if it doesn't know the microstate to begin with, but otherwise it can use its knowledge to reduce the entropy of a gas by separating its parts.
> unless your hypothesis is that true randomness doesn't exist in the universe
I do subscribe to this hypothesis too: like entropy, randomness only exists in our heads as a reflection of our lack of information about a system, and isn't a property of the system itself. But I don't think I need to rely on this assumption.
With the entropy of a password, abcdefg has less entropy than kfbeksn because it is more likely to occur. Further, the calculation of entropy is based on an alphabet of symbols, shared by sender and receiver.
I don't see how that connects to the ideal gas example. Care to share?
I'm on mobile so it's hard to write a lot of detail. But here's probably the most straightforward way to clarify it.
Gases are the same in principle but harder to visualize concretely, so let's instead consider a crystal lattice with impurities -- for example, a doped silicon wafer.
I hand you a 28 gram disc of 0.1% n-doped silicon. It's got about 6e23 atoms in it, of which 0.1% are phosphorus.
That's a good statistical description of the crystal, but a large number of possible configurations of phosphorus and silicon atoms match that specification, and only one of them describes what you are holding in your hand. For whatever reason, you want to know the exact atomic arrangement of this particular wafer.
Fortunately, I've imaged this disc with an advanced electron microscope that has determined the identity of each atom in the crystal lattice, whether silicon or phosphorus.
How large is the data file that I have to send you? How much can I compress my electron microscope data?
Just like "abcdefg is more likely than kfbeksn", we have some prior information we can use to do compression. For example, dopant atoms are unlikely to cluster together, so Si-Si-P-P-P-Si-Si is less likely than P-Si-Si-P-Si-Si-P. That prior information is no harder to incorporate into the entropy calculation than it is in the case of passwords.
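To make the likelihood point concrete, here's a minimal sketch with a probability model I made up for illustration: a string's information content is its surprisal, -log2(p), so a more probable string costs fewer bits to encode.

```python
from math import log2

# Hypothetical model (my assumption, not from the thread): a few common
# patterns have elevated probability; everything else is treated as a
# uniformly random lowercase string.
common = {"abcdefg": 1e-3}  # made-up probability for this pattern

def surprisal_bits(pw: str) -> float:
    """Information content of a particular string: -log2(probability)."""
    if pw in common:
        return -log2(common[pw])
    # fallback: uniform over 26 lowercase letters per character
    return len(pw) * log2(26)

print(surprisal_bits("abcdefg"))  # ~10 bits: likely, so cheap to encode
print(surprisal_bits("kfbeksn"))  # ~32.9 bits: 7 * log2(26)
```

The same move works for the wafer: a prior that dopant atoms rarely cluster just reassigns probabilities over configurations, lowering the average message length.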
But such subtleties aside, we can start by assuming that any configuration is equally likely, compute the total number of configurations, and take the log base 2. That's going to be the minimum file size of the electron microscope scan. (I guesstimated 6e12 Tb but don't quote me on that). To do a proper job, we'd also need to measure and transmit the state vector of the lattice vibrations, but I'm ignoring that for now.
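For what it's worth, that equally-likely count is just a binomial coefficient, and `lgamma` handles it at this scale; this is my own arithmetic, so the figure may differ from the rough guess above.

```python
from math import lgamma, log

# Numbers from the wafer example: ~6e23 lattice sites, 0.1% phosphorus.
N = 6e23
k = 0.001 * N

# log2 of C(N, k): the number of ways to place the dopant atoms among
# the lattice sites, assuming every configuration is equally likely.
log2_configs = (lgamma(N + 1) - lgamma(k + 1) - lgamma(N - k + 1)) / log(2)

print(f"{log2_configs:.2e} bits")       # ~6.8e21 bits
print(f"{log2_configs / 8e12:.2e} TB")  # ~8.6e8 terabytes
```

Equivalently, this is N times the binary entropy of p = 0.001, about 0.011 bits per lattice site.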
Ok, so that's for a crystal, which I chose because the electron microscope technology is easy to visualize concretely, and the state can be expressed as a string of Si's and P's. But an ideal gas is similar in principle, except its state consists of a string of position and velocity values. We'll have to discretize to some resolution to get discrete states, but that's fine.

There is some data file I could send you that would (quantum uncertainty aside) let you know the exact state of the gas, and with fine manipulators you could use that data to sort the gas molecules into different containers by molecular energy, reducing the gas's entropy to zero. The entropy of the data you'd need to do that would be at least as high as the entropy of the gas, and upon erasing that data from your memory, the second law is satisfied.
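As a back-of-the-envelope sketch of the gas version, with a discretization resolution I picked arbitrarily (and ignoring molecular indistinguishability, which would subtract roughly log2(N!) bits):

```python
from math import log2

# Rough sketch with toy numbers of my own: bits needed to record a
# discretized ideal-gas microstate. Each molecule has 3 position and
# 3 velocity coordinates, each resolved to `bins` distinguishable values.
n_molecules = 6 * 10**23
bins = 2**20                         # hypothetical resolution per coordinate
bits_per_molecule = 6 * log2(bins)   # 3 position + 3 velocity coordinates
total_bits = n_molecules * bits_per_molecule

print(bits_per_molecule)     # 120.0
print(f"{total_bits:.1e}")   # ~7.2e25 bits
```

The exact number depends entirely on the chosen resolution, which is the "sensible measure" caveat from earlier: only entropy differences at a fixed discretization are physically meaningful.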