I've always felt the whole binary/digital thing was one of the cleverest compromises with the real world I've ever seen.
You have this thing, and you want to translate its value into something useful: in this case, the voltage in a circuit into a number. You spend so much time trying to make sure the voltage level passed is rock solid, that your read of it is equally solid, and so on, until you realize you'd have to invent so many more industries just to do this one thing. So you give up and say the only thing you can know with certainty is whether there is or is not voltage passing through the circuit.
Then you need to translate "ON" and "OFF" into actual usable values. Eventually arriving at a base 2 counting system, where 4 circuits give you 16 distinct values, seems obvious in hindsight, but it had to be a revelation when someone first realized it.
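(Just to make that arithmetic concrete, here's my own tiny Python sketch, nothing from the original comment: treat each of 4 circuits as ON/OFF and count the combinations.)

    # Toy illustration: 4 ON/OFF circuits read as a base-2 number.
    from itertools import product

    circuits = 4
    states = list(product([0, 1], repeat=circuits))  # every ON/OFF combination
    values = [sum(bit << i for i, bit in enumerate(reversed(s))) for s in states]

    print(len(states))  # 16 distinct combinations
    print(values)       # 0 through 15, i.e. a 4-bit counter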
Bingo, precisely. I keep using this compromise as a foundational example when I talk to my wife (currently doing a CS degree) about why binary, specifically, keeps coming up in her coursework. 2 is a magic number in computing because it is the Great Compromise we made with reality to get computing at all.
Every now and then people here bring up more exotic paradigms, like ternary computing, and yes! In an ideal world, the more distinct levels (phases) you can detect the 'better', all the way up to infinity (pure analogue computing). But the difficulty curve to scale anything besides base 2 to where base 2 computing currently sits is, at least with our current understanding of physics, materials science, and so on, way higher.
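(A quick back-of-the-envelope sketch of the "more levels" point, my own illustration: the same number of circuits encodes levels^circuits distinct values, so higher bases are denser on paper even though they're much harder to build reliably.)

    # Distinct values you can represent with 4 circuits as the number
    # of reliably detectable levels per circuit grows.
    circuits = 4
    for levels in (2, 3, 10):              # binary, ternary, decimal circuits
        print(levels, levels ** circuits)  # 2 -> 16, 3 -> 81, 10 -> 10000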
Actually, Leibniz proposed the dual/binary system as ideal for computation roughly 300 years ago, around the same time he built his mechanical calculator, long before there was any chance of programmable computers.