I'd prefer to skip metric and redefine the foot as the distance light travels in a nanosecond (~11.8 inches). "The length of the path travelled by light in vacuum during a time interval of 1/299792458 of a second" seems a bit arbitrary to me.
I would also vote for any candidate that would ban anything but powers of 2 in the definition of computer storage.
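For anyone who wants to check the back-of-the-envelope number, a quick sketch in Python (nothing assumed beyond the defined speed of light and 0.0254 m per inch):

    c = 299_792_458           # m/s, the defined speed of light
    light_ns = c * 1e-9       # meters light travels in one nanosecond
    print(light_ns)           # 0.299792458 m
    print(light_ns / 0.0254)  # ~11.8 inches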
I don't get the fascination with powers of two in storage. RAM I understand, because you're counting address lines. But storage? What sense does that make?
I have an SD card with 16.0 GB of space available, and I'm recording video at 9.00 Mbit/s. How many hours of footage? Well, (16.0e3 MB) * (8 bit/B) / (9.00 Mbit/s) = 1.42e4 s, or 3.95 hours.
Now do the computation with binary units.
I have an SD card with 14.9 GiB of space available, and I'm recording video at 9.00 Mbit/s. How many hours of footage? Well, (14.9 GiB) * (1e-6 * 1024^3 MB / GiB) * (8 bit/B) / (9.00 Mbit/s) = ...
Why make people do extra math? Shouldn't we choose units that make things easier to calculate, not harder?
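To make the comparison concrete, here is a rough sketch of both computations in Python, using exactly the numbers from the example above (hour figures rounded):

    rate = 9.00e6                        # 9.00 Mbit/s in bit/s

    # Decimal units: 16.0 GB = 16.0e9 bytes, one shift of the decimal point.
    t_decimal = 16.0e9 * 8 / rate
    print(t_decimal / 3600)              # ~3.95 hours

    # Binary units: 14.9 GiB needs an extra factor of 1024**3 first.
    t_binary = 14.9 * 1024**3 * 8 / rate
    print(t_binary / 3600)               # ~3.95 hours, after more arithmetic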
If it's 200 Mbytes in RAM, it should be the same on a disk. Also, a sector is 512 bytes or 4096 bytes, so labeling it in powers of 10 actually makes the math worse.
> Why make people do extra math? Shouldn't we choose units that make things easier to calculate, not harder?
> Also, a sector is 512 bytes or 4096 bytes, so labeling it in powers of 10 actually makes the math worse.
Sector sizes are different between drives anyway. So using sector as a label is just bad practice.
And... how does it make the math worse? My hard drive partition has a size of 250140434432 bytes (actual value)... how much is that in GB? Easy, 250.1 GB. For some reason, I was able to do that calculation without the aid of a computer or a calculator.
Humans don't count sectors. We like to shift decimal points around. Computers are good at calculation, so we give them the task of multiplying by 512 for us, and displaying measurements in units suitable for human society.
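As a sketch of that division of labour (assuming 512-byte sectors; the sector count below is just the partition size above divided by 512):

    sectors = 488555536             # hypothetical sector count for that partition
    size_bytes = sectors * 512      # the computer does the multiplying
    print(size_bytes)               # 250140434432 bytes
    print(size_bytes / 1e9)         # ~250.1 GB -- the human just shifts the decimal point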
> Sector sizes are different between drives anyway.
Yeah, 512 bytes or, now, 4096 bytes - neither is a power of 10
The math is worse because the computer (address space) and hard drive are actually base 2 and vendors are selling in base 10.
This worship of base 10 in every aspect of our lives, even when it doesn't make sense and cheats us out of money, is sad. Computers are base 2, memory is base 2, and storage should be base 2, or else it's just a batch of lies. I guess I should be glad that humanity didn't have 11 fingers.
The computer is just showing you a number; calculating in base 2 won't be any stress on a human, since it doesn't show every byte anyway.
All in the service of the base-10 worshipers. There is a tribe in the US (the Yuki) who got it right and count in octal, because they count the spaces between the fingers (you can hold 8 bottles).
I have tried to convince people to change the metric system to binary, i.e. have a kilogram be 1024 grams, and a kilometer be 1024 meters. Which would finally force hard drive manufacturers to be honest about their devices' capacities.
So far, I have not been very successful, though. :(
It's not arbitrary at all, it's for backwards compatibility. Previously the meter was defined in terms of the emission lines of krypton-86, which was a distance that we could measure most precisely with an interferometer. Before that it was based on the size of the earth, which could be measured with astronomical calculations. And originally it was based on the length of a pendulum that produced a period of 1 second, from which the length can be calculated using a scale, a reference mass, and a clock (there was another definition, but this one was more precise I believe).
Each definition gives you something pretty close to what we now define as a meter, but the precision to which they could be measured at the time differed. The length of a "meter" was more or less the same distance in everybody's mind then as now, but if 10,000 scientists had sat down in 1700 and performed experiments to actually measure the exact length, the mean of the values they reported would be about the same as it is today, while the statistical uncertainty would have been much, much higher.
The redefining of the meter has typically occurred when some new process was invented that was more precise than the previous method; for instance, today measuring with a laser in a vacuum has something like 1/3 the uncertainty of the old interferometer method. So the uncertainty is smaller, but crucially the mean value is still basically the same or very close. That's the reason for the weird 1/299792458 of a second: we can define seconds very, very precisely (from atomic clocks), and that's the amount of time it takes light in a vacuum to travel the same distance as the previous most precise known value for the length of a meter.
If it wasn't done this way, every time we invented a more precise method to measure distance and want to improve the precision of the meter it'd be like defining a whole new unit. Defining a foot as the distance light travels in a nanosecond is fine now, but if we discover an even more precise way to measure distance than a laser in a vacuum we'd end up in the same position, where a "foot" would be some strange fraction of a reference value that makes it work out to agree with the old most precise known value.
This sort of stupidly accurate measurement doesn't matter anymore on the day-to-day scale, as the length of a meter is known to within ~(10^-9)%, which is about ten picometers. However, that means that if you're fabricating silicon at the nanometer scale, the actual exact length of a nanometer is only known to within about 1% of a nanometer (if I did my math right, might be off by a factor of 10). That's much more significant.
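For what it's worth, a quick sanity check of that last figure, taking the argument at face value and using only the numbers above:

    uncertainty_m = 1e-11             # ~10 picometers, i.e. ~(10^-9)% of a meter
    nanometer = 1e-9
    print(uncertainty_m / nanometer)  # 0.01 -> about 1% of a nanometer

So the 1% figure seems consistent under that reading, not off by a factor of 10.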
"It's not arbitrary at all, it's for backwards compatibility."
The original definition of the meter was one ten-millionth of the distance from the equator to the North Pole. Sure, we got more precise, but it's still arbitrary.
The only interesting thing about metric is the relation of length, volume, and mass. But a liter is not a cubic meter, nope - it's a cubic decimetre, another arbitrary decision.
Indeed, what units are convenient depends on context. Metric/SI is set up so that most of the un-prefixed units are a convenient size on human scale. But doing this means that some derived units will have values that are not human scale (like the Pascal, atmospheric pressure is ~100000 pascal). The nice thing about SI is that they subdivide easily in powers of ten, so even if the Pascal is inconvenient in our day-to-day lives it's easy to talk in kPa as 1000 Pa.
This is contrasted with something like Imperial units, where every division of e.g. distance is supposed to be roughly based on some physical object. That's why you end up with 12 in/ft, 5280 ft/mi etc. Or alternatively you end up with metric-imperial hybrid units like the kilopound.
Not to mention, what units are convenient varies depending on what you do. For instance, one of the alternative energy units used alongside SI is the electron volt (eV), the work done to move an electron through a 1 volt potential. This is a tiny amount of energy on a human scale - a common analogy is that 1 MeV (10^6 eV) is enough energy to make a single grain of sand twitch a little bit. But if you're a nuclear physicist (or maybe a chemist), then eV is typically much more convenient than, say, joules.
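The conversion itself is just the elementary charge times one volt; a tiny sketch (the ~1 mg grain-of-sand mass in the last line is my own rough assumption for the analogy):

    import math

    e_charge = 1.602176634e-19      # coulombs, the defined elementary charge
    eV = e_charge * 1.0             # joules per electron volt (q * V)
    print(eV)                       # ~1.6e-19 J
    print(1e6 * eV)                 # 1 MeV ~ 1.6e-13 J
    # speed of a ~1 mg grain of sand given 1 MeV of kinetic energy (assumed mass)
    print(math.sqrt(2 * 1e6 * eV / 1e-6))   # ~6e-4 m/s -- barely a twitch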
I'd argue that the liter was chosen as a nice power-of-10 fraction of a cubic meter (the obvious initial choice) because it's the best size for everyday, common-person transactions.
Take a bottle of water: the most commonly sold sizes are 0.5 liter (about 16 oz) and 1 liter bottles; a liter is also pretty close to a quart.
As another example, it is common to find soda pop sold in bottles of 1, 2, and 3 liters (depending on the brand).
Yes, the original definition is arbitrary of course. I was talking about the random constant fraction of a second. Units are always arbitrary, by definition they're just a convention. They're a way of relating an abstract concept (a number) to some intuitively meaningful thing.
No, because defined this way, the absolute precision to which distance can be defined is ~10^-11 meters. So 1 meter is known within +/- (10^-11) meters. If you convert (10^-11) meters to nanometers that's 10^-2 nanometers, which is 1% of a nanometer.
You don't divide the uncertainty when you scale it, because the error is in the actual definition of distance itself. The uncertainty in, for example, 1 meter is the same as the uncertainty in 1 nanometer (10^-11 meters either way), which is a correspondingly much larger fraction of the nanometer. It's kind of weird and perhaps counter-intuitive, but that's how it works.
Phrased differently, the idea of a meter is exact and it's our ability to measure distance that is uncertain. The error isn't in saying "a meter is some specific fraction of the distance light travels in a second", it's in determining what physical distance in the world is represented by our definition of the meter.
Ah yes, this gets at an interesting side note, which is the difference between epistemic and aleatory uncertainty. QM uncertainty is aleatory, while issues of measurement precision are epistemic. The distinction being that epistemic errors can be reduced with better tools and with an ideal measurement device are eliminated, whereas aleatory uncertainties are always present even with a perfect tool.
Assuming QM's predictions are valid, we will never ever be able to improve our measurement beyond the point where errors from the uncertainty principle dominate. Our current definition of the meter is pretty close to this limit, so personally I don't expect the meter is going to be redefined any time soon, because we're already near the sort of scales where the idea of "distance" starts to get kind of fuzzy.
I'll take one level over two levels. Besides, memorizing that metric fraction is just wrong. Plus, do we really want to go into space with a unit of measure whose origins are tied to the size of the Earth? Let's at least free part of the definition from an Earth-centric bias. I fear a Mars-centric meter might develop.
Unlikely. The metric units have a long history of never having been redefined in a way that breaks previous usage. As far as I'm aware, all redefinitions have just improved the precision, so they don't hurt previous users. Your proposal would make all the old feet wrong, and anybody seeing "foot" would have to think about which type of foot it might be.
Oh, I don't know, I bet the future Mars independence movement will try to throw off all the shackles of Earth and switch their units of measure. I jest just a bit, but it's not like we humans don't pull stunts like this all the time in the spirit of independence.
My proposal is basically the engineering estimate of a foot. I would probably name it something different if I were Emperor / Very Powerful Politician. It's small enough to derive the mass and volume units directly.