12 is divisible by 2, 3, 4, 6
24 is divisible by 2, 3, 4, 6, 8, 12
60 is divisible by 2, 3, 4, 5, 6, 10, 12, 15, 20, 30
Compare that to
10 (divisible by 2, 5) and
100 (divisible by 2, 4, 5, 10, 20, 25, 50),
and you start to see the problem. You can't split a 10-hour day into quarters without a decimal, and you can't split it into thirds without an infinite decimal or a fraction.
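The divisor lists above are easy to verify. A minimal sketch in Python (the `proper_divisors` helper is just for illustration):

```python
# Proper divisors (excluding 1 and the number itself) of each base,
# comparing the "highly divisible" 12/24/60 against 10/100.
def proper_divisors(n):
    return [d for d in range(2, n) if n % d == 0]

for n in (12, 24, 60, 10, 100):
    print(n, proper_divisors(n))
# 12 [2, 3, 4, 6]
# 24 [2, 3, 4, 6, 8, 12]
# 60 [2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
# 10 [2, 5]
# 100 [2, 4, 5, 10, 20, 25, 50]
```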
But in 1793, the French smashed the old clock in favor of French Revolutionary Time: a 10-hour day, with 100 minutes per hour, and 100 seconds per minute. This thoroughly modern system had a few practical benefits, chief among them being a simplified way to do time-related math...
isn't really true. Sub-dividing days and hours into smaller, equal pieces is a critical part of time-keeping, and it really isn't simpler with 10 and 100.
Keep in mind they introduced the entire metric system: distance, volume, weight, currency, etcetera, making this just one of many changes... Either way, apparently people weren't too stoked about having one day off every 10 days instead of one day off every 7 days.
You have a good point about taking it in context with the 'decimalization' of other units of measure, but my point is really just that the article is overly generous towards decimal time, and ignores solid mathematical reasons for using 12/24/60 for time-keeping. I've never in my life needed to know when a day is "70% over", but I constantly need to subdivide my time into halves, thirds, or quarters.
You do realise that you use these approximations because they're the integral results of easy calculations, and you'd just use slightly different durations in a decimal time framework, right?
As in, metric countries don't use a 12.7mm wrench because that's what 1/2" is. They use a 13mm wrench (and the corresponding metric nuts). And their speed limits are 110/120/130, not 112.6/120.7/128.7.
For an alternative, let me recommend just using binary logarithms written in base 12: 1 corresponds to doubling, 1.7 corresponds pretty closely to tripling, and 2.4 corresponds roughly to quintupling. I.e. 0.1 (base 12) is an equal-tempered semitone.
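Those base-12 values check out numerically. A sketch (the `log2_base12` helper is my own naming) that takes a binary logarithm and writes its fractional part in base-12 digits:

```python
import math

def log2_base12(x, digits=2):
    """Binary logarithm of x, fractional part rendered in base 12."""
    L = math.log2(x)
    whole = int(L)
    frac = L - whole
    out = []
    for _ in range(digits):
        frac *= 12
        out.append(int(frac))
        frac -= int(frac)
    # base-12 digits 10 and 11 shown as 'a' and 'b'
    return f"{whole}." + "".join(format(d, "x") for d in out)

print(log2_base12(2))  # doubling     -> 1.00
print(log2_base12(3))  # tripling     -> 1.70
print(log2_base12(5))  # quintupling  -> 2.3a (rounds to ~2.4)
```

The semitone claim follows because an equal-tempered semitone is a ratio of 2^(1/12), whose binary logarithm is exactly 1/12, i.e. 0.1 in base 12.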
In fact, the reason we don't use decimals for time is probably precisely that it's not a metric system. What does 1.3h mean? 1 hour 18 minutes, or 1 hour 30 minutes?
Only if you ignore the complexity of teaching a whole population to do arithmetic in a different base.
Why the Babylonians went with this two-mechanism system instead of doing the same thing on both hands probably comes down to divisibility: get a prime factor of 5 from one hand, and prime factors of 2 and 3 from the other.
(Also note that you can count 12 hours on one hand, a full 24 on both.)
EDIT: AFAIK the Babylonians didn't have a developed number theory, and I'm not sure they even had a formal idea of prime numbers. But we do know from surviving accounting "textbooks" (practice tablets & instructions) that both they and the near-contemporary ancient Egyptians understood the special divisibility of these numbers.
(Pro tip: different cultures have done things differently across time and space!)
I believe that counting method was devised by some fan of duodecimal arithmetic sometime within the past century, and has nothing to do with sexagesimal numeration per se.
The evolution here was: physical clay counters; physical clay counters sealed in a clay envelope; physical clay counters sealed in a clay envelope but also pressed into the outside of the envelope to indicate how many; clay tablet with counters pressed into the outside (since the envelopes with counters inside were redundant); clay tablets with little cuneiform symbols to represent quantities, differing by type of object being counted, and not all sexagesimal; more uniform written sexagesimal writing system.
There is no indication in the symbols about how people counted on their fingers.
But well, the decimal system is probably the second best...
My main gripe these days is the lack of consistency: while hours/minutes/seconds are not too bad, having computer screens in inches annoys me a lot! (While TV screens are in centimeters, as expected.)
This is not my experience in America: we have both computer monitors and TV screens in diagonal inches.
I've been using centidays (and dimidays) for a few years now and it works well for me as a personal time keeping system in combination with local time (at local noon the sun is at its highest point in the sky and the clock shows 50:00) and a lunisolar calendar. More on that here: https://geodate.org
A centiday is precisely 864 seconds, and 4 centidays is almost equivalent to 1 hour (3456 seconds) so it's not too hard to make rough conversions.
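The conversion is simple enough to sketch. A minimal helper (the name `to_centidays` is mine, not from geodate) that turns seconds since local midnight into centidays:

```python
# 1 centiday = 86400 s / 100 = 864 s.
def to_centidays(seconds_since_midnight):
    return seconds_since_midnight / 864

print(to_centidays(43200))  # local noon -> 50.0 centidays
print(to_centidays(3600))   # one hour   -> ~4.17 centidays
```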
I made a little Fitbit app so my watch could display the time (with the longitude given by the GPS of my phone) and I have a sundial in the garden that works just as well. It's quite fun to play with time keeping systems.
- the .beat (1/1000th of a day) is equal to 1m26.4s, so a single number is enough to point at various moments in time, rather than needing two
- if more precision is needed, "centibeats" (1/100th of a .beat) exist, and are very close to a second so it's easier to grasp (0.864s)
- taking a 1/1000th of a unit is the standard way of going up and down in units, rather than going 1/100th twice
- I don't know if the French decimal time has ever been adopted beyond amateurs, but internet time has been (for a very short time at least)
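The .beat arithmetic above can be sketched directly. Swatch Internet Time divides the day in UTC+1 ("Biel Mean Time") into 1000 .beats of 86.4 s each; the helper name `swatch_beats` is my own:

```python
from datetime import datetime, timezone, timedelta

def swatch_beats(dt_utc):
    """Swatch Internet Time (.beats) for an aware UTC datetime."""
    bmt = dt_utc.astimezone(timezone(timedelta(hours=1)))  # UTC+1
    secs = bmt.hour * 3600 + bmt.minute * 60 + bmt.second
    return secs / 86.4  # 1 .beat = 86.4 s

noon_utc = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
print(f"@{swatch_beats(noon_utc):.2f}")  # 13:00 BMT -> @541.67
```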
However, actual days are not a fixed amount of time. And for historical reasons we are stuck with a very precisely defined second that is based on a 24-hour day.
So the obvious approach would be to define a new unit of time that is roughly equivalent to a milli- or microday.
But the chance of that happening is close to zero.
So we are stuck with the second as the fundamental unit of time. Having a 'centiday' as 864 seconds would be even more removed from the principles of the metric system than a 3600 second hour.
It's interesting because it exposes flaws in using metric prefixes (sometimes also discussed in the world of grams, where the kilogram is the unit closest to most "human applicability", and a more human-scaled system might have more prefixes near the kilogram than the gram). In particular, there's a big magnitude jump from a kilosecond (16.7 minutes, about a quarter of an hour) to a megasecond (11.6 days, nearly a fortnight).
I've heard fun proposals that, if Metric Time based on seconds were to be more useful on human timescales, we could bring back the deceased metric prefix myria- (10^4). A myriasecond is 167 minutes, or just less than three hours (2.78). Maybe a prefix for the next digit as well (10^5), which would be just more than a day, by about an eighth of a day (27.8 hours).
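The gap is easy to see by tabulating each power of ten in human terms. A quick sketch (note that myria- is an obsolete prefix, not part of SI):

```python
# Second-based prefixes on a human scale.
prefixes = {
    "kilosecond (10^3 s)":  1e3,
    "myriasecond (10^4 s)": 1e4,  # obsolete myria- prefix
    "10^5 s (unnamed)":     1e5,
    "megasecond (10^6 s)":  1e6,
}
for name, s in prefixes.items():
    print(f"{name}: {s/60:.1f} min = {s/3600:.2f} h = {s/86400:.2f} days")
# kilosecond:  16.7 min (about a quarter hour)
# myriasecond: 166.7 min, 2.78 h
# 10^5 s:      27.78 h, just over a day
# megasecond:  11.57 days, nearly a fortnight
```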
But we don't have megagram or megameter because we're not used to them.
Just like you can easily talk about 10 km, 20 km, etc. you could easily talk about 10 ks, 20 ks, etc. no need for a special name. We are just not used to it.
milli-, centi-, deci-, deca-, hecto-, and kilo- are the six prefixes closest to the origin. Everything else gets logarithmically further out.
Using the second as the origin is great for computation, as milliseconds and even microseconds can matter to a computer, but it isn't great for human perception of time, where people are bad at even distinguishing events that happen deciseconds apart, leaving nearly all of the decimal prefixes almost useless on human timescales.
Which is why the French Decimal Time system mentioned in the article (and used by the poster above in this thread) is centered around the day as its origin: milliday, centiday, deciday, decaday, hectoday, and even kiloday are all relatively useful units on a human timescale.
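Those day-based units do land on human-friendly durations, as a quick sketch shows:

```python
# Day-based decimal units in familiar terms (1 day = 86400 s).
DAY = 86400
units = {
    "milliday": DAY / 1000,  # 86.4 s
    "centiday": DAY / 100,   # 864 s = 14.4 min
    "deciday":  DAY / 10,    # 8640 s = 2.4 h
    "decaday":  DAY * 10,    # 10 days
    "hectoday": DAY * 100,   # 100 days, roughly a season plus
}
for name, s in units.items():
    if s < DAY:
        print(f"1 {name} = {s:g} s ({s/60:g} min)")
    else:
        print(f"1 {name} = {s/DAY:g} days")
```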
While you are correct that we don't "need" that kind of vocabulary, the argument is that it would be useful in conversation and in day-to-day (fig. and lit.) math. Having additional prefixes serves as mental anchors/frames for the conversation. If I give a number of seconds to you as hundreds of milliseconds versus deciseconds, that gives you additional context about my methodology and/or focus in whatever that benchmark was. Sure, you can easily convert between the two prefixes, but that context can be useful.
Just as there's a difference between telling you something is happening hundreds of minutes from now versus a few hours from now. There's probably a useful contextual distinction there. That's why people play with solutions like Decimal Time, changing the scale altogether away from seconds to something more like an Earth day, or in the other direction exploring ideas with more prefixes to fill in some of the logarithmic "gaps", such as myria- and hebdo-.
The other thing worth pointing out is that, though we have centi-, deci-, deca-, and hecto-, they get only limited use in practice.
Very few distances are specified in deca- or hectometers.
Very few things are measured in decagrams. There is a metric ounce and a metric pound, but the hectogram is not used.
So for distance, we have meter, kilometer. For mass, gram, kilogram, metric tonne.
Within the same system, we could easily deal with kiloseconds and megaseconds.
As far as I know, nobody uses deciseconds except as a weird way of saying 100 milliseconds.
In informal speech, the number of zeros is used as estimate of precision. Using prefixes to specify precision is way too confusing.
Within the metric system, 10 kiloseconds would be roughly the same as 3 hours. Within a single day, everybody could easily adapt to kiloseconds. The problem starts when you have to create a system of timekeeping based on the metric second.
Not really. Games like "one Mississippi" are fun approximations, but they aren't terribly accurate. Also, this gets very close to the argument that the foot is a better unit for length than the meter because you can more easily approximate it with an average adult male's shoe length. Being easy to approximate has its uses, but that doesn't necessarily make a unit the best fit for every application.
Scheduling is a critical reason people need time, and yes, it's just about impossible without clocks/calendars/sundials/other "external sources". But that's also where all the interesting stuff happens when people use time, and if a unit can't account for the math of scheduling, people will never use it for time.
Part of the problem there is that we do keep trying to "square the circle" if we want to apply metric tendencies to our reference points on rotating spheroids. (Another fun experiment I've seen in sci-fi, and played with myself, is the idea of day time expressed in radians.)
Thanks to our computers keeping track of every second elapsed since the Unix Epoch, I can use them whenever I want to convert from my personal system to any other system, or vice versa:
$ geodate 51.1789 -1.8262 $(date +%s --date "12:00 +0100") --format "%c:%b"
Usually, when there are posts about decimal timekeeping on HN, it falls to me to remind everyone about Swatch Internet Time, so I'm overjoyed that this time you've brought it up!