You have been lied to… your true enemies are the ones that continue to build monotonic systems on top of non-monotonic time scales… Join us in the Temps Atomique International! TAI is the one true timescale.
Computers should be using TAI and converting to UTC only when humans want to see a local human time. My sincere belief is this was a bigger mistake than null, bigger than all the Y2K “we stored the year as 2 digits” stuff… The fact that from the BIOS/system firmware up “we” decided to use a fuzzy time scale that changes unpredictably, but just infrequently enough that almost everyone can ignore that it even exists, except for when it changes and they remember it exists… genuinely a poor choice that I suspect is in large part due to a lot of people not knowing about TAI.
The GPS-only clock source entered the room… I.e. most aircraft, spacecraft, electrical infrastructure, cell towers, etc. Welcome straight back to “what do we do with the leap second?” hell.
Unfortunately none of the interfaces between parties agree on TAI. It’s stupid GPS time or UTC.
GPS never had leap seconds. Your hell is imaginary. It is a myth and a lie perpetrated by those that never looked beyond the convenient lie to children that is UTC and tried to understand the true nature of timekeeping.
Actually it’s a bit more complicated than that. As well as the integer number of seconds between GPS time and UTC, the GPS signal includes a small linear correction (rate and phase) that gives you a more precise version of UTC(USNO). So to get something like TAI from GPS time you must also apply the rate and phase correction. But that still isn’t TAI as such, because TAI is only defined after the fact, in the form of retrospective corrections for each time lab’s implementation of UTC. So you would need to get Circular T from the BIPM and apply the USNO correction.
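To make that concrete, here's a minimal sketch of the shape of those corrections. The parameter names (delta_t_ls, a0, a1, t_ot) are illustrative stand-ins modelled on the leap-second count and polynomial terms the GPS navigation message broadcasts; this isn't any receiver's real API.

```python
# Hedged sketch (illustrative names, not a receiver API): recovering
# UTC(USNO) and an approximate TAI from GPS time.

GPS_TAI_OFFSET = 19.0  # TAI - GPS time: constant 19 s since the 1980 epoch


def utc_usno_from_gps(t_gps: float, delta_t_ls: int,
                      a0: float, a1: float, t_ot: float) -> float:
    """UTC(USNO) ~= GPS time - integer leap seconds - small linear term."""
    correction = a0 + a1 * (t_gps - t_ot)  # rate + phase, nanosecond scale
    return t_gps - delta_t_ls - correction


def approx_tai_from_gps(t_gps: float) -> float:
    """Close enough for almost all software; true TAI only exists after
    the fact, via the BIPM's Circular T corrections to UTC(USNO)."""
    return t_gps + GPS_TAI_OFFSET
```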
I mean look... we can start getting really fun with it and get into the metrology of it all, but I'm having some fun and trying to spread the good word here.
Early programmers totally fucked up using UTC instead of TAI. TAI, as a monotonic count of seconds, is a far more accurate representation of how computers actually keep track of time, via continuous oscillation counters, than any approximation of UTC was, at least until we developed widespread network-synchronized NTP-based systems and began to use UTC via those. And even then it's not really a good representation, because all these systems want to work with clean, contiguous "one second is one second" timekeeping, which is exactly what monotonic time scales inherently are. So the choice of using a non-monotonic time scale on top of hardware that keeps time monotonically was flawed from the start.
We can get into Sidereal time, Ephemeris time, the International Earth Rotation and Reference Systems Service, UT, TT, ERA, DUT1, MTC (Coordinated Mars Time), and what the difference is between an instant and an interval that represents a specific moment in time with any temporal uncertainty attached... I do freely admit I'm a total unrepentant geek about this sort of thing. The only reason I don't own an atomic clock is that I just can't justify the few grand to buy one (cheapest I've seen is $1.5k AUD for a DIY module from Microsemi) when I know how precise my GPS-disciplined Stratum 1 embedded Linux NTP server is... Heck, the IERS Bulletin emails are some of the only automated emails I look forward to and open up with a little glee: finding out what's up with the literal rotation of the planet, neatly dropped in my inbox!
But as nerds who already know a bunch of metrology/timekeeping stuff (which I am assuming from your comment) we could launch ourselves down the rabbit hole and hope the readers can keep up, or, as I've tried to with my comments here (particularly with the slightly whimsical choice of wordplay), try to give casual nerds who have only just tripped over the very idea that there is a better timescale computers should be using instead of UTC some idea of where to begin reading, and to wash away some of the false truths (like that GPS has leap seconds).
And while technically TAI might not be easy for computers to use due to its retroactively corrected nature... at the end of the day all timekeeping is subject to measurement error, approximation, and correction from or in correlation with external sources. Since we already only periodically correct computers to UTC, mostly (on average) every 1 hour to 1 day via NTP depending on OS and preferences, correcting a monotonic system clock to the daily correlated and corrected TAI would be infinitely better, eliminating all the leap second complaints, than repeatedly correcting the system clock to UTC and continuing to have the leap second problem. UTC has a place in the timekeeping world, a place that its misuse by computers has now put in jeopardy (see the calls driven by tech companies to fix UTC and have no more leap seconds).
Prior to the invention of NTP, system administrators would have just corrected the clock to TAI manually, and done so just as accurately as we would have corrected it to UTC, so arguments like "it would be harder to use TAI than UTC" have honestly never been true. It's really just a problem of awareness. Now that we're in a world where system firmware is, broadly speaking, designed to keep time via UTC, and thus when you interrogate "system time" you get UTC, everything built up from this layer is "wrong". It's a bootstrapping problem on the level of switching to IPv6: yes, it's technically better, but none of the hardware supports it, so why bother when I can just keep hacking away at UTC/IPv4... except there's no UTC scarcity to drive any sort of adoption, so all that can be done is to proselytize the benefits of TAI to the best of one's ability and hope that eventually it makes a difference.
I should have been more clear. GPS is the source, but the APIs that these systems use with the ground are usually UTC due to the nature of the clock source on the servers they are talking to.
Getting a UTC time from GPS involves the leap second and, during the event, agreeing on how it's implemented. Some systems smear, some double-tick a 0 at midnight, it's a disaster.
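For anyone following along: "smearing" means spreading the extra second over a long window instead of inserting it all at once at midnight. A minimal sketch of the idea, assuming a 24-hour linear smear centred on the leap like the one Google has described; the constants and names are illustrative, not any production system's API:

```python
# Hedged sketch of a linear leap-second smear. Instead of the TAI-UTC
# offset jumping by a whole second at midnight, ramp it in over 24 hours.

LEAP_INSTANT = 1483228800.0  # illustrative: the leap event, on a monotonic scale
WINDOW = 86400.0             # smear the second over 24 hours


def smeared_fraction(t_tai: float) -> float:
    """How much of the leap second has been absorbed at monotonic time t."""
    start = LEAP_INSTANT - WINDOW / 2
    if t_tai <= start:
        return 0.0
    if t_tai >= start + WINDOW:
        return 1.0
    return (t_tai - start) / WINDOW  # linear ramp from 0 to 1 second


def smeared_utc(t_tai: float, tai_minus_utc_before: float) -> float:
    """A smearing clock's UTC reading: the offset grows smoothly, so the
    clock never shows 23:59:60 and never double-ticks, but it disagrees
    with a non-smearing clock by up to ~half a second during the window."""
    return t_tai - (tai_minus_utc_before + smeared_fraction(t_tai))
```

That last comment is exactly the failure mode described below: a GPS-fed system with no smear and an NTP-fed system syncing to a smearing source will visibly diverge for the duration of the window.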
I completely agree that any coordination should be TAI. But that’s not the world we have to interface with unfortunately.
This hell isn’t imaginary, I’m speaking from experience of seeing stuff fall out of sync on the last leap second due to some systems using GPS with no smear and others using NTP to a source that smears.
Huh? Ok, so what do you propose: should time keeping diverge from the day-night cycle, or should we have leap seconds? There is no third option - the Earth just doesn’t turn the way you want it to.
It becomes your problem in a hurry once you encounter a 61-second minute and your stuff all breaks.
Not really disagreeing just expanding on the conversation…
I’m pretty happy with computers storing time as a “weird number that we add another number to to get the human time”… because we are already usually storing it as a second/millisecond/etc. offset from an arbitrary fixed point in history… we’re talking about going from “current integer (prefix)seconds value -> (some math involving leap year formulas and fiddly stuff, not just x/365/24/etc.) -> human display value” to adding one single subtraction step to the “some math” part that already has a bunch of other fast assembly arithmetic calculations involved in it.
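A minimal sketch of that one extra step. The offsets below are the published TAI-UTC values (10 s when leap seconds began in 1972, 37 s since the start of 2017); a real implementation would load the full table from IERS/tzdata and be careful about lookups right at a leap boundary:

```python
# Hedged sketch of the "one extra subtraction": displaying UTC from a
# monotonic TAI-seconds counter. datetime is used purely for calendar math.

from datetime import datetime, timedelta, timezone

TAI_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)  # assumed epoch choice

# (date the offset took effect, TAI - UTC in whole seconds) -- an excerpt
LEAP_TABLE = [
    (datetime(1972, 1, 1, tzinfo=timezone.utc), 10),
    # ... one entry per leap second between 1972 and 2015 elided ...
    (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
    (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),  # still current today
]


def tai_to_utc_display(tai_seconds: float) -> datetime:
    """The whole 'hard part': find the offset in force, subtract it, then
    let the ordinary calendar math produce the human display value."""
    rough = TAI_EPOCH + timedelta(seconds=tai_seconds)  # TAI as a calendar
    offset = 10
    for effective, tai_minus_utc in LEAP_TABLE:
        if rough >= effective:
            offset = tai_minus_utc
    return rough - timedelta(seconds=offset)
```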
As for divergence: we have two kinds of timekeeping. “Calendrical timekeeping”, which will always get fixed to be close enough to the solar year that no one has a problem with it, and “Chronological timekeeping”, which is basically the science of keeping track of how many seconds have passed between arbitrary moments in time.

For decades now, all calendrical timekeeping has been built on top of TAI and the science of chronological timekeeping, where metrologists have kept fastidious watch over atomic clocks and matched the beating of their resonant frequencies against the astronomical record keeping to allow ever more precise astronomy. The byproduct of this effort has been precise records of how long, in scientifically precise seconds, each day has been, and from these records we know we have to occasionally “add” a second if we want the “human time” to stay within 1 second of the astronomical time.

Consequently we can simply decide we are happy with more or less drift. One second is small enough we can kinda just deal with it, but imagine having a leap minute every hundred years? Coordinate that mess. Or how about a leap hour once a millennium? It’s trivial for people, since we don’t really time anything as humans faster than a second, so if we skip one it’s pretty much OK… but it’s categorically an issue for computers… which should never have used UTC and should have always used TAI, because TAI doesn’t skip a second, so the fix would be a fixed lookup table, or a single if branch based on the date and the number of seconds to add, in the integer-representation-to-human-representation mathematics that runs before updating the clock on your screen (exactly the subtraction sketched above).
The general rule of thumb for helping people do periodic jobs reliably is to have the job be frequent enough you can't forget, or never have to do it at all. Too infrequent, and you forget what to do. Too frequent, and you get sloppy. I think the application to the leap seconds/minutes/hours situation is obvious.
I would love to run TAI on my devices. But where can I find a public NTP server that offers TAI, or a GPS module that gives the same? Do these exist? Or is this the wrong approach?
As mentioned in a reply to a sibling comment, GPS time is (to a humanly relevant, usable level of accuracy) TAI minus 19 seconds. The issue is that you can't really "run TAI" on something; most firmware isn't designed to keep track of time this way and so expects to be working with UTC. The differences are subtle and basically boil down to "has no leap seconds and requires some math to accommodate this fact" versus "expects leap seconds to exist and to be handled by humans and synchronization tools"… and the latter, which fundamentally assumes that time is non-monotonic, is at odds with the way that hardware clocks work in silicon and circuitry.
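So in practice, if your receiver gives you the raw week number and time-of-week, getting usable TAI is just fixed arithmetic with no leap table at all. A minimal sketch, with illustrative names, ignoring the week-number rollover a real receiver has to handle:

```python
# Hedged sketch: a TAI calendar reading from the raw week number /
# time-of-week pair a GPS receiver reports.

from datetime import datetime, timedelta

GPS_EPOCH = datetime(1980, 1, 6)  # naive datetime, used as a calendar label
TAI_MINUS_GPS = 19                # fixed forever: TAI-UTC was 19 s in 1980
SECONDS_PER_WEEK = 7 * 24 * 3600


def tai_from_gps(week: int, time_of_week: float) -> datetime:
    """TAI on the calendar: GPS epoch + elapsed GPS seconds + constant 19 s.
    No leap-second table needed, which is the whole point."""
    elapsed = week * SECONDS_PER_WEEK + time_of_week
    return GPS_EPOCH + timedelta(seconds=elapsed + TAI_MINUS_GPS)
```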
So while you can design software to record things in TAI, and if you build your own hardware and firmware you can use TAI on that hardware and firmware… at a normal day-to-day level you can't really switch. But you can join the slowly growing collection of educated and aware people who appreciate the UTC/TAI mismatch in our software systems and push in the correct direction. The arguments about freezing UTC and not adding any more leap seconds to it are, on the face of it, just stupid: we have GPS time, which is fixed to TAI, and we have TAI itself… we don't need to fundamentally change UTC just because we fucked up all the computers but don't want to fix them properly. This is one time where, counter to the Simpsons meme, the children are in fact wrong. UTC is harder for the computers to use because UTC was the wrong choice, so the answer is to stop having the damn computers use UTC as the internal time representation and switch to having them use an appropriate internal time representation.
I seriously think the most effective way to get everyone to TAI would be for Apple to do it: make it part of their WWDC keynote. Every political leader would learn about TAI within 24 hours, from a politically neutral, technological source they consider a lodestar. They would ask their people, who would document the ask in a formal inquiry to the science and technology people in their government, who would then respond with an executive summary of a detailed plan to implement TAI on all government systems.
It is indeed “International Atomic Time” in English. For what it’s worth, I chose to use the French version of the acronym because using the English text requires more explanation of why the acronym for “International Atomic Time” is not IAT but TAI than just using the French name, which matches the acronym, and letting the curious google some more about the topic.
Also, for bonus points, the curious should read about UTC and how no one is precisely sure why that’s the version we got. Spoilers: neither the English acronym nor the French acronym is in the correct order to give us the letters UTC, yet the acronym is UTC and no one is sure exactly why. It may have been a compromise hashed out over drinks at some hotel bar between metrologists at a conference and written into the text that way without comment… it’s lost to history (at least the history I’ve been able to find when I ran down this rabbit hole).
UTC isn’t a mystery. There are various kinds of UT, depending on what corrections are applied to the raw observations: UT0, UT1, UT2. UT1 is the standard version of mean solar time, though GMT (when it was still a thing) used to be more like UT2. UTC is named according to the same scheme: Universal Time with a suffix denoting which kind of UT.