I did a deep dive on this once: Coordinated Universal Time (UTC) is coordinated among ... who? The answer (at least when I looked into it): University of Paris, NIST, and the US Naval Observatory. The other two are constantly comparing state of the art atomic clocks. USNO is the only member looking outward.
Those are the time people. There's also the Earth Rotation Service.[1] They track what the planet is doing. The Earth Orientation Center, in Frankfurt, Germany, tracks where the poles are pointed and the rotation speed. These are the people behind leap seconds.
The International Earth Rotation Service is possibly my favorite name for an international bureaucracy. It gives me visions of a Giant Control Lever extending deep into the Earth, with the labels Faster and Slower.
What happened to the rotation rate around the beginning of 2019, to cause the big change in the slope of the UT1-UTC line (https://crf.usno.navy.mil/ut1-utc)? After adding 5 leap seconds in 11 years, no leap seconds have been added or subtracted since the end of 2016.
It's probably not the only component... but a fun fact is that the Earth's rotation is affected by average equatorial ocean temperatures (anyone seen the ocean temperature charts this year? something something ships and sulfur dioxide and the accidental geoengineering we stopped doing). Warm ocean water is lower density and so occupies slightly more volume, and that redistribution of mass affects the spinning mass balance of the Earth, like an ice skater curling in their fingers rather than their whole arms while spinning like a top. The effect is there... just so very subtle.
This kind of rate change is referred to as a “decadal variation”, because the changes last ~decade(s). They seem to be due to activity in the earth’s mantle, but it is all very poorly understood.
You have been lied to… your true enemies are the ones that continue to build monotonic systems on top of non-monotonic time scales… Join us in the Temps Atomique International! TAI is the one true timescale.
Computers should be using TAI and converting to UTC only when humans want to see a local human time. My sincere belief is this was a bigger mistake than null, bigger than all the Y2K “we stored the year as 2 digits” stuff. From the BIOS/system firmware up, “we” decided to use a fuzzy time scale that changes unpredictably, but just infrequently enough that almost everyone can ignore it exists, except when it changes and they remember it exists. Genuinely a poor choice, and one I suspect is in large part due to a lot of people not knowing about TAI.
The GPS-only clock source entered the room… I.e. most aircraft, spacecraft, electrical infrastructure, cell towers, etc. Welcome straight back to “what do we do with the leap second?” hell.
Unfortunately none of the interfaces between parties agree on TAI. It’s stupid GPS time or UTC.
GPS never had leap seconds. Your hell is imaginary. It is a myth and a lie perpetrated by those that never looked beyond the convenient lie to children that is UTC and tried to understand the true nature of timekeeping.
Actually it’s a bit more complicated than that. As well as the integer number of seconds between GPS time and UTC, the GPS signal includes a small linear correction (rate and phase) that gives you a more precise version of UTC(USNO). So to get something like TAI from GPS time you must also apply the rate and phase correction. But that still isn’t TAI as such, because TAI is only defined after the fact, in the form of retrospective corrections for each time lab’s implementation of UTC. So you would need to get Circular T from the BIPM and apply the USNO correction.
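The integer-second relationships described above can be sketched like this. This is a minimal sketch that deliberately ignores the broadcast rate/phase correction and the retrospective Circular T corrections; the offset constants are the real published values as of the 2017-01-01 leap second, but the table changes whenever the IERS announces a new one.

```python
# Integer-second offsets between the timescales discussed above.
TAI_MINUS_GPS = 19   # fixed ever since the GPS epoch (1980)
TAI_MINUS_UTC = 37   # 37 s since 2017-01-01; check IERS Bulletin C

def gps_to_tai(gps_seconds: int) -> int:
    """TAI from GPS time, ignoring the sub-nanosecond rate/phase
    correction carried in the GPS navigation message."""
    return gps_seconds + TAI_MINUS_GPS

def gps_to_utc(gps_seconds: int) -> int:
    """UTC from GPS time, to integer-second accuracy."""
    return gps_to_tai(gps_seconds) - TAI_MINUS_UTC
```

Note that `TAI_MINUS_GPS` never changes while `TAI_MINUS_UTC` grows by one with each leap second, which is exactly why GPS receivers must carry the current UTC offset in the navigation message.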
I mean look... we can start getting really fun with it and get into the metrology of it all, but I'm having some fun and trying to spread the good word here.
Early programmers totally fucked up using UTC instead of TAI. As a monotonic count of seconds, TAI is a far more accurate representation of how computers actually keep time: continuous oscillation counters. Any approximation of UTC only became practical once we developed widespread network-synchronized NTP systems, and even then it's not a good representation, because all these systems want clean, contiguous "one second is one second" timekeeping, which is exactly what monotonic time scales inherently are. The choice of a non-monotonic time scale on top of hardware that keeps time monotonically was flawed from the start.
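The monotonic-vs-wall-clock distinction above is directly visible in most standard libraries. A small sketch using Python's `time` module:

```python
import time

# time.monotonic() reads a counter that can never step backwards, which
# matches how the hardware actually keeps time. time.time() returns the
# UTC-derived wall clock, which can jump when NTP steps it or a leap
# second is inserted.
start = time.monotonic()
total = sum(range(10_000))          # stand-in for real work
elapsed = time.monotonic() - start

assert elapsed >= 0                 # guaranteed for a monotonic clock
# No such guarantee exists for a time.time() delta taken across a
# clock step -- it can legitimately come out negative.
```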
We can get into Sidereal time, Ephemeris time, the International Earth Rotation and Reference Systems Service, UT, TT, ERA, DUT1, MTC (Coordinated Mars Time), and what the difference is between an instant and an interval that represents a specific moment in time with some temporal uncertainty attached... I do freely admit I'm a total unrepentant geek about this sort of thing. The only reason I don't own an atomic clock is because I just can't justify the few grand to buy one (cheapest I've seen is $1.5k AUD for a DIY module from Microsemi) when I know how precise my GPS-based Stratum 0 embedded Linux NTP server is... Heck, the IERS Bulletin emails are some of the only automated emails I look forward to and open up with a little glee: finding out what's up with the literal rotation of the planet, neatly dropped in my inbox!
But as nerds who already know a bunch of metrology/timekeeping stuff (which I am assuming from your comment), we could launch ourselves down the rabbit hole and hope the readers can keep up. Or, as I've tried to with my comments here (particularly with the slightly whimsical choice of wordplay), we can give casual nerds who have only just tripped over the very idea that there is a better timescale than UTC for computers some idea of where to begin reading, and wash away some of the false truths (like that GPS has leap seconds). And while TAI might not technically be easy for computers to use due to its retroactively corrected nature, at the end of the day all timekeeping is subject to measurement error, approximation, and correction from external sources. Since we already only periodically correct computers to UTC via NTP, on average somewhere between once an hour and once a day depending on OS and preferences,
correcting a monotonic system clock to the daily correlated and corrected TAI would be infinitely better, by eliminating all the leap second complaints, than repeatedly correcting the system clock to UTC and continuing to have the leap second problem. UTC has a place in the timekeeping world, a place that its misuse by computers has now put in jeopardy (see the tech-company-driven calls to fix UTC and have no more leap seconds).
Prior to the invention of NTP, system administrators would have just corrected clocks to TAI manually, and done so just as accurately as they corrected them to UTC, so arguments like "it would be harder to use TAI than UTC" have honestly never been true. It's really just a problem of awareness. Now that we're in a world where system firmware is, broadly speaking, designed to keep time via UTC, so that interrogating "system time" gets you UTC, everything built up from that layer is "wrong". It's a bootstrapping problem on the level of switching to IPv6: yes, it's technically better, but none of the hardware supports it, so why bother when I can just keep hacking away at UTC/IPv4... except there's no UTC scarcity to drive any sort of adoption, so all that can be done is to proselytize the benefits of TAI to the best of one's ability and hope that eventually it makes a difference.
I should have been more clear. GPS is the source, but the APIs that these systems use with the ground are usually UTC due to the nature of the clock source on the servers they are talking to.
Getting a UTC time from GPS involves the leap second and, during the event, agreeing on how it's implemented. Some systems smear, some double tick a 0 at midnight. It's a disaster.
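To make the "some systems smear" divergence concrete, here is a hypothetical sketch of a 24-hour linear leap smear (the general scheme Google has documented for its public NTP servers; the window length and centring here are assumptions). The extra second is spread evenly across the window, so no 61st second or repeated second ever appears, but a smearing clock disagrees with a stepping clock by up to a full second during the window.

```python
SMEAR_WINDOW = 86_400.0  # assumed 24 h window around the leap second

def smeared_offset(seconds_into_window: float) -> float:
    """Fraction of the leap second already applied at this point in
    the smear window: ramps linearly from 0.0 to 1.0."""
    return min(max(seconds_into_window / SMEAR_WINDOW, 0.0), 1.0)

# Halfway through the window a smearing clock is 0.5 s away from a
# clock that will step the whole second at midnight -- which is
# exactly the cross-system disagreement described above.
```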
I completely agree that any coordination should be TAI. But that’s not the world we have to interface with unfortunately.
This hell isn’t imaginary, I’m speaking from experience of seeing stuff fall out of sync on the last leap second due to some systems using GPS with no smear and others using NTP to a source that smears.
Huh? Ok, so what do you propose: should time keeping diverge from the day-night cycle, or should we have leap seconds? There is no third option - the Earth just doesn’t turn the way you want it to.
It becomes your problem in a hurry once you encounter a 61-second minute and your stuff all breaks.
Not really disagreeing just expanding on the conversation…
I’m pretty happy with computers storing time as a “weird number that we add another number to, to get the human time”… because we are already usually storing it as a second/millisecond/etc offset from an arbitrary fixed point in history. We’re talking about going from “current integer (prefix)seconds value -> (some math involving leap year formulas and fiddly stuff, not just x/365/24/etc) -> human display value” to adding one single subtraction step to the “some math” part, which already has a bunch of other fast assembly arithmetic in it.
As for divergence: we have two kinds of timekeeping. “Calendrical timekeeping”, which will always get fixed to be close enough to the solar year that no one has a problem with it, and “chronological timekeeping”, which is basically the science of keeping track of how many seconds have passed between arbitrary moments in time. For decades now, all calendrical timekeeping has been built on top of TAI and the science of chronological timekeeping: metrologists have kept fastidious watch over atomic clocks and matched the beating of their resonating frequencies to the astronomical record keeping, allowing ever more precise astronomy. The byproduct of this effort has been precise records of how long, in scientifically precise seconds, each day has been, and from these records we know we have to occasionally “add” a second if we want the “human time” to stay within 1 second of the astronomical time.

Consequently we can simply decide we are happy with more or less drift. One second is small enough we can kinda just deal with it, but imagine having a leap minute every hundred years? Coordinate that mess. Or how about a leap hour once a millennium? It’s trivial for people: we don’t really time anything as humans faster than a second, so if we skip one it’s pretty much OK. But it’s categorically an issue for computers, which should never have used UTC and should have always used TAI, because TAI doesn’t skip a second, so the fix would be a fixed lookup table (or a single if-branch based on the date) giving the number of seconds to add when doing the integer-representation-to-human-representation math before updating the clock on your screen.
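The "fixed lookup table plus one subtraction" idea above can be sketched in a few lines. The offsets here are the last few real entries of the published TAI-UTC table (the full table starts at 10 s in 1972); treating the effective dates as comparable directly against a TAI timestamp is an approximation that is only wrong within a minute or so of each boundary.

```python
from datetime import datetime, timedelta, timezone

# Last few entries of the TAI-UTC offset table (IERS Bulletin C values):
# (date the offset took effect, accumulated TAI-UTC seconds).
TAI_MINUS_UTC = [
    (datetime(2012, 7, 1, tzinfo=timezone.utc), 35),
    (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
    (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
]

def tai_to_utc(tai: datetime) -> datetime:
    """Display conversion: one table scan plus one subtraction."""
    offset = 34  # baseline before the partial table above (2009-2012)
    for effective, seconds in TAI_MINUS_UTC:
        if tai >= effective:        # approximation near each boundary
            offset = seconds
    return tai - timedelta(seconds=offset)
```

The internal count stays monotonic; only the human-facing rendering ever consults the table, which is the whole argument.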
The general rule of thumb for helping people do periodic jobs reliably is to have the job either be frequent enough that you can't forget, or never have to be done at all. Too infrequent, and you forget what to do. Too frequent, and you get sloppy. I think the application to the leap seconds/minutes/hours situation is obvious.
I would love to run TAI on my devices. But where can I find a public NTP server that offers TAI, or a GPS module that gives the same? Do these exist? Or is this the wrong approach?
As mentioned in a reply to a sibling comment, GPS time is (to a human-relevant, usable level of accuracy) TAI minus 19 seconds. The issue is that you can't really "run TAI" on something: most firmware isn't designed to keep track of time that way, and so expects to be working with UTC. The differences are subtle and basically boil down to "has no leap seconds and requires some math to accommodate this fact" versus "expects leap seconds to exist, and to be handled by humans and synchronization tools". The latter, which fundamentally assumes that time is non-monotonic, is at odds with the way hardware clocks work in silicon and circuitry.
So while you can design software to record things in TAI, and if you build your own hardware and firmware you can use TAI on them, at a normal day-to-day level you can't really switch. But you can join the slowly growing collection of educated and aware people who appreciate the UTC/TAI mismatch in our software systems and push in the correct direction. The arguments about freezing UTC and not adding any more leap seconds to it are, on the face of it, just stupid: we have GPS time, which is fixed to TAI, and we have TAI itself. We don't need to fundamentally change UTC just because we fucked up all the computers but don't want to fix them properly. This is one time where, counter to the Simpsons meme, the children are in fact wrong. UTC is harder for computers to use because UTC was the wrong choice, so the answer is to stop having the damn computers use UTC as the internal time representation and switch to an appropriate one.
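For what it's worth, there is one mainstream escape hatch: the Linux kernel (3.10+) exposes `CLOCK_TAI` alongside `CLOCK_REALTIME`. A sketch, with the caveat that the two clocks only differ if something (e.g. chrony or ntpd parsing leap-second data) has actually programmed the TAI offset into the kernel; on an unconfigured box they read identically.

```python
import time

# CLOCK_TAI is Linux-specific, so guard the attribute lookup.
if hasattr(time, "CLOCK_TAI"):
    tai = time.clock_gettime(time.CLOCK_TAI)
    utc = time.clock_gettime(time.CLOCK_REALTIME)
    # 37 on a correctly configured system as of the 2017 leap second;
    # 0 if the offset was never set in the kernel.
    print("kernel TAI-UTC offset:", round(tai - utc), "s")
```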
I seriously think the most effective way to get everyone to TAI would be for Apple to do it: make it part of their WWDC keynote. Every political leader would learn about TAI within 24 hours, from a politically neutral, technological source they consider a lodestar. They would ask their people, who would document the ask in a formal inquiry to the science and technology people in their government, who would then respond with an executive summary of a detailed plan to implement TAI on all government systems.
It is indeed “International Atomic Time” in English. For what it’s worth, I chose to use the French version because using the English name requires more explanation of why the acronym for “International Atomic Time” is TAI rather than IAT; using the French name, which matches the acronym, lets the curious google the topic on their own.
Also, for bonus points, the curious should read about UTC and how no one is precisely sure why that’s the version we got. Spoilers: neither the English acronym nor the French one is in the right order to give us the letters UTC, yet UTC is the acronym, and no one is sure exactly why. It may have been a compromise hashed out over drinks at some hotel bar between metrologists at a conference and written into the text that way without comment… it’s lost to history (at least the history I’ve been able to find when I ran down this rabbit hole).
UTC isn’t a mystery. There are various kinds of UT, depending on what corrections are applied to the raw observations: UT0, UT1, UT2. UT1 is the standard version of mean solar time, though GMT (when it was still a thing) used to be more like UT2. UTC is named according to the same scheme: Universal Time with a suffix denoting which kind of UT.
I’m sorry, we can only accept them as “enemies” if you have a suitably Machiavellian plan to defeat them. 12 points and a snazzy PowerPoint presentation in the nearest secret volcano lair if you please.
Most of the people responsible for implementing leap seconds want to get rid of them too. The international bureaucratic wheels are grinding slowly towards abolition: should be more successful this time than the previous attempt ~15-20 years ago.
The IERS web site says the EOC is at the Paris Observatory, but much of the data (such as Bulletin A) comes from the Rapid Service/Prediction Centre at the USNO.
UTC was originally (early 1960s) coordinated between the British and Americans. This was related to their existing collaborations in time and navigation. The definition of the atomic second was established in 1955-1958 by a collaboration between NPL (London), who had the first good atomic clock, and the USNO, who were good at the astrometric work needed to establish the ephemeris second. The USNO and the Greenwich observatory also prepared the Astronomical Almanac together.
Later on, as other countries wanted to use the same atomic timescale, the BIH (bureau international de l’heure, based at the Paris observatory) took over. More recently, the BIH was split.
The part responsible for atomic time moved to the BIPM (bureau international des poids et mesures, in Sévres just outside Paris) who are responsible for the definition of the second, so it makes sense for them to be responsible for its realisation as well. They maintain atomic time based on contributions from many atomic clocks in many time labs around the world.
The astronomy parts of the BIH became the IERS (international earth rotation service), still based at the Paris Observatory. However, a significant part of their practical scientific work is carried out by the USNO. For instance IERS Bulletin A is sent out each week by the USNO. Earth orientation is needed for the GPS as well as for leap seconds.
I love stuff like this so much. You take something as seemingly complex as time or weight and it ends up boiling down to a handful of people diligently crunching numbers, or a hunk of exactly 1 kg of platinum sitting in the basement of some French lab. Dispelling the "magic" behind these things utterly fascinates me.
While the Navy's timekeeping mission is fascinating in its own right (and was, and remains, a prerequisite for navigational abilities), the understated read-between-the-lines reason the Navy looks "outward" is that that's how it points the nukes.
> Currently, NOVAS has three different editions, for C, Fortran, and Python.
I imagine the Python version wraps one of the other two. But are the C and Fortran versions separate, or does one wrap the other? If not, how are they related? Parallel implementations kept in sync by hand? Used to check each other? Machine translated? Code-generated from an upstream definition?
My knowledge is a few years out of date (but I would be surprised if the situation is different now). The Fortran is the primary version, the C is a parallel implementation that follows the Fortran, and the Python wraps the C. Calculations from each are compared to each other as well as calculations produced by https://iausofa.org/
Cool patch in the top left. And, non-disparagingly, I found this charmingly old school in the bottom right: "Regular website maintenance is scheduled for Thursdays from 3-5pm Eastern Time, during which time the website may be unavailable."
I think the way the Celestial Reference Frame Department does things might be old-fashioned.
With new tech like blockchain for decentralized coordination, or machine learning algorithms that can process astronomical data more accurately and quickly, there might be better and easier ways to do celestial referencing. It might be time for a change to keep up with these new advancements.
Machine learning algorithms are good as a cheaper replacement for cheap customer service agents, but they are not trustworthy.
For important things (and having accurate and precise time is an important thing) you need to know in advance that the task will be completed correctly; you actually need to know why it will be completed correctly.
Anything based exclusively on old data, with no formal model, simply cannot give you any guarantee.
How do blockchains help with coordination (among three players, btw, as other posts point out — why even decentralize)?
What is wrong with the current proven algorithms/approach to celestial referencing (which is not rocket science, btw)? How would machine learning help, and how would it meet the accuracy and certainty needed here?
To summarize: OMFG!!!11 Or are you just a buzzwording bot?