Not really disagreeing, just expanding on the conversation…
I’m pretty happy with computers storing time as a “weird number that we add another number to to get the human time”… because we already usually store it as a second/millisecond/etc. offset from an arbitrary fixed point in history. We’re talking about going from “current integer (prefix-)seconds value -> (some math involving leap-year formulas and other fiddly stuff, not just x/365/24/etc.) -> human display value” to adding one single subtraction step to the “some math” part, which already involves a bunch of other fast arithmetic instructions.
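As a minimal sketch of that “one extra subtraction”: the existing pipeline already turns an integer epoch count into calendar fields, and a TAI-based clock would just subtract the current TAI−UTC offset first. The constant below (37 s, the offset since the 2017 leap second) and the function name are illustrative assumptions, not any particular system’s API.

```python
import time

# Assumption for illustration: the TAI-UTC offset in force today
# (37 seconds since the leap second of 2017-01-01).
TAI_UTC_OFFSET = 37

def tai_to_display(tai_seconds: int) -> str:
    # One extra subtraction, then the usual integer-seconds ->
    # calendar-fields math the system already does.
    utc_seconds = tai_seconds - TAI_UTC_OFFSET
    return time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(utc_seconds))
```

The point is that the subtraction is trivial next to the leap-year/calendar arithmetic that conversion already performs.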
As for divergence: we have two kinds of timekeeping. “Calendrical timekeeping”, which will always get adjusted to stay close enough to the solar year that nobody has a problem with it, and “chronological timekeeping”, which is basically the science of keeping track of how many seconds have passed between arbitrary moments in time. Calendrical timekeeping has for decades been built on top of TAI and the science of chronological timekeeping: metrologists have kept fastidious watch over atomic clocks and matched the beating of their resonant frequencies against the astronomical record to enable ever more precise astronomy. A byproduct of that effort is a precise record of how long, in scientifically exact seconds, each day has been, and from those records we know we occasionally have to “add” a second if we want “human time” to stay within one second of astronomical time. Consequently we can simply decide how much drift we’re happy with. One second is small enough that we can kinda just deal with it, but imagine having a leap minute every hundred years? Coordinate that mess. Or how about a leap hour once a millennium? It’s trivial for people: we don’t really time anything as humans faster than a second, so if we skip one it’s pretty much ok… but it’s categorically an issue for computers… which should never have used UTC and should always have used TAI, because TAI doesn’t skip a second, and so the fix would be “a fixed lookup table, or a single if branch based on the date, giving the number of seconds to add when doing the integer-representation-to-human-representation mathematics before updating the clock on your screen”.
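That “fixed lookup table” fix can be sketched like this: a small table of past leap-second insertions and the cumulative TAI−UTC offset after each one, searched by date. The three table entries are the real last three leap seconds (2012, 2015, 2017), but the function names and the abbreviated table are my own illustration, not code from any library.

```python
import bisect

# Abbreviated leap-second table: (UTC epoch second when the new offset
# took effect, cumulative TAI-UTC offset from then on). The full table
# is published by the IERS; only the last three entries are shown here.
LEAP_TABLE = [
    (1341100800, 35),  # 2012-07-01
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]
_BOUNDARIES = [t for t, _ in LEAP_TABLE]

def tai_utc_offset(epoch_seconds: int) -> int:
    """Offset in force at a given moment (34 s before the first entry)."""
    i = bisect.bisect_right(_BOUNDARIES, epoch_seconds)
    return LEAP_TABLE[i - 1][1] if i else 34

def tai_to_utc(tai_seconds: int) -> int:
    # The whole "fix": subtract the offset in force on that date,
    # then hand the result to the ordinary epoch-to-calendar math.
    return tai_seconds - tai_utc_offset(tai_seconds)
```

Because TAI never skips, the table only ever grows by one row per announced leap second, and the conversion stays a table lookup plus a subtraction.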
The general rule of thumb for helping people do periodic jobs reliably is to have the job either be frequent enough that you can't forget it, or never have to be done at all. Too infrequent, and you forget what to do. Too frequent, and you get sloppy. I think the application to the leap seconds/minutes/hours situation is obvious.