The line I look at is
UT1-UTC = -0.1204 + 0.00020 (MJD - 59593) - (UT2-UT1)
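For context, that prediction line from Bulletin A can be evaluated directly. A minimal sketch in Python, assuming the standard seasonal UT2−UT1 formula that Bulletin A itself prints alongside the prediction (the coefficients above change with each weekly bulletin):

```python
import math

def ut2_minus_ut1(mjd):
    # Besselian year, per the note printed in Bulletin A
    t = 2000.0 + (mjd - 51544.03) / 365.2422
    return (0.022 * math.sin(2 * math.pi * t)
            - 0.012 * math.cos(2 * math.pi * t)
            - 0.006 * math.sin(4 * math.pi * t)
            + 0.007 * math.cos(4 * math.pi * t))

def predicted_ut1_minus_utc(mjd):
    # Coefficients quoted in the line above, valid only near MJD 59593
    return -0.1204 + 0.00020 * (mjd - 59593) - ut2_minus_ut1(mjd)

print(predicted_ut1_minus_utc(59593))  # roughly -0.117 s
```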
Actually, I have some code that looks at Bulletin A for me, https://github.com/fanf2/bulletin-a which until recently was estimating that the next leap second would be a negative leap second in about 2028/9. But now its guess is receding into the future past 2030.
I have a few tweet threads on this topic, which you can find via https://twitter.com/fanf/status/1478767806187552776 with some clicky-clicky to dig the older ones out of twitter’s delightful user interface.
See also https://news.ycombinator.com/item?id=25145870 for some previous discussion here on this topic.
So nothing dramatic is likely to happen any time soon, though the current situation is quite weird.
Now, cheap DCF77 clocks (e.g., alarm clocks) don't care about the leap second bit anyway; they just adjust a few minutes later during a normal resync, and they will do so just fine with a negative leap second. But some clocks, most likely those hidden in industrial systems, controllers, etc., may need to stay exact, and I am sure it will be fun to find out which systems really care, which will then fail, and in what ways. :-)
Interestingly, the Japanese and the British atomic radio clock signals can do negative leap seconds:
For WWVB, whether negative leap seconds are supported seems to depend on whether amplitude or phase shift modulation is used.
Likewise WWVB has a leap second warning bit. But it includes a UT1 sign bit so the user can know if the leap will be positive or negative. This design is fine and any WWVB clock or software can properly handle positive or negative leap seconds.
This is true for both the amplitude and the phase modulation formats.
That part of the ITU used to be CCIR, the international consultative committee for radio, which goes a little way to explaining it, but it is still strange that it isn’t the responsibility of the IAU, IERS, or SI.
1. Seconds in one time system last the same amount of time as seconds in another time system
2. Seconds in the same time system last the same amount of time as each other
3. Time numbers are strictly increasing - the same time number can't happen again
4. [etc, kill me]
That's part of the genius of Leslie Lamport: embracing the fact that your distributed clocks aren't consistent, and looking at effects in distributed systems from a light-cone perspective, just as you do in relativity.
So at the end of the day it doesn't matter if it's in 'error' or a fundamental to the physics even under perfect conditions. Neither are really under your control and both have the same answer.
> You can't escape answering the question: do I care about the time elapsed at home or the time elapsed here?
We already have that concept, in the difference between CLOCK_REALTIME (the wall clock) and CLOCK_MONOTONIC. You use monotonic for "local temporal reference frame" and the wall clock for "proxy of globally visible temporal reference frame". The wall clock shifts around anyway as NTP or what have you figures out how skewed off of "actual" time you are.
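A quick illustration of the two clock families in Python, where `time.time()` is the wall clock and `time.monotonic()` corresponds to CLOCK_MONOTONIC; the key property is that the monotonic clock only moves forward, even if NTP steps the wall clock mid-measurement:

```python
import time

wall_before = time.time()       # wall clock: can be stepped by NTP, admin, leap handling
mono_before = time.monotonic()  # monotonic clock: only ever moves forward

time.sleep(0.1)  # stand-in for some work being timed

wall_elapsed = time.time() - wall_before
mono_elapsed = time.monotonic() - mono_before

# For measuring durations, trust the monotonic clock: a wall-clock step
# during the sleep would corrupt wall_elapsed but not mono_elapsed.
print(f"wall: {wall_elapsed:.3f}s, monotonic: {mono_elapsed:.3f}s")
```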
Yes you can. When error swamps the difference between two things, you don't have to choose.
Do you set your analog alarm clock to wake you at 6:00:00 or 6:00:05?
Anyone on Earth can just say their computer clocks are set to "Earth", ignore relativity, and use their normal clock skew correction algorithms.
It's not "You can't escape answering the question". It's "Interplanetary spacemen can't escape answering the question." And those spacemen might not even use seconds.
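"Normal clock skew correction" here is essentially what NTP does. A sketch of the classic four-timestamp offset/delay calculation from the NTP spec (the timestamp values below are made up for illustration):

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Classic NTP clock-filter arithmetic.
    t1: client send time, t2: server receive time,
    t3: server send time, t4: client receive time.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2  # estimated client clock error vs server
    delay = (t4 - t1) - (t3 - t2)         # round-trip network delay
    return offset, delay

# Hypothetical timestamps, in seconds:
offset, delay = ntp_offset_delay(t1=100.00, t2=100.52, t3=100.53, t4=100.07)
print(offset, delay)  # roughly 0.49 and 0.06
```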
We should also define the length of a second to mean "at sea level in Greenwich".
The quest to eliminate irreproducibility from metrology is why SI did all that work to finally retire the kilogram prototype. The second was redefined reproducibly decades ago precisely because trying to define it based on our planet's inconstant spin was such a failure.
The fact that the Earth's spin is not constant is perhaps frustrating, but like other facts just insisting it isn't true doesn't help you.
But the size of the discrepancy between Colorado and London is far smaller than the discrepancy Leap Seconds are concerned with right?
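Right. A back-of-envelope check, assuming Boulder, Colorado sits roughly 1650 m above sea level and using the weak-field approximation Δf/f ≈ gΔh/c²:

```python
g = 9.80665          # m/s^2, standard gravity
c = 299_792_458.0    # m/s, speed of light
dh = 1650.0          # m, rough altitude of Boulder above sea level

fractional_shift = g * dh / c**2              # dimensionless, ~1.8e-13
seconds_per_year = fractional_shift * 365.25 * 86400

print(f"{fractional_shift:.2e} -> {seconds_per_year * 1e6:.1f} microseconds/year")
```

That comes out to a few microseconds per year, versus the roughly one second every year or two that leap seconds correct for: about five orders of magnitude smaller.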
Why would it? My redefinition just means the length of a second is defined as a given number of cycles of caesium in a gravitational field equivalent to that of Greenwich.
If you're in a different gravitational field, you just apply a straightforward conversion.
That's why I proposed changing the second definition to 9192631770 cycles in a gravitational field equivalent to that of Greenwich.
> Tidal datums must be updated at least every 20-25 years due to global sea level rise. Some stations are more frequently updated due to high relative sea level trends.
If we are going so far as to rigidly include "sea level" in the definition of "a second", it would probably help to say that we mean mean sea level.
Know what I mean?
HN discussion (among others less recent and/or less popular): https://news.ycombinator.com/item?id=19922062
More falsehoods: https://infiniteundo.com/post/25509354022/more-falsehoods-pr...
Instead, have two calendars: the normal date-time is standardised, no leap anything, and simply increases. The orbital time tracks the Earth's revolution around the sun.
Sure, eventually the two will drift. But leap days account for about three weeks of time for the average person throughout their life (let's assume an 80-year lifespan for argument's sake). So this doesn't seem so hard to adjust to.
Measurement systems ultimately serve the needs of those who are doing the measuring. This fact alone accounts for many of the idiosyncratic aspects of traditional measurement systems. An acre-foot measurement (a volume of water) makes sense when you've got a pond whose surface area (in acres) is known, and whose depth (in feet) can be measured against a post or dam face. An acre itself is the amount of land a farmer can plough in a day (in German, Tagwerke), and would vary by terrain, farmer, plough, draught animals, and hitches.
Time systems suitable for other needs may come to be used. So long as a substantial human population remains on Earth, an Earth-centric time system will all but certainly exist, and adapt to changes in the Earth-Moon-Sun system's orbital and rotational characteristics.
I suspect the real reason may be overlapping domains of varying precision/accuracy, which tolerate different degrees of error in any specific time measurement but still need to agree on a shared center of the distribution of that error.
So in our daily lives, we might tolerate a fraction of a second of drift without notice... But there are some specialty domains that do require more precision, and we need the time measurements in both domains to mostly line up, to avoid confusion. So our daily clock needs to be slaved to the more precise clock, even if we don't require that precision in most things.
But we do all sort of dumb time related things:
- daylight saving time
- leap seconds, days, years
Some places too close to the poles don't have normal day/night cycles of light. So we already tolerate a lot of weird time stuff. This way we keep tolerating it and get rid of the calculations.
You need to learn the actual reasons why precision synchronous time exists. Until you do that, you're just wandering around in the dark, being silly.
In my system, the time is simply a number that increments. No calculations are ever needed to correct anything.
This is what most people use to schedule things.
The second "clock" tracks the position of the Earth around the sun. You need it for the seasons. But this too is always exact; there will never be a situation where we need to adjust it. You can never reach the end of a year and be slightly short on the distance travelled around the sun. So one day Xmas is slightly warmer than usual. But so what?
But the real magic of the calendar is that billions of individuals use the same calendar for the same problem and thus we can all agree what day it is, every single day.
So if there's an issue with the existing calendar, then simply adjusting our calendar by an amount really no-one will notice is by far the most practical solution. Almost every industry cycles with the season; we need the calendar to track it accurately.
The article is all about trying to keep two things in sync because of the slowing of the Earth's rotation. You haven't actually described why we need to go to this effort, or why it is a practical thing to do in general, beyond maintaining the current system.
Loads of people accept weird time shifts all the time. The poles have six-month days and nights. In the summer I get three extra hours of daylight and treat 7pm as "day". We already adjust what we do by what it looks like outside. Why does the time need to "match"?
If our calendar is off by a second or two a year, it's very easy to correct without confusing the general populace - most people want to spend precisely zero time thinking about calendars and orbits. Currently, only a few nerds have to worry about leap seconds and the like. And it's part of their jobs, so it's OK.
In a two-calendar system, every human will have to keep track of a personal calendar and a scientific calendar. Asking people who don't care to do that - and to fix an issue we already have an almost zero-friction fix for - is a non-starter.
Keeping the time in sync with the orbit of the earth isn't just about the "here and now" but also history.
...aaaand there's an xkcd:
Applying it to millions of individual servers with different OSes? I'll make the popcorn.
A leap second is over in a second. With leap smear you are out of sync with the rest of the world for many hours. And, there are at least three different leap smear schemes I know of: a massive judgment failure that happens again and again.
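For reference, the best-known of those schemes is the one Google documented: a 24-hour linear smear centred on the leap second, stretching every second in the window by 1/86400. A sketch under those assumptions (the helper name and interface are mine):

```python
SMEAR_WINDOW = 86400.0  # seconds: noon-to-noon around the leap, per Google's published scheme

def smear_fraction(seconds_into_window):
    """Fraction of the leap second absorbed so far (hypothetical helper).

    A linear smear: each second in the window is stretched (or shrunk,
    for a negative leap) by 1/86400, so the clock never displays a
    23:59:60 and never skips a 23:59:59.
    """
    return min(max(seconds_into_window / SMEAR_WINDOW, 0.0), 1.0)

print(smear_fraction(0))      # 0.0 -- in sync at window start
print(smear_fraction(43200))  # 0.5 -- half the leap absorbed at the leap instant itself
print(smear_fraction(86400))  # 1.0 -- fully absorbed by window end
```

During those 24 hours the smeared clock is up to half a second away from anyone applying UTC strictly, which is exactly the "out of sync with the rest of the world for many hours" complaint.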
The sensible alternative would be to eliminate leap seconds entirely for a few decades, and maybe stage a leap minute, eventually, or a leap hour in some far-off century, or never. It doesn't matter if the sun is ever exactly overhead at noon, because it isn't anyway, almost everywhere.
Another issue is that a positive leap second cannot be represented using time_t-like representations. But at least every time_t value maps to a UTC time.
By contrast a negative leap second, should one occur, will cause a permanent illegal time_t value that has no existence in UTC. Checking for that illegal value in every piece of code that uses time_t values will not be fun.
Positive leap seconds, when translated into a dumb format and back, can be off by a second and won't round trip properly.
Negative leap seconds, when translated out of a dumb format and back, can be off by a second and won't round trip properly.
Seems pretty close to me.
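The positive-leap-second half of that asymmetry is easy to demonstrate with POSIX-style (86400 seconds/day) conversion routines, which is what Python's `calendar.timegm`/`time.gmtime` pair implements: 23:59:60 converts to a time_t, but it comes back as 00:00:00.

```python
import calendar
import time

# The actual leap second at the end of 2016, as a broken-down UTC time.
leap = (2016, 12, 31, 23, 59, 60, 0, 0, 0)

t = calendar.timegm(leap)  # silently normalised into a plain seconds count
back = time.gmtime(t)

print(back.tm_year, back.tm_mon, back.tm_mday, back.tm_hour, back.tm_min, back.tm_sec)
# -> 2017 1 1 0 0 0: the :60 second did not survive the round trip
```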
There's also the exciting version where some of your time servers do it one way, and some do another; you could probably pick two traditional servers, and then one of each of the others and have a really terrible day. From experience with a setup where 4 servers were traditional and one was smeared, there was a lot of variation in time; but thankfully my systems weren't relying on time measured on different servers to be very consistent.
IMHO, leap smear probably should have been done with standard time on the wire, and the host agent doing the smear. But that would be more effort than doing it on the time servers, so there you go.
Pre-Y2K there was a lot of fuss and hand-wringing, but there was also a lot of software development that happened to ensure that Y2K would not be a disaster. So then when Y2K rolled around the preparations mostly worked.
My first programming job in high school was over-hauling a (mainframe-hosted) code base for handling lab data at a hospital. This was in the mid-90's; before most of the Y2K fuss had started. Without my work, pee and poo samples would be time traveling from the late Victorian era. You're welcome.
But then all anyone ever sees is that nothing went wrong, and they assume it was nothing to begin with.
The problem isn't one where you hold your breath and squeeze your eyes shut for a second while the unpleasantness passes. The problem is when you have tons of code running and you don't know exactly how it'll deal with this very rare event.
If you're faced with an upcoming leap second there are various ways of dealing with it. One way is to do an audit of the code and hope that you can verify the most important parts. Another way is to smear and hope that most programs won't be negatively affected by reported seconds being 0.00001s off.
There are other options, but I don't think there's an "obviously correct way".
> There are other options, but I don't think there's an "obviously correct way".
Well, there are ways that following reality more and less. There are also ways that have legal traceability to time sources in some regulated industries.
Just change your time zone offsets at that point, honestly.
The FreeBSD folks test for this:
The kernel and ntpd are fine with it. Applications: ¯\_(ツ)_/¯
Too late, your IoT-enabled microwave just crashed and can't heat up the kernels
It makes time calculations unreasonably complex without any benefit.
1. 86400 seconds a day
2. SI seconds
3. Noon synced up with the sun
With UTC, we drop 1.
With UT1, we drop 2.
With TAI, we drop 3.
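In code, the difference between the scales is just bookkeeping. A sketch of the UTC→TAI offset lookup with a deliberately partial, hardcoded leap-second table (real systems take this from tzdata or IERS Bulletin C):

```python
# (Unix time of change, TAI-UTC in seconds from that instant) -- last few entries only
LEAP_TABLE = [
    (1341100800, 35),  # 2012-07-01
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]

def tai_minus_utc(unix_time):
    offset = 34  # value in force before the first table entry above
    for epoch, off in LEAP_TABLE:
        if unix_time >= epoch:
            offset = off
    return offset

print(tai_minus_utc(1600000000))  # 37 -- TAI has been 37 s ahead of UTC since 2017
```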
Maybe we should have gone with TAI, or with UT1 - I think that's what Julia does, sort of:
Right, and nobody's stopping you from using a sundial instead of a GPS watch if you don't care about high precision timekeeping. And if you do care about high precision timekeeping but hate leap seconds, you can use International Atomic Time (TAI). It all comes down to standardization. You might enjoy reading this:
Basically, standardization is hard. Not just for time, but any measurement (weight, length, etc). How do you absolutely guarantee that an airplane part made in France and the same airplane part made in Brazil are exactly the same dimensions/weight? You need standards. And those standards have slightly changed over time in an effort to achieve higher consistency/future-proofing.
Time is especially difficult because gravity and speed affect time thanks to relativity. So how can you come up with a universally agreed upon definition of "one second"? The current best solution we have is to take a weighted average of 300+ atomic clocks. That's how we define TAI, and by extension, the second. However, people expect noon to be "the time when the sun is at its highest point in the sky". Leap seconds keep non-TAI time systems within 0.9 seconds of the Earth-rotation timescale, UT1. Without leap seconds, your local day will slowly drift until (millennia later) "noon" is happening in the middle of the night. And without leap years (which keep the calendar aligned with the planet's orbit around the sun) your seasons will slowly drift for the same reason.
A) No they don't, which makes sense because
B) No it isn't
And "millennia later" is really overstating how soon this would happen. If you did this, every few millennia if people are very angry about it they would merely need to change their time zone definition one step. Now, how many millennia does it take to change time zone definitions where you live? Oh that's right they weren't even invented a thousand years ago.
So, firstly, people do not actually expect the sun to be "at the highest point in the sky" at 1200 local time, it hasn't been in their lived experience so why would they? They'd be surprised if it happened at 0800 or 1600, but precisely 1200 doesn't matter to them. Which is good because it isn't, it hardly could be.
Nobody was proposing to eliminate leap years (although perhaps renaming them "leap days" and marking them as a holiday might be better) which are a whole different problem.
There is a confounding issue, though: the continual messing around with daylight saving means that time zones have to be relatively easy to alter. But if (as seems likely) daylight saving is abolished, this flexibility will ossify, so it might not be so easy to use in the distant future…
...so you want to replace leap seconds with leap hours?
Time zone changes occasionally happen already.
2. It's a time zone change, and those already happen more often than leap seconds.
I don't know about you, but I never change the time zone on my servers. Ever.
Either way, these code paths get tested relatively often.
If the Earth speeds up enough, we might find ourselves pondering the possibility of a negative leap second. According to the Time and Date folks, a day in 2021 is averaging about 0.2 ms shorter than 86400 atomic seconds, ~70 ms/year, so at most 14 years of this would put us over the threshold (super unlikely). In reality, we don't have to accumulate a full 1000 ms of difference, because there was already a fractional offset in UT1-UTC.
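The arithmetic of that estimate is quick to check:

```python
ms_fast_per_day = 0.2                     # average 2021 day vs 86400 SI seconds
ms_per_year = ms_fast_per_day * 365.25    # ~73 ms/year
years_to_one_second = 1000 / ms_per_year  # ~13.7 years at the 2021 rate

print(f"{ms_per_year:.0f} ms/year, {years_to_one_second:.1f} years to accumulate 1 s")
```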
I recently took an introductory Latin course and learned that ancient Roman time operated on the same principle. As the days got longer in the summer, so did the absolute length of time encompassed by an hour. Hard to make a sundial to automatically compensate, I imagine.
More precisely the difference between two sunrises or two sunsets was 24 hours, so the hours were longer in winter and spring as days get longer, and shorter in summer and fall as days get shorter.
> TAI is a statistical timescale which is produced at the BIPM by combining the reports from many atomic clocks around the world. Anyone who wants to find out what TAI it is now must mark the time with some local clock that contributes to TAI. Then some weeks after now when all clocks have been combined the difference between the local clock and TAI will be known. At that point it can be said what TAI it was at the moment of interest.
> There are several interesting features here. First of all, before 1977 the clocks contributing to TAI were not corrected for the gravitational redshift. Because most clocks are above sea level they tick faster than TAI should tick. Through the mid-1980s there is an annual wobble due to seasonal environmental changes at some of the clock sites. In 1995 a CCTF working group deemed that the clock frequencies should be corrected for thermal radiation, and the CIPM affirmed this in 1997. The steering of TAI to the corrected frequency occurred over three years from 1995 to 1998, and the final levelling of the curve over those years indicates that the frequency of TAI is now consistent with cesium atoms at 0 Kelvin.
Basically even with TAI there are all sorts of time deviations as an artifact of history.