Will we have a negative leap second? [pdf] (2022) (gps.gov)
89 points by fanf2 10 months ago | 72 comments



"Add an entire day to the calendar, no one bats an eye. And one second and everyone loses their minds".

I think the main issue with dealing with leap seconds is really a lack of awareness. Our calendars are already full of idiosyncrasies, but we've learned to deal with them just fine. When UTC was created, so was International Atomic Time (TAI). The overwhelming majority of systems are unaffected by leap seconds; for the few that are, it would simply be a matter of using TAI instead of UTC and having software readily convert between the two.
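
As a rough sketch of what such a conversion could look like (the offset table below is truncated and illustrative; real code would load the published leap-second list and keep it up to date):

    from datetime import datetime, timedelta, timezone

    # Truncated TAI-UTC offset table; the real list comes from IERS/BIPM
    # announcements and must be refreshed as new leap seconds are declared.
    TAI_MINUS_UTC = [
        (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
        (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
        (datetime(2012, 7, 1, tzinfo=timezone.utc), 35),
    ]

    def utc_to_tai(t):
        # Apply the most recent offset known at time t.
        for since, seconds in TAI_MINUS_UTC:
            if t >= since:
                return t + timedelta(seconds=seconds)
        raise ValueError("timestamp predates this toy table")

    print(utc_to_tai(datetime(2024, 6, 1, tzinfo=timezone.utc)))  # UTC + 37 s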


The problem with leap seconds is that any conversion between TAI and UTC for a time stamp more than 6 months in the future is undefined.

If we did the same thing with time zones or leap days, with some committee only announcing changes a few months in advance, then people would get very upset.


It does happen, just not all that frequently. The US and Canada changed daylight saving time's dates as recently as 2007, and time zone rules change in various parts of the world now and then. It happens frequently enough that a database is maintained to track the changes, so that systems programmed to use it can adjust.
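
For example, a quick sketch using Python's zoneinfo module (which reads the tz database): the 2007 US rule change shows up just by asking for the UTC offset on the same March date in consecutive years.

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+; backed by the tz database

    ny = ZoneInfo("America/New_York")
    # DST started in early April in 2006, but in mid-March from 2007 onward.
    print(datetime(2006, 3, 15, 12, tzinfo=ny).utcoffset())  # UTC-5 (EST)
    print(datetime(2007, 3, 15, 12, tzinfo=ny).utcoffset())  # UTC-4 (EDT)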


Think about what it would be like if this happened (or did not happen) every 6 months and you could never plan ahead.

And daylight saving is a local problem. In my country the rules have not changed for more than 4 decades.

The other problem is that most systems treat time zones and daylight saving as a presentation format: only time that is explicitly for human consumption deals with time zones, while internally computers mostly use UTC.

The problem with leap seconds is that computers often represent time as some number of seconds since an epoch. That requires a stable conversion between year-month-day-hour-minute-second and seconds since the epoch.

And that's where leap seconds are a problem. Because future leap seconds are undefined, this conversion is undefined for dates in the future.

That then results in a collection of hacks to let it work most of the time.
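
To make the "seconds since an epoch" point concrete (a small Python sketch; POSIX time is itself one of those hacks, in that it deliberately pretends every day has exactly 86400 seconds, which is what keeps the formula defined for future dates even though real UTC days occasionally are not 86400 seconds long):

    import calendar, time

    # Converting a future civil time to "seconds since the epoch" is a pure
    # formula only because leap seconds are ignored; every calendar day is
    # treated as exactly 86400 seconds.
    future = (2030, 1, 1, 0, 0, 0, 0, 0, 0)
    seconds = calendar.timegm(future)
    print(seconds)                    # well-defined today, even for 2030
    print(time.gmtime(seconds)[:6])   # round-trips to (2030, 1, 1, 0, 0, 0)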

Where I live, civil time is currently off by about one hour and 41 minutes, compared to solar time. It is safe to say that essentially nobody cares about solar time with more precision than one hour. If you want to know about sunrise or sunset, you just look it up. Noon is mostly irrelevant.


Some Muslim countries base their daylight savings on an Imam observing sunrise (or not) at a particular time.

The DST offset isn't known until that moment.


Systems that do this use the latest value for leap seconds. The conversion becomes ambiguous, but not undefined. Any system sensitive to the ambiguity would simply keep the time in TAI rather than convert it.


>some committee would only announce changes a few months in advance then people would get very upset.

This does happen, with daylight savings.


The following book may be of interest to some people in this discussion thread. It's a re-issue of a book published decades earlier, so of course it is not up to date. But it is deeply researched and really quite fascinating.

Munk, Walter, and Gordon J. MacDonald. The Rotation of the Earth: A Geophysical Discussion. Digitally printed version. Cambridge Monographs on Mechanics and Applied Mathematics. Cambridge: Cambridge University Press, 2009.


The author of the PDF was the head of Time Services at USNO for decades.


What's mts?


The math in his presentation seems to work out if you interpret it as milliseconds (e.g., the day is getting shorter by an average of 2.2 milliseconds every century).

I'm not sure why that particular abbreviation. I tried searching Google to find people using "mts" to refer to milliseconds, and it seems like it could be "millisecond timestamp" or "metric time series"??


Maybe a mistake? I would usually read "mts" as "minutes".


It looks like minutes but it can't be. It's milliseconds if you look at the numbers: "1.4 mts/day/century (0.5 sec/year/cty) historical slowdown"


This may be totally dumb, but if the variation is attributed to the tides, is it possible that global warming affected it? With the ice cap melting, there is more sea water sloshing around instead of just floating at the top.


It's not. Those effects are statistically insignificant compared to the rotation of the core mass.


Wrong. For rotation you don't look at mass but at moment of inertia. That scales like ~M*R^2, or ~rho*R^5 for density rho and distance R from the axis of rotation. So even though the core is much denser, it contributes far less to the Earth's total moment of inertia than its mass alone would suggest. Earth's rotation is even measurably affected by plants changing throughout the seasons. Glacial melt and earthquakes also have measurable effects that need to be compensated for. See here for the plot:

https://upload.wikimedia.org/wikipedia/commons/thumb/5/5b/De...

And here for a thorough explanation:

https://www.sciencedirect.com/science/article/pii/S167498471...


That second paper summarises that the changes are quite probably due to deformation which is caused by core movement. We are not debating seasonal variation.


That is one source. But not the source.


M*R^2 is understandable but what's this raised to 5th power stuff in rho*r^5? And is this lowercase r the same as the uppercase R?


Mass scales like density times radius cubed, so M*R^2 ~ rho*R^5. Technically the density is a function of position, so you would integrate over slices at varying distance from the axis, not just over varying density.
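
Spelled out for a uniform sphere, in the same notation as above:

    I = (2/5)*M*R^2  and  M = (4/3)*pi*rho*R^3,  so  I = (8/15)*pi*rho*R^5

The lowercase r and uppercase R refer to the same radius here; more generally you integrate rho(r) times the squared distance from the rotation axis over the volume, which is where the extra powers of R come from and why mass far from the axis counts for much more than an equal mass near the core.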


It is statistically significant. https://www.nature.com/articles/d41586-024-00850-x:

“Earth’s rotation seems to have accelerated, outpacing the time standard, and raising the possibility that an unprecedented ‘negative’ leap second might soon be required — a daunting prospect in a world reliant on consistent timekeeping.

Agnew reports that human-induced melting of polar ice exerts a slowing effect on Earth’s rotation, effectively delaying a decision on the need for a negative leap second.”

Global warming moved the date that we would need a negative leap second out by 3 years.


This paper was by a cosmologist with heaps of papers in other areas and just a single paper in this area, highly publicised in the climate change news.

All the reports around this were hyperbolic. I'd wait for a bit more research in this area. It reminds me of another recent paper that said possible close passes by other stars had a huge impact on the environment, according to simulations, with no mention of the problem of predicting orbits over tens of thousands of years.


The slides just say "predictable speedup starts 2020" but don't really explain it, except for a vague reference to the "19 year metonic cycle".

The question I really wanted answered going in is: why is the Earth's rotation speeding up since 2020?

If we look at 19y intervals in the past at the UT1-UTC plot, we can see something unusual around 2000, but nothing really at 1980. And leap seconds started in the 70s! So this is a fascinating set of slides but the question is still frustratingly open for me.


> The question I really wanted answered going in is: why is the Earth's rotation speeding up since 2020?

Wikipedia's chart of the length of the day explains this part [0]. Overall, the day has been getting shorter since 1970, though it oscillates somewhat due to the metonic cycle. (Which, as I understand it [1], is just the tides changing the shape of the Earth over the course of the Moon's orbit, causing a change in angular velocity, which adds up over the months.) It's only since 2020 that this has regularly gone under the 86400-second mark, making the integrated ΔT start to decrease. And the SI second itself was based on the year 1900; I assume even larger-scale effects caused the day to originally become longer since then.

Luckily, we seem to have already hit the low point of the current cycle around January 2023, so there shouldn't be much risk of a negative leap second in the next decade.

[0] https://en.wikipedia.org/wiki/Day_length_fluctuations?search...

[1] https://en.wikipedia.org/wiki/Tidal_acceleration
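
A toy illustration of that integration (the daily values below are made up, not measured): UT1-UTC changes each day by 86400 s minus that day's actual length, so a run of shorter-than-nominal days pushes UT1 ahead of UTC, which is the trend that would eventually force a negative leap second.

    # Hypothetical daily excess length-of-day values, (LOD - 86400 s) in ms.
    excess_ms = [-0.2, -0.5, 0.1, -0.3]

    ut1_minus_utc_ms = 0.0
    for e in excess_ms:
        ut1_minus_utc_ms -= e      # each shorter day moves UT1 further ahead of UTC

    print(ut1_minus_utc_ms)        # +0.9 ms accumulated over these four days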


Here's a fun thought: As humans dig heavy materials out of the earth and build taller buildings, we increase the moment of inertia of the planet. This theoretically slows the rotation of the earth.

Of course, given the huge mass of the earth, the effect so far would be negligible. But maybe it would be noticed in the distant future if it were done on a much much greater scale than it is today.

As more food for thought, we are also changing the moment of inertia by putting so many planes in the sky, and by sending mass to space. If every plane were to land, the earth should rotate slightly faster.
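
A back-of-envelope sketch for the aircraft case, with every input an assumed round number rather than measured data (roughly 20,000 aircraft aloft at about 100 t each, cruising near 10 km, treated as sitting over the equator for an upper bound):

    I_earth = 8.0e37            # Earth's moment of inertia, kg*m^2 (approx.)
    R = 6.371e6                 # Earth's mean radius, m
    m = 2.0e9                   # assumed total airborne mass, kg (20,000 x 100 t)
    h = 1.0e4                   # assumed cruise altitude, m

    delta_I = 2 * m * R * h     # change in moment of inertia from raising m by h
    delta_lod = 86400 * delta_I / I_earth
    print(delta_lod)            # ~3e-13 s: the day lengthens by a fraction of a picosecond

Even with generous assumptions, the day changes by far less than a nanosecond, so "negligible" holds by a very wide margin.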


Drift of Earth's Pole Confirms Groundwater Depletion as a Significant Contributor to Global Sea Level Rise 1993–2010

https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2023GL10...

    Humans Have Shifted Earth's Axis by Pumping Lots of Groundwater. For decades, humans have been pumping so much water out of the ground that it has caused Earth's axis of rotation to shift, according to a new study published last week in the journal Geophysical Research Letters.
https://www.smithsonianmag.com/smart-news/humans-have-shifte...



But on the other hand, this contributes to global warming, which melts a lot of water ice that sits in high places and will (as water does) go to low places.


Someone knowledgeable should do the math


I think it's not answered in the slides. Instead they offer two hypotheses for the _character_ of the observed decline in length-of-day over the last 20 years, with all known effects such as the Metonic cycle removed: random walk and linear.

They present the consequences of adopting one or the other prediction model, but only because one has to be adopted for practical decisions. Not because we necessarily know what's going on.

We can't know now which is the correct model. As always, only time and observation will tell.


I haven’t seen an explanation. I think it’s more surprising than the slides suggest, because if you look at the IERS charts there’s a clear change in the dX and dY plots in about 2019, corresponding to a change in gradient of UT1-UTC.

https://datacenter.iers.org/plots.php?id=6


"The Earth’s spin is slowing down. This is annoying." https://what-if.xkcd.com/26/


Dealing with leap seconds is very expensive. What happens if countries like China, Russia or Iran just decide to ignore leap seconds? We will have a clusterfuck of time zones divergent by a few seconds!

I am really into astronomy. But dealing with this, just so stars pass local meridian exactly at 00:00:00.000 is simply not worth it!

And one funny note: astronomers still use the Julian calendar (the one made by Caesar, without the 16th-century Gregorian corrections) to avoid similar issues. They avoid their own inventions!


I think the terminology is pretty confusing, but as I understand it, astronomers use Julian Day numbers [1], which is not really using the Julian calendar - it's no calendar at all, just a continuous count of days, so you don't need to think about the definition of leap days or leap years at all (until you want to convert back to a human-understandable form). The "Julian" comes from how the reference point was chosen in the Julian calendar.

[1] https://en.wikipedia.org/wiki/Julian_day


There are also variants; for some science cases it is preferable to leave the Earth's reference frame and move to, e.g., the barycentric reference frame and use the Barycentric Julian Date [1], which can itself be expressed in different standards (UTC, TAI (atomic time), TDB, etc.).

[1] https://en.wikipedia.org/wiki/Barycentric_Julian_Date


There’s an easy solution nobody seems to be considering. We put rockets on the moon to continually adjust its orbit in order to preserve perfect adherence to a 24-hour solar day.

Hell, once we’ve perfected this technique we could even use it to gradually pull us into a perfect 365-day year! Or better, 364 days, which is evenly divisible by 12 (edit: oops, 13) months.


I'm glad to see someone tackling the problem head-on instead of just coming up with workarounds.


> Or better, 364 days, which is evenly divisible by 12 months.

   364/12 = 30⅓
I wouldn’t call that much better. :-)


Whoops, I meant 13!


Leap seconds are in the process of being abolished. The ITU-R confirmed their part of the agreement at the world radiocommunication conference last year (the ITU-R’s triennial treaty update); the next stage is the CGPM in 2026 (the BIPM’s triennial treaty update).


Will there be another solution for the desynchronization of SI days and solar days? It looks like there’s been about 25 seconds difference in the past 30 years or so.


An option, if it gets way off, would be to just express it as a TZ shift. The current rate seems to be about a second every two years, so we have some hundreds of years before we would need to shift by even 15 minutes, an amount of divergence that I suspect would not really even be noticeable, especially as it develops over hundreds of years. The TZ code is used widely and regularly, and systems tend to know how to handle it. Leap seconds are always a bit of an adventure, from what I understand. (I'm not a sysadmin, so I don't know the details.)

Like -- I don't know, but is it actually important that the solar day is tied to "wall clock day" so snugly? So what if it's a couple minutes off...?


> Like -- I don't know, but is it actually important that the solar day is tied to "wall clock day" so snugly? So what if it's a couple minutes off...?

Even when you're assigned a correct hour-aligned time zone, you'll be off by an average of 15 minutes! Wall clock time just fundamentally has never been precise enough for leap seconds to solve a problem, by two orders of magnitude.


A 15 minute shift once every hundred years would wreak havoc on all of the software built when the shift was too far away to worry about.

Better to design in the requirement for all software to support these shifts, like a random 10 minute shift every day. That way nobody will write software that doesn’t support the shift.


Timezone aware code already has to process 15 minute timezones, IIRC; India is on a 30 minute timezone.

Some timezone changes are well published before the change and others are published after, and software has to pick up the pieces.

I'd imagine a 15 minute jump to recenter zones would be published with at least as much notice as changes to DST rules.


AFAICT from looking at the source for the tz database [0], it's capable of handling UTC offsets at least to a second level, if not fractional seconds.

[0] https://github.com/eggert/tz
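
One way to see that second-level precision is through Python's zoneinfo, which reads the compiled tz data (a sketch; the tz source records New York's pre-1883 local mean time offset as -4:56:02):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    ny = ZoneInfo("America/New_York")
    # Before standard time was adopted in 1883, the zone uses local mean time,
    # which the tz database stores to the second.
    print(datetime(1880, 1, 1, 12, tzinfo=ny).utcoffset())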


ISO 8601 and other display standards for timezones only support one minute resolution.


UT1 is available from the IERS if you need it.


We can move the time zone after 3000 years, when the difference will be 1 hour and noticeable.


Yep. Timezone definitions already change regularly due to governments messing around with time; so software is already set up to handle this case.

Though I wonder what happens with the international dateline -- would it shift around the world, or would the UTC offsets get larger and larger? I imagine the latter would eventually also result in software issues when the UTC offset starts exceeding 24h.


There are moves to abolish DST and I suspect that after that happens timezones will ossify, which will make it difficult to use them to compensate for a large DUT1. And after 1000 years of gradually shifting daylight, people will probably be used to noon being a bit late and may well prefer it.


In a thousand years it is quite possible there will be a sizable enough population off of Earth that trying to align time with the variable rotation of the Earth will be seen as pointless Terracentrism.


Maybe 1500 years and 30 min.


I've created a ticket in the backlog for this!


Maybe 750 years and 15 min.


Which leads to another interesting question: are we going to sync up UTC and TAI, or do we just have to live with a completely arbitrary several-second offset for all eternity?


Astronomers use "Julian Days" and "Julian Dates", but they are not at all based on the Julian calendar. They are just a linear count of days, which is easier to plug into algorithms. It is quite similar to the Unix timestamp, and it is quite easy to convert between the two: jd = (unixT / 86400) + 2440587.5 [1].
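
In code, that formula and its inverse are one-liners (ignoring the leap-second/UT1 subtleties discussed in the next paragraph):

    # JD 2440587.5 is 1970-01-01T00:00 UTC, the Unix epoch.
    def unix_to_jd(unix_seconds):
        return unix_seconds / 86400.0 + 2440587.5

    def jd_to_unix(jd):
        return (jd - 2440587.5) * 86400.0

    print(unix_to_jd(0))             # 2440587.5
    print(jd_to_unix(2451545.0))     # 946728000.0, i.e. 2000-01-01 12:00 UTC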

But astronomers have to deal with the non-uniform, unpredictable rotation of the Earth in some way. Leap seconds only help a little, and predictions can only be made a few months into the future, so it is necessary to download the latest Earth Orientation data from the International Earth Rotation Service [2].

E.g., for the Apr 8, 2024 solar eclipse, some popular camera control software packages had old predicted values for the Earth's rotation, some because the authors had not updated them and some because old installations didn't update the data automatically. That caused people who didn't notice the discrepancy to miss their shots.

[1] https://celestialprogramming.com/julian.html

[2] https://www.iers.org/IERS/EN/Publications/Bulletins/bulletin...


> I am really into astronomy. But dealing with this, just so stars pass local meridian exactly at 00:00:00.000 is simply not worth it!

And of course it still doesn't actually happen, because almost all locations (i.e. with measure 1!) are not aligned to their time zones.


Doesn’t astronomy already use TAI, which has no leap seconds?


Astronomy is a big field. In my field it is either UTC or TDB.


AFAIK, astronomers use Julian DATES. It's related but not the same as the Julian calendar.


I honestly don't understand why Russia, Iran, or China are the examples here.

Are leap seconds going to be a US-friendly vs. not-US-friendly thing?


At least in the case of Russia they voted against the removal of leap seconds.

https://arstechnica.com/science/2022/11/network-crashing-lea...


There has never been a negative leap second, so most systems are not ready for one. It is not just a question of time metadata.

And cooperation could be a problem. I cannot imagine Microsoft providing updates for illegal copies of Windows 7 in North Korea, or the NK government just accepting them.


I don't understand why we should care if NK's systems are on time or not.


We should probably care that at least some of their systems share our definition of time.

International agreements have start times (and a couple of seconds could matter).

Physical things (any future international power supply for example) might depend on it too.

For better or worse, we have to be more charitable with NK than they are with us.


> International agreements have start times (and a couple of seconds could matter).

International agreements get violated all the time. Good faith is more important than those couple of seconds.


Electricity exchange grids need to be phase-synchronized at 60Hz; that is a tolerance of a few milliseconds. The Baltic states hate Russia, but they are still connected this way, for example.

Maybe you do not care, but this stuff is extremely important. Airplanes could crash over a few seconds difference!


> Electricity exchange grids need to be phase-synchronized at 60Hz

Electric grids also make use of DC interconnects that obviate the need for that synchronization. The contiguous 48 US states have at least 3 separate grids that are interconnected, roughly east/west/Texas. There may be more, but I'm out of touch with it.


Interesting.

Though I wasn't really thinking about phase sync when I was talking about time and cross-border electricity (indeed, it hadn't occurred to me!).

Just scheduling, switching, co-ordination, etc. It seems likely to me that other international things (both physical and logical) rely at least as much on time being accurate to the second.


Looks like the Baltics are looking to synchronize with the rest of Europe by 2025 (and already stopped trading with Russia in 2022):

https://elering.ee/en/synchronization-continental-europe


> 60Hz

ITYM 50Hz



