Hate leap seconds? Imagine a negative one (counting.substack.com)
156 points by rdpintqogeogsaa 13 days ago | 108 comments

Since that was published, the earth has been slowing down gradually. I keep a casual eye on IERS Bulletin A, which has the latest earth rotation observations, predictions for the next year, and a simple formula for longer-term estimates. https://www.iers.org/IERS/EN/Publications/Bulletins/bulletin...

The line I look at is

  UT1-UTC = -0.1204 + 0.00020 (MJD - 59593) - (UT2-UT1)
which gives the current value of UT1-UTC (if the -0.1204 offset grows to more than about +0.5 s then we need a negative leap second) and the estimated long-term difference between earth's rotation and 24x60x60 SI seconds per day (currently 0.20 ms/day fast; it was up to 0.26 ms/day fast a few months ago).
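A sketch of evaluating that line (my code, not the bulletin's; the constants go stale with each new issue, and the UT2-UT1 seasonal term is the conventional annual/semiannual model that Bulletin A defines alongside the formula):

  import math

  def ut2_minus_ut1(mjd):
      # Conventional seasonal term (seconds); t is the Besselian year.
      t = 2000.0 + (mjd - 51544.03) / 365.2422
      return (0.022 * math.sin(2 * math.pi * t)
              - 0.012 * math.cos(2 * math.pi * t)
              - 0.006 * math.sin(4 * math.pi * t)
              + 0.007 * math.cos(4 * math.pi * t))

  def ut1_minus_utc(mjd):
      # The quoted Bulletin A long-term formula.
      return -0.1204 + 0.00020 * (mjd - 59593) - ut2_minus_ut1(mjd)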

Actually, I have some code that looks at Bulletin A for me, https://github.com/fanf2/bulletin-a which until recently was estimating that the next leap second would be a negative leap second in about 2028/9. But now its guess is receding into the future past 2030.

I have a few tweet threads on this topic, which you can find via https://twitter.com/fanf/status/1478767806187552776 with some clicky-clicky to dig the older ones out of twitter’s delightful user interface.

See also https://news.ycombinator.com/item?id=25145870 for some previous discussion here on this topic.

So nothing dramatic is likely to happen any time soon, though the current situation is quite weird.



I'm a little confused: if you look at the updated version of the same graph from the article (https://datacenter.iers.org/singlePlot.php?plotname=Bulletin...), it shows the speeding-up behavior continuing, i.e. a positive slope on the UT1-UTC graph. Or am I misinterpreting things?

I was mostly talking about the slope of those charts: the line is still going upwards, but more slowly. The seasonal effects (the regular lumps) are large so it is hard to tell from the graphs what the long term trend is. My guesses are based on the IERS estimate for what happens after the detailed predictions in Bulletin A, and their charts don’t go that far into the future.

Ahhh, I see. So rotation is still speeding up, but that speeding-up is itself slowing, and your belief is that, before too long, it'll go back to the normal slowing-down state?

Have a look at the rate of earth rotation over the past 250 years where it is unclear that there is long term acceleration or deceleration. https://www.ucolick.org/~sla/leapsecs/dutc.html

If it was going the same amount "faster", the slope upwards would be the same. But it's lessened significantly, even if it is still currently heading towards a negative leap second being necessary.

A negative leap second would be a fun experiment. E.g. in Germany, the atomic radio clock signal DCF77 does have a bit to announce a positive leap second, but it has no way of announcing a negative leap second.

https://en.wikipedia.org/wiki/DCF77

Now, cheap DCF77 clocks (e.g., alarm clocks) don't care about the leap second bit anyway, but they will just adjust a few minutes later during normal resync. And they will do so just fine with a negative leap second. But some clocks, most probably those hidden in industrial systems, controllers, etc., may need to stay exact, and I am sure it's fun to find out which systems really care and will then fail and in what ways. :-)

Interestingly, the Japanese and the British atomic radio clock signals can do negative leap seconds:

https://en.wikipedia.org/wiki/JJY

https://en.wikipedia.org/wiki/Time_from_NPL_(MSF)

For WWVB, whether negative leap seconds are supported seems to depend on whether amplitude or phase shift modulation is used.


DCF77 has a leap second warning bit. PTB says it is up to the user to know whether it is positive or negative. If a clock blindly assumes it is a positive leap, that's an error in the design of the clock. This is not ideal; see the sketch below.

Likewise WWVB has a leap second warning bit. But it includes a UT1 sign bit so the user can know if the leap will be positive or negative. This design is fine and any WWVB clock or software can properly handle positive or negative leap seconds.

This is true for both the amplitude and the phase modulation formats.
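To make the DCF77 limitation concrete, here's roughly all a decoder can do (assuming bit 19 of the minute frame is the announcement flag; schedule_leap is a hypothetical handler):

  def on_dcf77_minute(bits):
      # Bit 19 announces a leap second at the end of the coming hour,
      # but carries no sign, so a decoder can only guess (or be told
      # out-of-band) whether the leap is positive or negative.
      if bits[19]:
          schedule_leap(positive=True)  # the usual hard-coded guess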


It is a curious fact that UTC is specified by the ITU in recommendation TF.460 https://www.itu.int/rec/R-REC-TF.460-6-200202-I/en which also includes a specification for how radio time broadcasts should be formatted. I don’t know if any time signals actually use that format, tho.

That part of the ITU used to be CCIR, the international consultative committee for radio, which goes a little way to explaining it, but it is still strange that it isn’t the responsibility of the IAU, IERS, or SI.


"Strange" is what they said at the SI subcommittee in charge of the subject, just after the CCIR incepted leap seconds: "The CCIR may have overstepped its remit in defining the UTC", and "The process that led to UTC may have been illogical, but it was effective".

Now we're going to have "lies programmers believe about time":

1. Seconds in one time system last the same amount of time as seconds in another time system

2. Seconds in the same time system last the same amount of time as each other

3. Time numbers are strictly increasing - the same time number can't happen again

4. [etc, kill me]


Just wait until we reach the proper space age and start traveling at speeds close to the speed of light. Taking the relativistic effects into account is going to make more than a few coders switch careers to organic farming.

GPS satellite designers already have to account for clock skew due to relativistic effects. A nice little real world confirmation of the theory.

Luckily clock skew already happens and is handled more or less by the current systems. Whether the system is moving at relativistic speeds or a crystal is slightly out of spec, it adds up to the same result: an incorrect number of ticks for a given proper time interval.

That's part of the genius of Leslie Lamport, embracing the fact that your distributed clocks aren't consistent, so start looking at effects through distributed systems from a light cone perspective just like you do with relativity.
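For anyone who hasn't seen it, a minimal sketch of Lamport's logical clock idea (illustrative, not from the thread):

  class LamportClock:
      # Each process keeps a counter; causally related events end up
      # ordered even though no two physical clocks agree.
      def __init__(self):
          self.time = 0

      def local_event(self):
          self.time += 1
          return self.time

      def send(self):
          self.time += 1
          return self.time  # attach this to the outgoing message

      def receive(self, msg_time):
          self.time = max(self.time, msg_time) + 1
          return self.time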


Clock skew is error. Relativistic time dilation is not; both clocks will be right and yet disagree. You can't escape answering the question: do I care about the time elapsed at home or the time elapsed here?

At the end of the day there's always skew; crystals in systems aren't made to perfect tolerances. In fact I'm not even sure how you would define perfect tolerances for clocks in a meaningful way.

So at the end of the day it doesn't matter if it's "error" or fundamental to the physics even under perfect conditions. Neither is really under your control and both have the same answer.

> You can't escape answering the question: do I care about the time elapsed at home or the time elapsed here?

We already have that concept, in the difference between CLOCK_REALTIME and CLOCK_MONOTONIC. You use monotonic for "local temporal reference frame" and the realtime (wall) clock for "proxy of globally visible temporal reference frame". The wall clock shifts around anyway as NTP or what have you figures out how skewed off of "actual" time you are.
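In Python terms, a toy illustration of the same split:

  import time

  # Wall clock: meaningful epoch, but can jump when NTP steps the clock.
  event_timestamp = time.time()

  # Monotonic clock: no meaningful epoch, but never goes backwards,
  # which makes it the right tool for local elapsed-time measurement.
  start = time.monotonic()
  time.sleep(0.1)
  elapsed = time.monotonic() - start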


If you need time elapsed then you implicitly need a frame of reference, in the same way that a clock time needs a time zone to be meaningful on a global basis. In practice if you wanted this it would mean having a timer colocated within your frame of reference, as there is no scalar representation of a frame of reference that would allow you to do timestamp calculations. Timestamp math only works as a hack because we mostly stay relativistically close to Greenwich.

Astronomers use mathematical models of the solar system to translate between barycentric time, geocentric time, and terrestrial time, which are all different relativistic frames of reference. Only TT has actual clocks: TAI ticks at the same rate as TT, tho for historical reasons there is a fixed offset. That isn't counting the clocks derived from observing the orbits of the moons and planets…

> You can't escape answering the question

Yes you can. When error swamps the difference between two things, you don't have to choose.

Do you set your analog alarm clock to wake you at 6:00:00 or 6:00:05?


Is that not choosing?

The answer I was implying is: You can't distinguish between them, you put the arm in the vague vicinity of both. So no it's not choosing. You could choose which to aim at if you really wanted to, but the end result would be basically the same. You can't choose which one it's set to, because it's so far below the noise floor.

Today it is. My point is if a ship goes on an 80 year journey at an appreciable speed then its wall clock will disagree by an appreciable amount. If we have hundreds or thousands of such ships and they communicate with each other, someone will need to choose what clock they care about and it won't be the Earth's clock.

Well I meant the real you/me. Not a hypothetical spaceman.

Anyone on Earth can just say their computer clocks are set to "Earth", ignore relativity, and use their normal clock skew correction algorithms.

It's not "You can't escape answering the question". It's "Interplanetary spacemen can't escape answering the question." And those spacemen might not even use seconds.


We already define time relative to 1970-01-01, relative to the mean solar noon in Greenwich.

We should also define the length of a second to mean "at sea level in Greenwich".


This just means the "length of a second" wanders around randomly and is thus useless. So now "a second" isn't any particular amount of time, and you've lost all ability to really measure time except in a vague inexact sense.

The quest to eliminate irreproducibility from metrology is why SI did all that work to finally eliminate the kilogram prototype. The second was defined reproducibly decades ago because trying to define it based on our planet's inconstant spin was such a failure.

The fact that the Earth's spin is not constant is perhaps frustrating, but like other facts just insisting it isn't true doesn't help you.


Right, but general relativity matters for how astronomers define their timescales, and how they are measured and maintained in practice: there are different scales for barycentric time (at the centre of gravity of the solar system), geocentric time (centre of the earth) and terrestrial time (on the surface of the rotating geoid). TAI is supposed to tick at the same rate as TT, so it depends on a model of an equal-gravitation version of sea level (the geoid) and the relative height of the time labs. In particular, NIST is a mile high in Colorado so their clocks tick a lot faster than (say) NPL by the Thames in west London.

Sure, the quest for Universal Time is despite the name necessarily a local quest so you can't meaningfully export your results to creatures orbiting a distant quasar as you can with the SI units.

But the size of the discrepancy between Colorado and London is far smaller than the discrepancy Leap Seconds are concerned with right?


Yeah, gravitational time dilation is about 1e-16 seconds per metre of elevation near the surface http://leapsecond.com/great2016a/index.htm so Boulder is about 1e-13 fast. The rotation of the earth is stable to about 1e-9 http://leapsecond.com/ten/ which is also about the same as the rate difference between UT1 and UTC (0.2 ms / 86400).
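Spelled out (Boulder's ~1600 m elevation is my assumption):

  rate_per_metre = 1e-16                 # fractional rate shift per metre
  boulder_rate = 1600 * rate_per_metre   # ~1.6e-13: how fast NIST's clocks run

  ut1_utc_rate = 0.2e-3 / 86400          # 0.2 ms/day as a fractional rate
  # ~2.3e-9: about four orders of magnitude larger than the gravitational
  # effect, so elevation is irrelevant at leap second scales.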

During the discussions leading to the inception of the SI second the bigwig standards folks nixed the consideration of relativity, basically paraphrased as "If you physicists do not stop arguing about relativity we will never redefine time in a way that removes it from the astronomers."

> This just means the "length of a second" wanders around randomly and is thus useless

Why would it? My redefinition just means the length of a second is defined as a given number of cycles of caesium in a gravitational field equivalent to that of Greenwich.

If you're in a different gravitational field, you'll just have to convert the values, which is easy enough.


These days, the length of a second is defined by the frequency of the transition between the two hyperfine levels of the ground state of a caesium-133 atom, which is fixed at 9192631770 cycles per second. No need to check tide tables.

That definition is still relative. It still requires as additional factor the place where the measurement is taken, due to Einstein's theories.

That's why I proposed changing the second definition to 9192631770 cycles in a gravitational field equivalent to that of Greenwich.


Sea level when the tide is in, or out?

“sea level” is short for “mean sea level”

Not trying to troll, but... What happens if/when global warming makes sea level rise? Do we have measurements that are still based on "sea level"?

https://tidesandcurrents.noaa.gov/sltrends/faq.html

> Tidal datums must be updated at least every 20-25 years due to global sea level rise. Some stations are more frequently updated due to high relative sea level trends.


I mean, I feel like that's fairly obvious, and the playful pedantry of my comment in a discussion on the very definition of basic unit of measurement was overlooked.

If we are going so far as to rigidly include "sea level" in the definition of "a second", it would probably help if we say that we mean mean sea level.

Know what I mean?


Those corrections (both special and general relativistic, with opposite signs) are already required to allow GPS to function.

I'd like to think my descendants will be beet farmers in a far away solar system.

We do: all these are taken into account for satellites. In particular GPS is one of the prime examples as it is affected by both special and general relativity and how fast time elapses is critical to its function.

Probably takes some serious skills to be an organic farmer on Ceres.

Conveniently it turns out that the results are the same independent of your skill.

It does, but you make more money than you would using standard farming methods and pesticides on Ceres.


A collection of several falsehoods programmers believe lists, not just time:

https://github.com/kdeldycke/awesome-falsehood


Why wouldn't we decouple time from the earth's orbit?

Instead, have 2 calendars: the normal date-time is standardised, with no leap anything, and simply increases. The orbital time tracks the earth's revolution around the sun.

Sure, eventually the two will drift. But leap days account for 3 weeks-ish of time for the average person throughout their life (let's assume an 80-year life span for argument's sake). So this doesn't seem so hard to adjust to.


Earth clock time and calendars arose specifically to measure time as concerns earth-related experiences --- events within a day, throughout the year, and across multiple years.

Measurement systems ultimately serve the needs of those who are doing the measuring. This fact alone accounts for many of the idiosyncratic aspects of traditional measurement systems. An acre-foot measurement (a volume of water) makes sense when you've got a pond whose surface area (in acres) is known, and whose depth (in feet) can be measured against a post or dam face. An acre itself is the amount of land a farmer can plough in a day (in German, Tagwerke), and would vary by terrain, farmer, plough, draught animals, and hitches.

Time systems suitable for other needs may come to be used. So long as a substantial human population remains on Earth, an Earth-centric time system will all but certainly exist, and adapt to changes in the Earth-Moon-Sun system's orbital and rotational characteristics.


When I first read this, my initial reaction was "LOL what a silly... of course the simple reason is..." ... and then it turns out, I couldn't actually explain it to myself. I had a couple of reasons lined up, but none of them passed muster when I broke it down.

I suspect the real reason may be overlapping domains of varying precision/accuracy, which tolerate different degrees of error on any specific time measurement, but still need to agree on a shared center of the distribution of that error.

So in our daily lives, we might tolerate a fraction of a second of drift without notice... But there are some specialty domains that do require more precision, and we need the time measurements in both domains to mostly line up, to avoid confusion. So our daily clock needs to be slaved to the more precise clock, even if we don't require that precision in most things.


It's not obvious to me that there would be a loss of precision. If you are a farmer, you check the solar calendar and plan activities based on that. For your daily work, you still have the normal 24-hour day.

But we do all sorts of dumb time-related things: daylight saving; leap seconds, days, and years.

Some places too close to the poles don't have normal day/night cycles of light. So we already tolerate a lot of weird time stuff. This way we keep tolerating it and get rid of the calculations.


Wait... Why is all that timekeeping stuff "dumb"? Because you don't see a reason for it, and you can't imagine that you might not yet know the reason why? That's just plain egotistical, man.

https://wiki.lesswrong.com/wiki/Chesterton%27s_Fence

You need to learn the actual reasons why precision synchronous time exists. Until you do that, you're just wandering around in the dark, being silly.


Look at how bad people are at dealing with timezones. Having two similar but subtly different overloadings of the same terms is a recipe for confusion (having very distinct terms would actually be an improvement).

Leap seconds are so much simpler than trying to wrangle two calendars with built in drift

I'm not sure how you figure that.

In my system, the time is simply a number that increments. No calculations are ever needed to correct anything. This is what most people use to schedule things.

The second "clock" tracks position of the earth around the sun. You need it for the seasons. But this too is always exact. There will never be a situation where we need to adjust it. You can never reach the end of a year and be slightly short on the distance traveled around the sun. So one day Xmas is slightly warming than usual. But so what?


Every person using their calendar is solving one simple problem in a simple way.

But the real magic of the calendar is that billions of individuals use the same calendar for the same problem and thus we can all agree what day it is, every single day.

So if there's an issue with the existing calendar, then simply adjusting our calendar by an amount really no-one will notice is by far the most practical solution. Almost every industry cycles with the season; we need the calendar to track it accurately.


I have not broken any of that though.

The article is all about trying to get 2 things to sync because of the slowing down of the earth's rotation. You haven't actually described why we need to go to this effort, nor why it is a practical thing to do in general, beyond maintaining the current system.

Loads of people accept weird time shifts all the time. The poles have 6-month days and nights. In the summer I get 3 extra hours of daylight and treat 7pm as "day". We already adjust what we do by what it looks like outside. Why does the time need to "match"?


Everything hinges on our orbit around the sun - at the most fundamental level: seasons, crops, construction schedules, even military campaigns. From there on up we get holidays, sport seasons, business cycles. These things have to match up to the calendar from one year to the next or there will be chaos.

If our calendar is off by a second or two a year, it's very easy to correct without confusing the general populace - most people want to spend precisely zero time thinking about calendars and orbits. Currently, only a few nerds have to worry about leap seconds and the like. And it's part of their jobs, so it's OK.

In a two-calendar system, every human will have to keep track of a personal calendar and a scientific calendar. Asking people who don't care to do that - and to fix an issue we already have an almost zero-friction fix for - is a non-starter.


With your calendar, wouldn't the drift eventually be so bad that the month we currently call January might be your calendar's October? When people read old stories about the fall leaves of Halloween people would think it was weird because there'd normally be snow on the ground in October, or the dead heat of summer in the southern hemisphere. And what's this about a "white Christmas?" in Spring?

Keeping the time in sync with the orbit of the earth isn't just about the "here and now" but also history.


Here is one of many. Several are also shown in this thread: https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b...

I took leap seconds for granted, then I read about Google "smearing" leap seconds. And then (on HN) I read about someone else working on a standard for smearing them differently than Google does, just because.

...aaaand there's an xkcd:

https://xkcd.com/2266/


Applying a negative leap second to a fleet is a solved problem, using Google's "leap smear" technique. https://developers.google.com/time/smear

Applying it to millions of individual servers with different OSes? I'll make the popcorn.


Leap smear is itself an overwhelmingly bigger problem than a leap second.

A leap second is over in a second. With leap smear you are out of sync with the rest of the world for many hours. And, there are at least three different leap smear schemes I know of: a massive judgment failure that happens again and again.

The sensible alternative would be to eliminate leap seconds entirely for a few decades, and maybe stage a leap minute, eventually, or a leap hour in some far-off century, or never. It doesn't matter if the sun is ever exactly overhead at noon, because it isn't anyway, almost everywhere.


It's not particularly interesting to be out of sync with the rest of the world: you always are, and messages to/from external entities are even more so. It becomes interesting when the thing you are doing is a function of time, e.g. GPS or lock-step systems spanning geographic regions. In almost all of those cases you're probably well used to dealing with time problems that are a bigger pain than a leap second. For those that aren't, hiding the leap second and being off by slightly more than usual for a bit is probably better than hoping everyone is prepared to suddenly care a lot about handling time oddities for one second every year (or century, as you propose).

"Or never. Is never good for you?"

Only a positive leap second is “over in a second”. A negative leap second is essentially “over before it starts”. This asymmetry is an opportunity for code to implement UTC incorrectly.

Another issue is that a positive leap second cannot be represented using time_t-like representations. But at least every time_t value maps to a UTC time.

By contrast a negative leap second, should one occur, will cause a permanent illegal time_t value that has no existence in UTC. Checking for that illegal value in every piece of code that uses time_t values will not be fun.
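A sketch of the asymmetry (the 2029 date is hypothetical; calendar.timegm implements the POSIX rule that every day has 86400 seconds):

  import calendar

  def posix_time(y, mo, d, h, mi, s):
      # POSIX time_t pretends every day has exactly 86400 seconds.
      return calendar.timegm((y, mo, d, h, mi, s))

  # Positive leap: 23:59:60 collapses onto midnight's time_t value,
  # so two UTC instants share one time_t.
  # Hypothetical negative leap at the end of 2029-06-30: UTC would jump
  # from 23:59:58 straight to 00:00:00, so this value...
  never_happened = posix_time(2029, 6, 30, 23, 59, 59)
  # ...remains a perfectly ordinary time_t that maps to no UTC instant.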


> By contrast a negative leap second, should one occur, will cause a permanent illegal time_t value that has no existence in UTC. Checking for that illegal value in every piece of code that uses time_t values will not be fun.

Positive leap seconds, when translated into a dumb format and back, can be off by a second and won't round trip properly.

Negative leap seconds, when translated out of a dumb format and back, can be off by a second and won't round trip properly.

Seems pretty close to me.


> And, there are at least three different leap smear schemes I know of: a massive judgment failure that happens again and again.

There's also the exciting version where some of your time servers do it one way, and some do another; you could probably pick two traditional servers, and then one of each of the others and have a really terrible day. From experience with a setup where 4 servers were traditional and one was smeared, there was a lot of variation in time; but thankfully my systems weren't relying on time measured on different servers to be very consistent.

IMHO, leap smear probably should have been done with standard time on the wire, and the host agent doing the smear. But that would be more effort than doing it on the time servers, so there you go.


If I've learned one thing about dealing with exceptional events, it's that the less often they happen, the less likely you are to handle them well. Switching from leap seconds every few years to leap minutes every century pretty much guarantees we'll screw it up. But maybe that's OK? Y2K was a fat nothingburger, after all the fuss.

It's a little like a public health success: if done right, the worst that happens is that people have to prepare, and people complain about it, but disasters are avoided.

Pre-Y2K there was a lot of fuss and hand-wringing, but there was also a lot of software development that happened to ensure that Y2K would not be a disaster. So then when Y2K rolled around the preparations mostly worked.

My first programming job in high school was overhauling a (mainframe-hosted) code base for handling lab data at a hospital. This was in the mid-'90s, before most of the Y2K fuss had started. Without my work, pee and poo samples would be time traveling from the late Victorian era. You're welcome.


Y2K was a fat nothingburger, because of all the fuss. Millions of person-years and $300 billion were spent to avert disaster - and it worked!

But then all anyone ever sees is that nothing went wrong, and they assume it was nothing to begin with.


That's why we should eliminate leap seconds/minutes/hours and just change time zone definitions to keep the sun overhead near noon.

Is the concept that UTC would become "detached" from a particular location on Earth, and the offsets of time zones relative to UTC would change over time (very infrequently, since it would take on the order of a millennium for a time zone to shift by half an hour)?

Exactly.

Push it out to a century, and then hope we have the ability to slightly accelerate the planet by then?

> A leap second is over in a second.

The problem isn't one where you hold your breath and squeeze your eyes shut for a second while the unpleasantness passes. The problem is when you have tons of code running and you don't know exactly how it'll deal with this very rare event.

If you're faced with an upcoming leap second there are various ways of dealing with it. One way is to do an audit of the code and hope that you can verify the most important parts. Another way is to smear and hope that most programs won't be negatively affected by reported seconds being 0.00001s off.

There are other options, but I don't think there's an "obviously correct way".
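For concreteness, a sketch of the linear 24-hour smear that Google describes at https://developers.google.com/time/smear (other schemes use different windows and curves, which is part of the problem):

  def smear_fraction(seconds_into_window, window=86400.0):
      # Fraction of the leap second applied so far, clamped to [0, 1];
      # each smeared second is 1/86400 (~11.6 us) longer than an SI second.
      return min(max(seconds_into_window / window, 0.0), 1.0)

  def smeared_clock(true_time, window_start):
      # For a positive leap the smeared clock runs slow, ending the
      # window exactly one second behind where it would have been.
      return true_time - smear_fraction(true_time - window_start)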


> Another way is to smear and hope that most programs won't be negatively affected by reported seconds being 0.00001s off.

> There are other options, but I don't think there's an "obviously correct way".

Well, there are ways that follow reality more and less closely. There are also ways that have legal traceability to time sources in some regulated industries.


> a leap hour in some far-off century

Just change your time zone offsets at that point, honestly.


> Applying it to millions of individual servers with different OSes? I'll make the popcorn.

The FreeBSD folks test for this:

* https://lists.freebsd.org/pipermail/freebsd-stable/2020-Nove...

The kernel and ntpd are fine with it. Applications: ¯\_(ツ)_/¯


> I'll make the popcorn.

Too late, your IoT-enabled microwave just crashed and can't heat up the kernels


I don't understand why we need to have leap seconds. Even if time on Earth moves a few minutes back, nobody would notice but astronomers. If time moves one hour back, just change your local zone offset; it happens all the time anyway, and that's about it.

It makes time calculations unreasonably complex without any benefit.


Pick any two:

1. 86400 seconds a day

2. SI seconds

3. Noon synced up with the sun

With UTC, we drop 1.

With UT1, we drop 2.

With TAI, we drop 3.

Maybe we should have gone with TAI, or with UT1 - I think that's what Julia does, sort of:

https://docs.julialang.org/en/v1/stdlib/Dates/#footnote-1


Using TAI was also djb's solution to the problem. https://cr.yp.to/proto/utctai.html

Yep, just use TAI, you're already converting time to show it locally, so no problem also including leap seconds in that conversion
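A sketch of that conversion (the leap table is truncated and illustrative; real code should load a current one, e.g. from tzdata or the IERS):

  # (Unix time when the offset takes effect, TAI-UTC in seconds)
  LEAPS = [
      (1341100800, 35),  # 2012-07-01
      (1435708800, 36),  # 2015-07-01
      (1483228800, 37),  # 2017-01-01
  ]

  def tai_to_unix(tai):
      # `tai` counts seconds since the epoch in TAI, with no leap handling.
      offset = 34  # TAI-UTC before the first entry above
      for since, off in LEAPS:
          if tai - off >= since:
              offset = off
      return tai - offset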

> I don't understand why do we need to have leap seconds. Even if time on Earth moves few minutes back, nobody would notice but astronomers.

Right, and nobody's stopping you from using a sundial instead of a GPS watch if you don't care about high precision timekeeping. And if you do care about high precision timekeeping but hate leap seconds, you can use International Atomic Time (TAI). It all comes down to standardization. You might enjoy reading this:

https://www.nist.gov/pml/time-and-frequency-division/nist-ti...

Basically, standardization is hard. Not just for time, but any measurement (weight, length, etc). How do you absolutely guarantee that an airplane part made in France and the same airplane part made in Brazil are exactly the same dimensions/weight? You need standards. And those standards have slightly changed over time in an effort to achieve higher consistency/future-proofing.

Time is especially difficult because gravity and speed affect time thanks to relativity. So how can you come up with a universally agreed upon definition of "one second"? The current best solution we have is to take a weighted average of 300+ atomic clocks. That's how we define TAI, and by extension, the second. However, people expect noon to be "the time when the sun is at its highest point in the sky". Leap seconds are used to keep non-TAI time systems aligned with the rotation of the earth. Without leap seconds, your local day will slowly drift until (millennia later) "noon" is happening in the middle of the night. And without leap years (which align the calendar with the planet's orbit around the sun) your seasons will slowly drift for the same reason.


> people expect noon to be "the time when the sun is at its highest point in the sky"

A) No they don't, which makes sense because

B) No it isn't

And "millennia later" is really overstating how soon this would happen. If you did this, every few millennia if people are very angry about it they would merely need to change their time zone definition one step. Now, how many millennia does it take to change time zone definitions where you live? Oh that's right they weren't even invented a thousand years ago.

So, firstly, people do not actually expect the sun to be "at the highest point in the sky" at 1200 local time, it hasn't been in their lived experience so why would they? They'd be surprised if it happened at 0800 or 1600, but precisely 1200 doesn't matter to them. Which is good because it isn't, it hardly could be.

Nobody was proposing to eliminate leap years (although perhaps renaming them "leap days" and marking them as a holiday might be better) which are a whole different problem.


And leap seconds will stop working (as currently specified) in one or two thousand years when the earth is slow enough that we need ~12 each year. Time zone adjustments will continue to work much longer. https://www.ucolick.org/~sla/leapsecs/dutc.html

There is a confounding issue, though: the continual messing around with daylight saving means that time zones have to be relatively easy to alter. But if (as seems likely) daylight saving is abolished, this flexibility will ossify, so it might not be so easy to use in the distant future…


If we don't apply corrections to UTC to match earth's rotation, their difference will increase roughly quadratically. Here is a table with estimates of when which delta (DUTC) will be reached: https://www.ucolick.org/~sla/leapsecs/dutc.html#dutctable
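The quadratic shape follows from the day lengthening roughly linearly; a back-of-envelope sketch using the oft-quoted tidal-braking figure of ~1.8 ms/day per century (my numbers, for illustration):

  def accumulated_offset_seconds(centuries, lengthening_ms=1.8):
      # If each day's excess length grows linearly, the accumulated
      # difference is the integral of that: quadratic in time.
      days = centuries * 36525
      mean_excess_ms = lengthening_ms * centuries / 2
      return days * mean_excess_ms / 1000

  # accumulated_offset_seconds(1) ~= 33 s: roughly the historical
  # growth rate of Delta-T per century squared.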

> If time moves one hour back, just change your local zone offset

...so you want to replace leap seconds with leap hours?


Calculate how often leap hours will happen.

So, yes.

Time zone changes occasionally happen already.


You're complaining about the difficulty of special-case code handling, and your solution is make an even specialer case which will happen so infrequently that no one will handle it until it does happen.

1. When we're talking about hundreds of years, even if it was that special it would cause less total disruption.

2. It's a time zone change, and those already happen more often than leap seconds.


> It's a time zone change, and those already happen more often than leap seconds.

I don't know about you, but I never change the time zone on my servers. Ever.


I'm living in a place where the time zone offset/rules were changed 3 times in the last 30 years, because politicians can't make up their minds about what they really want.

Does the offset of your server's time zone also never change?

Either way, these code paths get tested relatively often.


I’m fine with punting on problems that won’t have to be solved for literally thousands of years.

This is a fascinating read.

If the Earth speeds up enough, we might find ourselves pondering the possibility of a negative leap second. According to the Time and Date folks [0], a day in 2021 is averaging about 0.2 ms shorter than 86400 atomic seconds, i.e. ~70 ms/year, so at most 14 years of this would put us over the threshold (super unlikely). In reality, we don't have to accumulate a full 1000 ms of difference because there was always a fractional difference in UT1-UTC.

[0] https://www.timeanddate.com/time/earth-faster-rotation.html


> It’s important to know that there are exactly 86400 (24x60x60) UT1 seconds in a UT1 day. If the rotation of the Earth speeds up/slows down, the definition of a UT1 second changes since the number of seconds in a day is constant.

I recently took an introductory Latin course and learned that ancient Roman time operated on the same principle. As the days got longer in the summer, so did the absolute length of time encompassed by an hour. Hard to make a sundial to automatically compensate, I imagine.


A sundial measures the angle between a point on the surface of the earth, the centre of rotation of the earth, and the sun. Because the earth rotates at a constant-ish speed, a sundial naturally shows equal hours (a quick geometric sketch follows the links below). It is possible, but tricky, to make an unequal-hours sundial:

https://www.cl.cam.ac.uk/~fhk1/Sundials/Newnham/write-up.pdf

https://www.cl.cam.ac.uk/~fhk1/Sundials/Newnham/article01.pd...

https://www.cl.cam.ac.uk/~fhk1/Sundials/Newnham/article02.pd...

https://www.cl.cam.ac.uk/~fhk1/Sundials/Selwyn/Selwyn.pdf
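For the geometry-minded, here's the promised sketch (mine, using the standard horizontal-dial formula):

  import math

  def hour_line_angle(hour_angle_deg, latitude_deg):
      # Horizontal dial with a polar gnomon: tan(theta) = sin(phi) * tan(H).
      # The hour lines depend only on latitude, not on the season, which
      # is why such a dial naturally shows equal hours.
      H = math.radians(hour_angle_deg)
      phi = math.radians(latitude_deg)
      return math.degrees(math.atan(math.sin(phi) * math.tan(H)))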


I don't remember the details but it's actually just as easy. They're called unequal hours, Babylonian hours or Italic hours.

More precisely the difference between two sunrises or two sunsets was 24 hours, so the hours were longer in winter and spring as days get longer, and shorter in summer and fall as days get shorter.


Why are we adjusting, when after 100 years the clock would be less than 1 min off? We should deprecate UTC and just use TAI; the difference is currently 37 s.

If we don't correct time-keeping to correspond to earth's rotation, their difference will increase roughly quadratically. Here is a table with estimates of when which delta (DUTC) will be reached: https://www.ucolick.org/~sla/leapsecs/dutc.html#dutctable

I found this section very interesting:

> TAI is a statistical timescale which is produced at the BIPM by combining the reports from many atomic clocks around the world. Anyone who wants to find out what TAI it is now must mark the time with some local clock that contributes to TAI. Then some weeks after now when all clocks have been combined the difference between the local clock and TAI will be known. At that point it can be said what TAI it was at the moment of interest.

> There are several interesting features here. First of all, before 1977 the clocks contributing to TAI were not corrected for the gravitational redshift. Because most clocks are above sea level they tick faster than TAI should tick. Through the mid-1980s there is an annual wobble due to seasonal environmental changes at some of the clock sites. In 1995 a CCTF working group deemed that the clock frequencies should be corrected for thermal radiation, and the CIPM affirmed this in 1997. The steering of TAI to the corrected frequency occurred over three years from 1995 to 1998, and the final levelling of the curve over those years indicates that the frequency of TAI is now consistent with cesium atoms at 0 Kelvin.

Basically even with TAI there are all sorts of time deviations as an artifact of history.


Is it possible to configure my computer to run on TAI and then use the time zone to say that I'm at TAI+10:00:37? When a leap second happens, treat it as a DST transition.


Martin Kleppmann, author of "Designing Data Intensive Applications" talks about this in his excellent video series (skip to 14:30 for negative leap seconds etc) https://www.youtube.com/watch?v=FQ_2N3AQu0M&list=PLeKd45zvjc...

I'm actually fascinated by the climate change angle here. While it's already clear that our inability to tackle the climate crisis is having a physical, cultural and mental effect on our way of life, a temporal effect is something I don't think a lot of people have considered.

FYI you can find an extensive discussion of this article on Reddit: https://www.reddit.com/r/programming/comments/s2mij2/

I'm not sure why people are creating problems for themselves. Define UTC as the norm, since it's the most stable, and those who want to use UT1 for space/solar-related purposes can use it and drift away from UTC, and that's it?


