The leap second’s time is up: world votes to stop pausing clocks (nature.com)
684 points by tinalumfoil on Nov 18, 2022 | 439 comments



In 2015 I was working at a "fintech" company and a leap second was announced. It was scheduled for a Wednesday, unlike all others before which had happened on the weekend, when markets were closed.

When the previous leap second was applied, a bunch of our Linux servers had kernel panics for some reason, so needless to say everyone was really concerned about a leap second happening during trading hours.

So I was assigned to make sure nothing bad would happen. I spent a month in the lab, simulating the leap second by fast forwarding clocks for all our different applications, testing different NTP implementations (I like chrony, for what it's worth). I had heaps of meetings with our partners trying to figure out what their plans were (they had none), and test what would happen if their clocks went backwards. I had to learn about how to install the leap seconds file into a bunch of software I never even knew existed, write various recovery scripts, and at one point was knee-deep in ntpd and Solaris kernel code.

After all that, the day before it was scheduled, the whole trading world agreed to halt the markets for 15 minutes before/after the leap second, so all my work was for nothing. I'm not sure what the moral is here, if there is one.


Reminds me of the story of the computer engineer at Data General in Tracy Kidder's nonfiction book, "The Soul of a New Machine" [0], who quit after spending weeks toiling away on sub-second timing concerns:

> He went away from the basement and left this note on his terminal: "I'm going to a commune in Vermont and will deal with no unit of time shorter than a season."

[0] https://en.m.wikipedia.org/wiki/The_Soul_of_a_New_Machine


What can I say, trying to go sub-millisecond, one profiling run at a time, sigh...


Sub-millisecond timing is basically impossible with context switching involved. Even on a real-time kernel config, the best you are going to get with a time slice is 1 ms.

With busy waiting, you can achieve timings in the sub-microsecond range, depending on how heavy the loop is.
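(If it helps to make that concrete, here's a minimal busy-wait sketch in Python; the 250 µs deadline and the function name are made up for illustration, and `time.perf_counter_ns` is a monotonic clock, so the measured overshoot is the jitter you actually get on your machine:)

    import time

    def busy_wait_until(deadline_ns):
        # Spin on the monotonic clock until the deadline; return overshoot in ns.
        while True:
            now = time.perf_counter_ns()
            if now >= deadline_ns:
                return now - deadline_ns

    # Wake up 250 microseconds from now and see how late we actually were.
    target = time.perf_counter_ns() + 250_000
    print("overshoot:", busy_wait_until(target), "ns")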


Not sure what you mean by that, but nanosecond-precise timing across CPU cores has been reliable for quite a while now.

Key words: constant TSC, invariant TSC, TSC clocksource


The hardware providing those nanoseconds is not nanosecond-accurate (unless you're using an atomic clock).

However, they may be more accurate than 1 ms. The guy is saying that the timeslice the OS gives a program to run in is at best a 1 ms slot, because the OS is switching between threads on a 1 ms timeslice basis.

So unless you're polling, the timing at which you ask the hardware for nanoseconds will jitter with 1ms offsets


There are a lot of applications in the world that don't run on a regular processor under regular linux

The guy is overgeneralizing IMO. You can delve deep into the sub-millisecond world even as a regular dude with a regular computer and a regular OS by just doing everything in an interrupt context.


No, TSC on modern CPUs is much more granular than that. No atomic clocks needed, just a normal quartz crystal. This is how PTP works and you can definitely get sub-nanosecond accuracy from it. Wrt scheduling quantum, this is entirely configurable and subject to scheduling policy, priorities and additional mechanics such as isolcpus and nohz. GP's comment is just plain wrong.


Maybe he should get Stephen Colbert's second-by-second day planner.


woah, do you have a link on that? I use plan [1] which is great for minute level planning but also annoying in various ways, if there's other software that can do similar, would love to try it

(i'm guessing that this was purely a joke and no such thing exists)

[1] https://help.supermemo.org/wiki/Plan


It was from a skit on the Colbert Report (can't remember the episode). He talks about how NASA added a leap second and then pulled out this comical "second-by-second" year planner and said his plans are ruined because he doesn't know what to do with the extra second. Wish I could remember it.


I definitely would like to find this episode. Not coming up on Google. Do you have any idea when it was broadcast? Thank you so much!


It's dumb you got downvoted for telling a joke, geez


Telling jokes on HN is risky. HN readers don't want the site full of jokes, so they savagely downvote them unless almost everyone finds them hilarious.


> geez

This is a euphemism. Please don't use it.


Why not, out of curiosity?


It's a euphemism for J-sus.

“In this is love, not that we loved God but that he loved us and sent his Son to be the expiation for our sins.” (1 Jn. 4:10)


Dissent, because for those of us who have never heard of such a connection (myself included) it is /not/ a euphemism but rather a safe epithet whose use never got anyone accused of "foul language" nor got anyone threatened with "mouth washed out with soap".


. o O ( The minor peril of disabling my autocorrect is that while my words may be intelligible, and both non sequiturs and wholly altered meanings may be avoided…spellings may occasionally falter. )


This is a good story regardless, but if you do want to derive some morals from the experience:

– Seemingly simple tasks can be more complex than you expect (“add a leap second on this Wednesday”)

– Real world systems can be more complex than you expect (“bunch of software I never even knew existed”)

– Planning and testing can make a big difference vs. just winging it (“a bunch of our Linux servers had kernel panics for some reason”)

– Success can be a non-event that goes unnoticed (”everything worked and no money went missing”)

– Sometimes the best solution is not a technical solution (“halt the markets for 15 minutes before/after”)


>Sometimes the best solution is not a technical solution (“halt the markets for 15 minutes before/after”)

We've had an election recently, right on the day when DST changed. On the night of counting of the votes, the clock went 2:59 AM -> 2:00 AM.

To save themselves trouble the Statistics Office instructed all vote counters that under no circumstances are they to enter or update anything in any system during the repeating hour until it's 3:00 AM the second time…


The interesting thing about DST is that it's not really repeating the hour, if you include the time zone offset in your time stamp.

Here, look. Using the time zone for Norway in this example, with the `date` command on macOS.

First the last second before DST ended in Norway this year.

    TZ=Europe/Oslo date -Iseconds -jf %s 1667091599

    2022-10-30T02:59:59+02:00
Then the second after.

    TZ=Europe/Oslo date -Iseconds -jf %s 1667091600

    2022-10-30T02:00:00+01:00
So while people say that time went from 02:59:59 to 02:00:00, I see it as time going from 02:59:59+02:00 to 02:00:00+01:00 :)
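(Same demonstration in Python, for what it's worth; this assumes Python 3.9+ with the zoneinfo module and tz data available:)

    from datetime import datetime
    from zoneinfo import ZoneInfo

    oslo = ZoneInfo("Europe/Oslo")
    for ts in (1667091599, 1667091600):
        # The two epoch seconds straddling the Norwegian DST change; only the
        # UTC offset distinguishes the "repeated" hour.
        print(datetime.fromtimestamp(ts, tz=oslo).isoformat())
    # 2022-10-30T02:59:59+02:00
    # 2022-10-30T02:00:00+01:00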


Yes, but not every logger, db, etc is going to include the timezone. What they did was the right call for the situation.


Perhaps another moral to add: ISO8601 solves a lot of problems.


This is a misconception. ISO8601 has six versions, is 14 pages long, and defines way too many ways to represent dates and times.

Everyone should follow RFC 3339.

https://www.rfc-editor.org/rfc/rfc3339#section-5.6
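(For the unfamiliar: the RFC 3339 profile is essentially just the single `YYYY-MM-DDTHH:MM:SS±HH:MM` shape, with optional fractional seconds and `Z` allowed for UTC. In Python, for example, an offset-aware datetime's `isoformat()` already produces a valid one:)

    from datetime import datetime, timezone

    # An offset-aware UTC datetime, serialized as an RFC 3339 timestamp.
    now = datetime.now(timezone.utc)
    print(now.isoformat(timespec="seconds"))  # e.g. 2022-11-18T14:03:07+00:00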


Thanks! Wasn't aware of that spec.


But they‘re mostly trivial variations of each other, or durations that nobody uses at all.


That's the problem though. If you don't support the durations that nobody uses, then you can't fully support ISO8601 dates.

Every trivial variation is another if/switch statement.


Durations are not used in places that need a timestamp, and vice versa. More applications need timestamps than need durations.


The problem is that nobody does it, for example no DB (AFAIK) does it, they all store dates as something like a Unix timestamp.

And for a lot of things problems would be solved with +01:00 offset while for others you actually want Europe/Berlin offsets


Europe/Berlin is NOT an offset, it's the zone. A proper date needs BOTH the offset AND the zone! How come people don't understand time zones when it's right there, in the etymology of the word!?

Sorry not personal but I find myself explaining that each time the subject comes up at work (and generally to the same people... sigh)


Postgres supports timezones. In general Postgres is excellent for correctness at the cost of performance.


A Unix timestamp is unaffected by DST.


Postgres and Oracle have TIMESTAMP WITH TIME ZONE. MSSQL has DateTimeOffset.


Depends on implementation. Agree that epoch makes the most sense.


> Sometimes the best solution is not a technical solution

I once came across an early 1950s Scientific American article by Bertrand Russell, IIRC. It included a cartoon.

Frame one: Computer beats man at chess.

Frame two: Man unplugs computer.


How about this? Frame one: man beats woman at chess. Frame two: woman shoots man with automatic pistol. Not sure how deep the original really is if you think about it...


The problem is you are trying to make something "deep" when it wasn't trying to be


IIRC, Russell's thesis was something to the effect of the ultimate supremacy of man over machine. I associate "Sept 1952" with the issue. The last frame suggested a certain obviousness and nonchalance in the man unplugging the computer, as if no great debate would be involved. I wonder if the article itself might now read as too innocent for its day, aside from the prediction that computers would eventually beat humans at chess. Too much innocence regarding technological imperatives and the technosphere?


What?! That's not comparable at all. The computer isn't murdered, it's just off. Also even if you did destroy the computer, that's not murder.


It's not comparable because computers are things but women are not.


My intention was for the woman to shoot the man; it's possible I wrote it wrong. They could both be men or aliens or even computers/robots with human-like intelligence. If they were both thinking computers, the "pistol" could really be a human paid in Monero to go smash the other computer's hard drives and backups.


– Success can be a non-event that goes unnoticed (”everything worked and no money went missing”)

And yet, there are still Y2K deniers (to be fair some people have exaggerated it to the point that they're promoting it as the end of the world).


> Sometimes the best solution is not a technical solution (“halt the markets for 15 minutes before/after”)

I'm a little confused. How does this solve the problem? If you don't code for the second, you'll still be off if you wait. Am I missing something?


The code to keep your clock in sync is "easy".

They had done it all the previous leap second during the weekend.

The hard part that required a lot of work was making sure that nothing breaks when it happens.

In this case they chose to have a mini weekend in the middle of the week for convenience


Sometimes it pays to be the most-prepared among your cohort. In this case, it would have paid so well that your cohort decided to work around it.

It always pays to not be the least-prepared among your cohort. You'll get no sympathy if you're at the back of the pack, you'll just die.


Another moral of the story could be that sometimes it's best to have a people solution to a technical problem.


Sometimes yes, the best answer to "doctor, it hurts when I do this" is actually "then don't do that".


There must have been discussions earlier about the market freeze. Finding/starting those would have been the correct approach, with a technical solution as a backup.


You got paid to dig extremely deeply into a very complex and important problem spanning multiple systems and domains. You developed a plan, tested it and were ready to act.

This is a hugely valuable learning experience few people even get a chance at, let alone solve. The only downside is that it doesn't show up on your resume!


Résumé, no.

Interview discussion? If you're any good at interviewing, it should.


The moral is we get to hear your cool war story. Thanks for sharing!

...okay yeah that's not a moral, but still.


Great, here's another.

$work had thousands of full-custom, DSP-heavy location measurement hardware devices widely deployed in the field for UTDOA location of cell phones. They used GPS as a time reference -- if you know your location, you can get GPS time accurate to around tens of nanoseconds. GPS also broadcasts a periodic almanac which includes leap second offsets: if you applied the offset to GPS time, you could derive UTC. Anyway, there were three models of these units, each with an off-the-shelf GPS chip from one of three location vendors you've probably heard of. The chip firmware was responsible for handling leaps.

One day, a leap second arrived from the heavens. We learned the three vendors all exhibited different behaviors! Some chips handled the leap fine. Some ignored it. Some just crashed, chip offline, no bueno, adios. And some went into a state that gave wildly wrong answers. After a flurry of log pulling, debugging, console cabling, and truck rolls, we had a procedure to identify units in bad states and reset them without too many getting bricked.

It seems the less likely an event is to occur, the less likely your vendor put work into handling it.


> It seems the less likely an event is to occur, the less likely your vendor put work into handling it.

This recalls perhaps the biggest mistake in the GPS specification, the 1024-week rollover period. A timespan long enough to be impractical to test without expensive simulator hardware, short enough to be virtually guaranteed to cause problems in long-term installations... and long enough for OEMs to ignore with impunity. ("Eh, it's 20 years, by that time I'll be retired/dead/working somewhere else.")

Moral: timescale rollovers need to be designed to happen very frequently -- as in a few days at most -- or not at all. Unfortunately the leap second implementers didn't have that option.
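(For reference, the wrap dates are easy to back out; a small sketch using only the documented GPS epoch of 1980-01-06 and the 10-bit week counter:)

    from datetime import date, timedelta

    GPS_EPOCH = date(1980, 1, 6)      # week 0 of the GPS week counter
    ROLLOVER = timedelta(weeks=1024)  # 10-bit counter wraps every 1024 weeks

    for n in (1, 2, 3):
        print("rollover", n, ":", GPS_EPOCH + n * ROLLOVER)
    # rollover 1 : 1999-08-22
    # rollover 2 : 2019-04-07
    # rollover 3 : 2038-11-21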


Somebody on LinkedIn working in data science opined that they should do away with DST. I commented that yeah maybe they ought to, and then bring it back in 5 years, rinse / lather / repeat as a stress test. Got a number of likes.


... what if we switched to leap milliseconds?


The moral of the story is that laziness is a virtue. Think of all the time that could have been saved, had you had no plans like your partners ;)


Think of all the time that could be better spent than on fintech in general. Seems like such a waste of resources siccing a bunch of computers against each other in a zero-sum game of stock arbitrage. I will admit some of the tech that comes out of it is cool on its own, at least.


Yeah, they could have been working on something truly valuable like violating people’s privacy with ad-tech. Or maybe sucking millions of hours of people’s lives away with TikTok algo improvements. Maybe they could be working on the next MoviePass!

> zero sum game of stock arbitrage.

By your definition insurance is zero sum as well. But people find that generally useful. Taking risk off of peoples hands has value even if a widget doesn’t come out the other end.


> Yeah, they could have been working on something truly valuable like....

But why not just a reasonable product? Yes, what could that possibly be; as if there couldn't be something that is fairly priced, that people really want or need, something that just helps. Not today, not anymore!

> By your definition insurance

I know what you're trying there, and on a very abstract level you may be slightly right, but PLEASE no, high-level gambling (== milliseconds & millions) is not at all comparable to the insurance model in many regards, foremost perhaps in its purpose for the community.

(And it's pretty clear that fintech here sure doesn't mean the classical banks and the modern "normal" payment system...)


> high level gambling

This is a weasel phrase. Insurance is high-level gambling as well. They both use risk models to decide at what price they will sit on the opposite side of a trade.

Decide where you really draw the line; if it's time horizons, that's pretty arbitrary and stupid. Insurers use re-insurance quite quickly to de-risk their own books, so pretending that trading within a minute is fundamentally different from daily/hourly trading is pretty silly.

Also, most “fintech” is not the “be the fastest to arb 2 exchanges” variety. It’s usually “be a market maker for products that you can relatively quickly price based on proprietary signals”. If speed is your only advantage, the cat-and-mouse game will wake you up one day getting beat to the order book on every trade.

> and its pretty clear that fintech here sure doesn't mean the classical bank and modern "normal" payment system...

Not sure where you got that from. All of this is interconnected. The “classical banks” are all participating in deep fast moving bond markets. JPM doesn’t send their orders to some old timey trader with a cigar and a bowler hat via telephone. They use fintech like the rest of the industry.


You didn't want to understand me. Sure, both use risk models, but that alone still doesn't make them comparable. And sure, I know that everything is interconnected, but it is clear that when someone complains about fintech (and the counterargument immediately goes "oh, you say fintech, but what about .."), he doesn't mean the necessary stuff like payments processing or normal markets. Anyway, fine to disagree here ;)


No, I’m saying your hand-wavy “fintech is bad” doesn’t actually point to any thing in particular. Most of HFT is not “do the same thing as the other players, but faster”.


Most HFTs primarily engage in market making (matching buyers and sellers) which absolutely is a very useful and necessary function to society unless your position is that markets should not exist.

https://en.m.wikipedia.org/wiki/Market_maker


You could hold the position that markets are good but not everything should have a market.

For example you can forbid futures and derivatives without forbidding good old stock trading.


I'm against pure speculation, but forbidding futures outright would feel ... wrong. Locking prices for contracts up front is a useful thing.

However, it might be much more interesting, and societally more beneficial, to require that anyone trading in commodities futures must have, at all times, the facility to take delivery of the contracts. The upper limit of exposure for a trading desk would therefore be bound by the capacity of their physical infrastructure.


That would lead to wider spreads. Then, liquidity would move to another country that did not impose that rule.


Futures are massively useful to both halves in commodities markets (producers and consumers). Why would you want to ban them?


Eh. I see the downsides of a lot of that HFT stuff, but there are upsides too. Yes, a lot of it is zero sum, but not all of it is. Lowering spreads between currencies, say, does materially help non-finance actors. There are other areas of the stock market that are useful too. Ingesting financials and other non-manipulated data to better reflect a company's true worth at any time helps, for example, employee option holders who seek fair remuneration for their labour.


Market makers are always talking about "lowering the spread" being this great thing they're doing to make the world a better place

Yet when I go somewhere with no liquidity and a huge spread like a crypto exchange, a deeply unpopular corner of the stock derivatives market, or a Craigslist used stuff category, the wide spread is just a mild inconvenience at worst. You get to choose between waiting for a better deal and taking a worse deal immediately

A tight spread just means that somebody is getting rich by taking that choice away from me and everyone else on the exchange. It's not the most nefarious thing in the world, but it's not particularly helpful or altruistic either.


"Yet when I go somewhere with no liquidity and a huge spread.."

Someone who wants to buy or sell goes to a market in order to execute at the best price achievable, and they may be under time pressure.

If the spreads are wider at one place than another, participants will gravitate to the place with the narrower spreads. If there is better liquidity at one place than another, activity will move to that place.

The purpose of being at the market is that you want to trade.

These qualities that you dismiss, "a mild inconvenience at worst", are the essence and measure of a market's effectiveness.

"A tight spread just means that somebody is getting rich by taking that choice away from me and everyone else on the exchange."

No, it is the opposite of that. It is when spreads are wide that there is easy opportunity for getting rich. Consider: it is more lucrative to buy from one person at 10 and sell to another at 20 than to buy from one at 15.01 and sell to another at 15.02.

"You get to choose between waiting for a better deal and taking a worse deal immediately"

That is not the choice. On a good market you get both a competitive price, and you get to deal immediately.

An order that you rest on the book is called a limit order. You can do limit orders on any market, even those where market makers operate. Creating a market that offers limit orders is easy. The more challenging problem is to create a market where people can come and place market orders that get filled immediately

Four years ago, I wanted to sell stock to close a deal to buy a house. I didn't want to sit around for hours or days or weeks or forever tweaking limit orders, hoping the market would move in my direction and that the house would stay on the market. I went to market to execute, got my money, and put down the deposit.

If a company goes to a market to offset risk, they typically want the convenience of getting the deal done immediately. If they have to wait weeks to close the deal, the risk might already have passed by the time they could get the deal done. Liquid markets with tight spreads get rid of the workflow and loss of time inherent to haggling whilst giving you justified confidence that you are getting a competitive price.


Unless I'm missing something, a spread between 10 and 20 is not a place where you can 'buy for 10 and sell for 20'.

Otherwise these 2 guys could fill their orders immediately.

10 20 means you can sell for 10 and buy for 20 instead.


My error, thanks for the correction.


I feel like you basically rephrased what I said, but added a little terminology around it and said it's a good thing. Market orders taking 1 second to fill instead of 80 milliseconds isn't a meaningful contribution to society.

Imagine a world with no market makers. There would still be plenty of buyers and sellers of SPY shares at any given time to keep a tight spread and fast execution, but people placing market orders would get slightly worse execution and people who can accept the risk of placing a limit order and waiting two seconds would get slightly better execution. The ONLY real difference is that there wouldn't be some market vampires magically extracting tiny bits of profit all day. Your Facebook shares would still move just fine even if nobody front-runs your order and takes a few cents from you.

I get it, market making is profitable and the victims are distributed widely enough/offset enough by trivial benefits that it's not a particularly bad thing to do. But they aren't making ANY positive contribution to society AT ALL by squatting on the exchange and intercepting all the market orders at a profit.

(Full disclosure: this whole argument obviously goes a different direction when you start talking about derivatives, since market makers are mostly the ones who create and offset the derivatives. I could believe an argument that they're making the world a better place by making it quick and cheap to mitigate financial risk)


What is your goal? Is it better markets, or getting rid of market makers?

You seem fixated on the latter, to the extent that you will define away the quality measures of a market to get there.

The vampire/victim labelling you use is unreasonable. Market makers rest liquidity on the book and then other participants choose to interact with them. Both participants have chosen to enter the deal.

You propose a market without market makers. If you set up such a book, I expect you would find that nobody would want to trade there because they will get better prices and more liquidity elsewhere. If it was a good model, then all the exchanges would be doing it.

You misuse the term front-running here. Front running is when a broker has an order from a customer, and places orders on their own behalf before processing the customer order. In doing so they would put their own interest ahead of the customer's. Rules about front running are a form of consumer protection. On-exchange market makers do not have customers, so the concept of front running is not relevant there.


My goal is for market makers to quit publicly jerking themselves off over what great and noble people they are, since they're just useless scavengers.


Rephrasing the same thing but with a little more terminology is what market makers do. :)


Fintech covers the entire payments space too


Guess I'm thinking more of HFT. Normal payment processing isn't this affected by a leap second though as far as I know.


Maybe they could be affected and needed plans to avoid the impact as well, but unlike stock markets you can’t say pause payments globally for half an hour just to get through the leap second.


Anything built on the adversarial model is zero-sum - courtrooms, parliaments. But the systems have utility.


> Seems like such a waste of resources siccing a bunch of computers against each other in a zero sum game of stock arbitrage

Despite the useful service of price discovery (there are so many better ways), it is clear from the EMH that those computers are not doing arbitrage, they are front running trades.

Illegal. Criminal in USA (I think). But makes billions and billions for the already very rich. So that is why there is so much of it


Someone has to make the EMH hold.


I think it's the other way around. They had a problem which previously only impacted weekends, so it was written off entirely without consideration of whether this was happening by rule or by convention. They knew it would be a concern on any other day and yet did nothing until the day it was announced.

Idle curiosities can lead to their own waste, but the kernel panic was probably worth digging into earlier.


>I'm not sure what the moral is here, if there is one.

Apparently, it's about as useful as the leap second itself ;)

I feel your pain though, as I've spent weeks on something only for it to be tossed away like it was nothing at the last second. I guess that's how Google devs feel when their projects are deprecated. At least theirs saw the light of day and provided some validation


Here, have a bright shiny imaginary internet point. It doesn't nearly do justice but thanks in any case for sharing your story.


A bit of a tangent but I have observed that whenever a networking record for bandwidth is broken it is typically by a nonprofit such as a university, but whenever a networking record for latency is broken it is more often than not by someone in the "fintech" industry developing a faster bag-passing mechanism.

It is clear to me that the disparity of latency creates islands of privilege. I mentioned this to someone in the industry once and they replied that what the layman perceives as parasitic middlemen actually provide valuable liquidity. When I asked whether they considered ticket-scalpers to likewise provide liquidity they claimed that was not at all the same thing.


> I'm not sure what the moral is here

I think the moral is that it'd be a lot easier if we could just stop messing with the clocks, or at least push more technical things towards only caring about a closest-to-a-global-high-precision-monotonic-clock-as-relativity-allows rather than worrying about what the clocks on the walls say, which is more a personal matter of how much you care or don't where the sun is in the sky at 12:00:00.000.


I was gonna say, why not just close all positions and turn off the computers around the leap second? How much are you realistically gonna lose by missing a few minutes of trading, compared to the alternative risk?

Edit: I guess the other way to look at it is how much you can make in a few minutes of trading, seeing that it was worth putting at least one software engineer on it for a long time despite the risks...


It can be extremely costly to close positions (often from a tax perspective this is a big no-no in some cases too).


> I'm not sure what the moral is here, if there is one.

As the CIA director in Burn After Reading says, "I guess we learned not to do it again."


It is a uniquely crummy feeling to have your work go unused like that, but you shouldn’t let it discourage you. You reached a level of mastery on this particular thing that few people have, which is evidenced by the fact that no one else in the trading community was able to reach your company’s level of confidence and they decided to wait out the leap second instead.


> no one else in the trading community was able to reach your company’s level of confidence

So his work contributed to community wisdom, and that influential community has probably had some say in cancelling leap seconds. I wouldn't call his work wasted. I would call that notably few degrees-of-separation in making an observable difference.


> I'm not sure what the moral is here, if there is one.

Always procrastinate :-)


Moral of the story is that sometimes social engineering is much cheaper and more effective than software engineering!


Yes, I agree, and I do think this is another example of worse is better. The complex but correct solution is the hard work the OP did. But the simple and better solution is to simply halt the markets.


Contingency plans have their own contingency plans. Maybe trading companies started talks to stop the market months before your company assigned that task to you, in case of no agreement or a negative one.


It's the software engineering equivalent of the crypto nerd with the super-strong encryption, beat by a five-dollar wrench attack[0]. ;-)

Sometimes the best (for some definition of best) solution to a problem is to side-step it entirely.

[0]: https://xkcd.com/538/


The moral here is that you and people in similar positions convinced everyone else that there was too much risk to go forward, either by direct or indirect action and implication. Sometimes, just seeing what your own team needs to feel safe, and seeing what everyone else is or isn't doing on the same front, is enough to make the call one way or the other.


We just enabled leap second smearing on chrony.
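(For anyone curious, a minimal chrony.conf sketch of that kind of setup; the exact values are illustrative and assume a reasonably recent chronyd acting as a server for clients that should receive smeared time:)

    leapsectz right/UTC            # learn leap seconds from the tz database
    leapsecmode slew               # slew the local clock instead of stepping it
    smoothtime 400 0.001 leaponly  # serve smeared time to NTP clients over the leap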


I think that's the only reasonable way to handle this kind of thing, though I bet that accurate time matters enough in fintech that you'd still have some cases where you'd need access to the "true" wall time in order to stamp logs for auditing or whatever.


Has anyone ever heard about audits concerned with subsecond timestamp resolution?


Yes. I've been involved in a few. Well, the resolution and accuracy were part of the audit scope, certainly.

Quite common in real-time sports betting world.


Thanks for the chime in. I don't work in fin or even adjacent to it, but in robotics we often have to correlate logs between multiple systems to fully understand a failure and in a lot of those situations milliseconds do matter— when did the sensor reading come in, how quickly did we understand it and alert the other unit, how does this line up with the timestamps on a security cam video we don't control, etc etc.


The interesting thing is that for the less careful, the 15-min before/after halt may have been not enough. You knew enough not to use a time smearing NTP server but others that didn’t obsess as you did might have been off by a fraction of a second for the entire 24 hour period leading up to it.


The moral is that sometimes we humans can choose not to let the perfect be the enemy of the good (enough).


What if the whole trading world had not agreed? No one could have known that in advance.


A bit like buying car insurance and not claiming. Still possibly worth it.


Don't conclude it's that hard for everyone until you've spoken to a good subset of different people.

You're conscientious and willing to dig in to the details to fix a problem. Plenty of people aren't, and plenty of those are doing the same job as you. Look up from your own little world and try to figure out what other people are doing, how they're doing it, and why. This applies generally: If you fixate on a specific language or toolkit, you'll miss others which fix or obviate bugs you were resigned to living with. Same with OSes and environments. It even applies to relationships, which is why a big hallmark of abuse is isolating the victim.


I worked in algo trading at a major bank in Japan. Japan's time zone is UTC+9. Markets open at 9am. A leap second brought down our trading right at the open.


Moral: “if there is a lot of work involved solving a technical problem, there’s probably a lot less work involved solving it non-technically”


Moral of the story is that insurance is expensive


Wow. You should write a post about it that goes deep into what you had to do to make it work.


>I spent a month in the lab

Do you mean at your desk? What is a lab in a fintech context?


For most of us that are doing implementation engineering, a lab is simply a collection of the gear that can be put together in a simulation of the production environment without being constrained by formalities. For me it would be a bunch of network and server kit and cables in a rack.


Not OP, but at several jobs our labs were small server rooms stuffed with network gear, servers, and client PCs. They were used for end-to-end simulations and tests. It wasn't uncommon to actually do work in the lab, keeping an eye on the blinky lights or somesuch.


I think the moral is, “those who fail to prepare are preparing to fail”.


The moral of the story is if everyone is slacking then you can as well


The moral is it’s a waste of time either way


Couldn't your "fintech" company decide to halt trading 15-min before/after on its own without the agreement of the trading world?


You did a good job.


Currently software has to be built to accommodate leap seconds. They happen frequently enough that you'll find out within a few years whether your software breaks when time suddenly skips forward or backward.

If we kick the can down the road such that eventually we'll need to add a leap minute, we're going to end up with software that was never written to expect time to change in such a way, hasn't had a real world test of the change for decades, and will have no one working on the software who ever had to deal with such a change.

It's going to be much worse for software reliability to have a leap minute on the order of once a century than a leap second every few years.


Why would we ever need a leap minute? Just let time zones drift. Unless you happen to be at the exact longitude for which your time zone is correct, the sun's not at its highest point in the sky for you at exactly noon anyway.

Eventually, thousands of years from now when time has drifted by an hour or more (assuming modern technological civilization even still exists by then), each jurisdiction can just change their time zone's offset from UTC, without coordinating with anyone else. Jurisdictions making time zone changes is a well-understood situation that we already know how to deal with.


The last part is right and is also why we have this problem.

Leap seconds are an architectural blunder that always belonged in the abstraction layer that lines up the sun with the rotation of earth (the time zone abstraction). It never belonged in the part that counts seconds.


That's a great way of putting it!


This is definitely the most interesting point in this whole comment chain.


You're right, but I'd argue this problem is already here.

Thanks to glaciers melting, Earth's rotation is (temporarily) accelerating. Because of that, positive leap seconds, which were regular before, haven't happened since 2017, so there could very well be (recent) software out there with that code path broken that nobody has noticed yet.

And due to the exact same geophysical effect we might see a negative leap second, something that has never happened before. What are the odds that every single piece of software gets that one right?


Interesting point. Just like a skater pulling their arms in, snow melting and running to lower ground would make the earth spin faster (and let's add in the extra erosion). Sea levels are rising though (due to both thermal expansion and runoff).

Be interesting to estimate the size of the various effects (no doubt I've missed plenty of others) but is it really true that a change in sign of the acceleration of Earth's angular velocity is down to climate change?


I don't think that's right. I would think the effect of glaciers melting on mountains would be negligible, and the effect of the poles melting and sending that water away from the poles would slow down rotation, like an ice skater pushing their arms out.


All the ice isn't /exactly/ at the poles, but well down the side as well. Also, ice is less dense than water.


The rate of drift is so slow that people will never care.

Even over thousands of years when an hour of drift is accumulated there won’t be a manual adjustment - people will have just gotten used to different times of day having sunlight, with generations having been born and died with mean solar time happening at 11am.

Eventually the rotation of the earth may change enough that drift accumulates too quickly and leap time needs to be added, but that’s only going to be true thousands to tens of thousands of years in the future.


I've always thought that the best way would just have leap seconds every six months, and have it be ± 1 second no matter what - so sometimes you'd go back a second twice in a row, and then go forward a second to get back to correct.

That'd test all the software paths.


> That'd test all the software paths.

The daylight savings time bugs I've run into at pretty much every company I've ever worked at would beg to differ.


See also leap year bugs even though that concept has been around even longer than DST.


My favorite is comparing aggregated data across several time zones when DST changes happen.


Shouldn't you just aggregate based on UTC? Leap seconds still cause a problem here but daylight savings shouldn't matter.


People expect e.g. their "daily power consumption" to end at midnight local time. When they zoom in on the daily aggregates, they don't want a spike to suddenly disappear just because it belonged to the next local day and you only stored UTC-day aggregates. Even more so for billing.
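(A rough sketch of what bucketing by local calendar day looks like, assuming the readings are stored as epoch seconds; the hypothetical daily_totals helper below then naturally produces 23- and 25-hour days around DST changes:)

    from collections import defaultdict
    from datetime import datetime
    from zoneinfo import ZoneInfo

    def daily_totals(readings, tz_name="Europe/Oslo"):
        # readings: iterable of (epoch_seconds, value) pairs.
        tz = ZoneInfo(tz_name)
        totals = defaultdict(float)
        for ts, value in readings:
            totals[datetime.fromtimestamp(ts, tz=tz).date()] += value
        return dict(totals)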


The problem is that if the originating system didn't record the time in UTC, you've lost the information needed to make the time unambiguous. What's 2022-11-06 01:30 USA/San Francisco in UTC?


How does that solve comparing days with 23 to 25 hours of data when aggregated by hour? Or are you suggesting using days in UTC?


Making time less accurate just for the sake of testing in production doesn't seem great.


From 2035 it's going to be less accurate in exactly the same way. I'd argue bombcar's method is more robust.

I don't know of anything that actually requires UTC to be within 1 second of average rotation-based time; having it within 2 or 3 seconds is extremely unlikely to actually break anything. But we do generally want to have measured time roughly in line with Earth time in the medium to long term.


Today it is within I think +-3 hours of earth time (China being the worst offender), because of timezones and DST. If it ever gets worse than half an hour in a country, they can always change the timezone... or just, like, change businesses' opening hours and stuff.


We have summer and winter time because that turns out to be easier than changing business hours. But I was just talking about UTC really.


How about clocks on Mars? Should these have Earth leap seconds? Time synchronization is already hard enough; complicating it with adjustments based on uneven planetary body movements doesn't make much sense in the long term.


In the longer term Mars should probably have its own time scale. When we get to further colonization maybe we can use TAI for interplanetary use (but time dilation due to relativity effects is going to be a problem when going to other stars) while still using UTC on Earth.


Better make it every other day rather than every six months.


Henceforth every minute will be "bell curve distribution around 60" seconds long, and each hour will be "bell curve distribution around 60" minutes long.

This variability will provide excellent fuzzing of all time libraries everywhere.


It would test all the software paths in production though. Bad software would still fail and need last minute patches every 6 months.


Nobody really cares about clocks being celestially "off" by a minute either.

So this isn't a once-a-century thing, it's an add-a-leap-15-minutes-once-a-millennium issue.


https://royalsocietypublishing.org/doi/10.1098/rspa.2016.040...

Odd. I was wondering how fast it actually was long term, and this rate from historical record seems much lower. They cite 1.8ms/century if I'm reading it correctly with some odd cyclical thing going on. " the change in the length of the mean solar day (lod) increases at an average rate of +1.8 ms per century. "

I mean, we've added 22 seconds over 50 years. Although at the current rate it would still just be 7 minutes after a millennium :)

Edit: You know, never mind, that's all covered on Wikipedia. https://en.wikipedia.org/wiki/Leap_second#Slowing_rotation_o...


The rate accelerates long term, but yeah.


> Currently software has to be built to accommodate leap seconds

Plenty is not.

The Swift time type ignores them in its implementation. I filed an issue, and they said there were no plans to implement them.

Good choice it turns out.

Who would have thought that adding a second on random new year changeovers would be worse than letting clocks drift.

Me for one.

We can have a "leap hour" in a thousand years. Till then, do we care if the clocks and the sun drift very slowly apart from each other? I do not.


In an enterprise environment, every leap second or TZ change invokes a ceremonious updating of anything using Java or a JVM. I suspect a lot of people would rather perform a once a century update than the bi-annual process they do now.


The thing is that there will be entire Java-like ecosystems that will have shorter longevity than the time between leaps. So practically all of them will have no facility for a leap at all.


Did we just create a Cobolesque Y2K situation for Java?

I can just imagine the Graybeards of the future rushing ahead of the leap minute to update the JVM lest the world go up in flames yet again.


Indeed; the converse would be to add (or remove) micro-leapseconds in order to keep accurate time. That way it would happen so often that the code would get tested and we'd have working implementations (maybe we could fix the traditional Apple issues with leap years at the same time).


I’m pretty sure we would get bugs whenever the accumulated micro-leapseconds would reach a full second.


Mmm...true, some programmers even have problems with gettimeofday().

The point still stands though - common problems have tested code, uncommon/rare problems rarely have battle-hardened solutions.


It's going to be centuries until the difference between astronomical midnight and UTC midnight is more than an hour, which is the exact same amount of error we deliberately inflict on ourselves to have more daylight in summer evenings, and half the error that residents of Spain experience as a consequence of the political decision to join the same time zone as Berlin. If human civilization exists long enough for this to be an actual problem, we're probably going to have to figure out how to coordinate time with multiple space settlements, which is going to be a much harder problem thanks to time dilation.


Why bother? A calendar year is 365.242 days long... time is already off by about 3/4 of a day the year before a leap year, what does adding a second here or there really do?


Time is not off by about 3/4 of a day, the calendar is. And hardly anyone cares about the planet's position relative to the Sun (on the orbit) — I mean, they can, if it is important for their occult rituals or something, but they probably learned to work with it over the centuries already.

Earth's rotation relative to the Sun is a whole other deal.


Well actually people did, or we'd still be on the Julian calendar.


clocks get used for two distinct purposes, often at odds:

- the measurement of durations.

- the presentation of some timestamp in a way that the reader has some intuition for.

that first purpose won’t be hurt by not tracking leap seconds. actually, a lot of applications will probably more accurately measure durations by eliminating leap seconds.

if leap seconds (or minutes) really are of critical importance, we’ll reintroduce them to the presentation layer. the thing is, very few people can tell the difference between 12:01 and 12:02 without being told the “real” time. so if you’re presenting a time which is “off” by a minute because there’s no leap seconds… does it really matter?


There should really be three "layers" of time indirection.

1) Seconds since 00:00:00 UTC 1.1.1970. This value increases by 1 each atomic second and never jumps forward or back. Call this Universal Monotonic Time.

2) The difference between when the sun is at its zenith at Greenwich and 12:00 UMT. Call this the Astronomic Drift.

3) The timezone - offset from Greenwich that makes the local clock sync up with astronomic time and also contains a DST offset if that location observes DST at that date.

By adding up 1) + 2) + 3) you end up with the "human time" at a given location at a given date.

A computer system should only ever store 1). Then, it can calculate human time when displaying it to humans.

I'm also a fan of having "local sun time", which would be the time according to the position of the sun in the sky, quantised to 15-minute slices (basically micro-timezones). It would be nice if office hours, school times, &c could be defined based on that, i.e. work starts at 9am local sun time, which would sync up better with people's biological clocks and cut down on the yearly stress DST causes.
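(A toy sketch of that layering, just to make the arithmetic concrete; the drift and zone numbers below are entirely made up, and "UMT" here is simply an idealized leap-free epoch counter:)

    from datetime import datetime, timedelta, timezone

    UMT_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

    def human_time(umt_seconds, astronomic_drift_s, zone_offset_s):
        # Proposed scheme: human time = UMT + astronomic drift + zone offset.
        return UMT_EPOCH + timedelta(seconds=umt_seconds + astronomic_drift_s + zone_offset_s)

    # Made-up numbers, purely to show the layering:
    print(human_time(1_668_816_000, astronomic_drift_s=-0.2, zone_offset_s=3600))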


I agree with your separation between "duration" and "absolute point in time". But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time. You could get over this on your local machine with a local counter, but across network boundaries you need to rely on absolute differences.


> But it doesn't solve the issue, because durations are often computed as the difference between two absolute points in time.

other way around. e.g. unix time is determined by the number of seconds (as experienced by some point fixed to Earth, roughly) relative to some reference point. duration is the native format for most time systems, like UTC, because “absolute time” isn’t a thing that can be measured. faking the duration (e.g. adding leap seconds) within a time system is sort of nonsensical: we only do it to make translation across time systems (UTC, UT1, local/human timezones) simple. if that’s the justification for things like leap seconds, then better to keep the time system itself in its most native format (assuming that format is useful locally) and explicitly do those conversions only when translating, i.e. at the network boundary: when communicating with a system that doesn’t share/understand our time system(s).


I regret that the reality is such, but Unix timestamps are in sync with UTC, so they too have gaps and repeats (for positive and negative leap seconds).

The international standard for monotonic time is TAI, which never had leap seconds, but which is also used by almost no one


There has never been a negative leap second.

>If we kick the can down the road such that eventually we'll need to add a leap minute

The drift is not in a single direction. The total drift is not going to be significant for a very long time if ever.


The total drift is accelerating long term: the tidal deceleration of Earth's rotation due to the Moon is steady, so the accumulated offset grows quadratically.


It will be an urgent crisis in a century, a millennium...


> There has never been a negative leap second.

That doesn't stop the (e.g.) FreeBSD folks from running tests to make sure things are fine:

* https://lists.freebsd.org/pipermail/freebsd-stable/2020-Nove...

* https://docs.freebsd.org/en/articles/leap-seconds/

Of course there's a whole lot of other userland code besides ntpd.


Nah, you just smear time faster or slower so you never need to "leap".


Hot take…

Storing anything as UTC was a mistake and we should be using TAI for all storage and computation, only transforming into more human-friendly formats for display to end users. This never needed to be a problem, except that we decided to make TAI harder to use than UTC, and so everything got built on top of legacy BIOS-level, hardware-supported UTC-style clock behaviour, when we should have been using TAI from the start. Yes, I know it would have been harder, but we got off our collective asses and decided to fix our short-sighted decision making for Y2K date storage, so why not this? If it truly costs as much as claimed for everyone to endure a leap second, why wasn't it just fixed from the bottom up and rebuilt correctly?
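(A rough sketch of what that would mean in code: store TAI, keep a small table of TAI-UTC offsets, and only apply it when rendering for humans. The table below is truncated and purely illustrative; TAI - UTC has been 37 s since 2017.)

    from datetime import datetime, timedelta, timezone

    # Truncated table of (date the offset took effect, TAI - UTC in seconds).
    TAI_MINUS_UTC = [
        (datetime(2017, 1, 1, tzinfo=timezone.utc), 37),
        (datetime(2015, 7, 1, tzinfo=timezone.utc), 36),
        (datetime(2012, 7, 1, tzinfo=timezone.utc), 35),
    ]

    def tai_to_utc(tai):
        # tai: a datetime carrying a TAI reading (tzinfo used only as a label here).
        # Ignores the corner case of instants inside a leap second itself.
        for effective, offset in TAI_MINUS_UTC:
            candidate = tai - timedelta(seconds=offset)
            if candidate >= effective:
                return candidate
        raise ValueError("timestamp predates the table")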


I'm not sure I support your "hot take" — it is hot and requires a lot of contemplation. But that's not the point, IMO.

The point is that there ALREADY EXIST both TAI and UTC. TAI is truly monotonic (whatever that means in a relativistic universe) and doesn't make any compromises. UTC sacrifices a uniform second count in order to keep both the length of a second and the time's relationship to Earth's rotation. They both work. For whatever reason (for obvious reasons, that is, but it doesn't matter) UTC was chosen in virtually every software system to keep time.

So, ok, if there is a suspicion of leap seconds being unnecessary. How about moving from UTC time to TAI then? Let's keep UTC as it is, keep adding leap seconds and just make it a best practice to rely on TAI for all datetime operations and world clock synchronization? Maybe it will work out, maybe it won't, but at least you won't be breaking a perfectly working alternative (currently — mainstream) system.

The more I think about it, the more outrageously stupid abolishing leap seconds seems.


Eliminating leap seconds was only a half measure. They should have finished the job by adding rockets to the Earth to ensure that its rotation will stay at exactly 24 hours.

If they weren't going to do that, then why eliminate leap seconds? Kicking the problem down the road doesn't really solve the problem, it just makes it worse later.


> its rotation will stay at exactly 24 hours.

The Earth rotates around its axis in 23 hours and 56 minutes.

https://en.m.wikipedia.org/wiki/Sidereal_time


It rotates in 23 hours and 56 minutes relative to the fixed stars. It rotates in 24 hours relative to the Sun. It is Earth's rotation relative to the Sun that people care about in most situations, because day and night depend on that, not on the position relative to some far-away stars.


24 hours is an average over the duration of the Earth's orbit around the sun.


Seems like a problem to be solved with more rockets!


Surely it’s all the test firings of Merlin and Raptor engines that’s slowing the earth down in the first place? I mean first he screws up twitter, now it’s time itself.

(/s)


He is absolutely, singlehandedly destroying Twitter-- no sarcasm.


> For whatever reason (for obvious reasons that is, but doesn't matter) UTC was chosen in virtually any software system to keep time.

What was chosen really isn't UTC. Several UTC seconds in the past are not accurately representable in unixtime. Several unixtime seconds in the past are ambiguous as to which UTC second they are.

Unixtime is awfully close to UTC time, but it's not the same. If UTC time stops inserting leap seconds and never has negative leap seconds, then they will be equivalent going forward.


Not really. I mean, it's true that we should distinguish between the 2 and everything that you said about the difference is also true. But unixtime doesn't really "exist" in a sense UTC and TAI do. It is rather an imperfect implementation of UTC, that chooses to ignore (or repeat) some seconds.

You often hear that unixtime is the number of seconds that have passed since X. But that isn't really true. The number of seconds is the number of seconds; it isn't defined by our standards, it just exists. And TAI is a fair representation of how many seconds on Earth actually passed (on average) since whatever.

UTC kinda does it as well by virtue of 1 second being equal to 1 TAI second, but it actually counts (counted until yesterday) the number of Earth's rotations. It's just every rotation (represented by 24H) sometimes consists of more than 86 400 seconds.

Unixtime on each individual device counts nothing. It imperfectly represents the UTC timestamps received over NTP. Some distinct UTC seconds are represented by the same value. Of course, you can just put your device offline and call whatever number of seconds it counts since any moment "unixtime", but you know it will drift away from any meaningful "real" time soon enough.

(Also, it's not even entirely fair to say, as you did, that it is unixtime that was chosen by all the software. Many programs store datetimes as strings. Usually, they still don't support "23:59:60" anyway, but that doesn't really make them unixtime. Unixtime is a timestamp encoding.)

So, that's basically what I'm talking about: you can just make unixtime an implementation of TAI (as opposed to UTC). You can build a new calendar format for it, introduce a new name (not UTC!) for it and see how well it does when all the world slowly drifts from Earth's rotation to keep up with TAI. Maybe it actually is fine, I'm not the judge of it (because it really is complicated and I haven't decided yet if it's a good solution or not). But why the fuck would you destroy UTC for it?! It is the closest usable representation of UT1, which doesn't stop existing! Leave it be!


Consider that the most widely deployed time standard outside the UT framework is GPS time, which was initially synced with UTC but is de facto TAI-19, because it turns out that constant-rate monotonic time is useful and UTC fails at even the alleged astronomical use-case.


If you want to keep UTC matching the solar rotation, why do you specifically require leap seconds? Why not leap minutes? Or leap milliseconds? The choice of ±1 s as the acceptable error seems arbitrary.


It is worthwhile to look at the past; before leap seconds, the disagreement with UT1 was handled by changing the rate of UTC slightly, and that's probably even worse than leap seconds. And for a while the unit of adjustment was less than a second, varying from 0.05 to 0.2 seconds. I believe enough people complained about those subsecond adjustments back then, and leap seconds have survived only because not enough people complained at the time.


Because a minute is a huge amount of time (and, by the way, don't forget that minutes don't exist, I just understand that you mean 60 seconds) and there is no such thing as "leap milliseconds", because UTC is literally just counting seconds. Basically, UT1 already is an implementation of what you call "leap milliseconds". Not literally, but it achieves the same thing. And it is way too complicated to use in practice outside of astronomy-specific tasks.

So, to sum it up:

- TAI is a real thing, it has a concrete meaning, and it's "leap infinity".

- UT1 is a real thing, but is unusable in practice, and you could think of it as "leap ms"

- UTC until yesterday was a real thing, meaning a time whose seconds are equal to TAI seconds but which doesn't drift from UT1 by more than 0.9 s. Since today it's broken and I'm not sure what it even means anymore — I mean, not in practice, but "platonically".

- Nobody has yet introduced a standard that would mean "time with seconds equal to TAI seconds, but not drifting from UT1 by more than 59 s". I guess you could be the one to do it, but I'm not sure it would get wide adoption.
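
To make the differences concrete, here is a tiny sketch of what the three timescales would read at one instant. This is illustrative only: TAI-UTC has been exactly 37 s only since 2017, and the DUT1 value below is a made-up placeholder (real values come from the IERS bulletins).

    from datetime import datetime, timedelta, timezone

    TAI_MINUS_UTC = timedelta(seconds=37)   # constant since 2017-01-01
    DUT1 = timedelta(milliseconds=-17)      # hypothetical UT1-UTC value, not real data

    utc = datetime(2022, 11, 18, 12, 0, 0, tzinfo=timezone.utc)
    tai = utc + TAI_MINUS_UTC               # "leap infinity": never steered back to Earth's rotation
    ut1 = utc + DUT1                        # Earth-rotation time, within 0.9 s of UTC (for now)

    print(utc.isoformat(), tai.isoformat(), ut1.isoformat(), sep="\n")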


Why is UT1 unusable? Can we approximate it with something like a polynomial that gets updated periodically?


Pretty sure you are saying the same thing as the post you replied to.

They are recommending TAI for storage and computation, and are not against UTC for human consumption.

Although the easier hack: Abolish leap seconds from UTC!


I now realized that the parent meant everything should have been TAI from the beginning, which is indeed a valid take. We can't switch to TAI today only because we are already using UTC.

Original comment: Only those who have never actually tried to use TAI themselves can claim that you can use TAI instead of UTC without a problem.


Just have 37 leap seconds to bring UTC into sync with TAI, then lock them together.

That won't break anything...


Yes, that's what the CGPM eventually decided to do because they know UTC will have to stay.


As spacecraft start making return trips to Earth we'll run into similar adjustment problems because TAI doesn't progress at the same rate at different altitudes and accelerations, so there'll need to be a re-sync at some point. Satellites already have a similar problem but can just use direct synchronization with Earth since it's so close. In theory spacecraft could do the same direct synchronization to Earth time, but e.g. a computer on the dark side of the moon would need additional relay(s) to stay in sync.

I'm not sure why we don't define an intergalactic time standard and approximate that everywhere with NTP-like protocols; one monotonic clock at rest (with respect to CMBR) in free space. The second is weirdly defined/tracked in Earth's gravity well.


Right now there are no people outside Earth's gravity well for significant time.

As long as the Earth exists and you can communicate with people there, there's no practical reason not to use an earth based reference clock.


Most of the accurate timekeeping is needed by computers, not humans. We can always define local human time with an affine transform from a coordinated monotonic clock.


TAI is independent of the altitude.

The clocks that were used to build TAI (it's a coordinated average of dozens of atomic clocks around the world) became sufficiently accurate in the early 70s that the difference in the length of each second, depending on the altitude of the clock measuring it, became detectable. As a consequence it was decided that as of 1 January 1977 00:00:00, TAI would be corrected to correspond to what TAI would read if measured by clocks at the geoid (mean sea level), and as a result it has no relation to altitude or accelerations. There is also (because metrologists are like this sometimes) a continually published version of the uncorrected scale, named EAL (Échelle Atomique Libre, meaning Free Atomic Scale).

In addition to this, we have already designed and maintain time standards equivalent to TAI but for other reference points: Geocentric Coordinate Time (TCG, Temps-coordonnée géocentrique), which is roughly speaking TAI as kept by a clock co-moving with the Earth's centre but outside the gravitational influence of the Earth and Moon; and, for the entire solar system, Barycentric Coordinate Time (TCB, from the French Temps-coordonnée barycentrique), which is roughly speaking a TAI-style clock co-moving with the barycentre of the whole solar system but with the gravitational influence of the entire solar system subtracted, as if it were just orbiting the galaxy.

The cutting edge of this is building up astronomical data on ultra-stable pulsars to use as "external" reference clocks far outside the solar system, but the complexity of subtracting the effect of everything the pulsars' radiation beams pass through before they get to us makes it quite challenging. The utility for deep-space navigation has made it an actively funded path of research for at least the last decade (to get a GPS equivalent at lunar distance and beyond, where it rapidly becomes impractical to fly a GPS-like orbiting constellation due to inverse-square radio broadcast power limits; a good radio can pick up GPS at the Moon, but the location precision out at that distance... is not great).

The cosmic microwave background dipole may indicate that we can use it as an absolute frame of reference, but settling that with enough certainty to base an official time standard on it seems some time away, given the state of things between cosmology, astronomy, astrophysics and metrology.


Is sea-level rise predicted to mess with TAI, then?

I really hope we can get something like TCB standardized for computer use with affine transform to human-usable times.


The issue doesn't go away.

> more human friendly formats for display to end users.

This is what's doing the very heavy lifting in your proposal. Leaving aside that we don't know when future leap seconds will occur (and thus get mismatches when broadcasting to different computers that may receive the leap-second information at different times), the sheer fact of the matter is that software developers are users too. They will take shortcuts and display TAI as UTC because "something, something, people are lazy or uneducated."

We do not need leap seconds. We never should have implemented them. They are a scar on our software for potentially hundreds of years already for any application that seeks to have high accuracy over time.

Time is very frequently a join key or part of a join key in a database and these small differences mean countless hours wasted to investigate "couple of record" mismatches.

Just stop using leap seconds. We will be fine.


You've made a very good point. All new software/systems I build will use TAI64. As an industry we should just push this move ourselves


Libraries are already available. See https://cr.yp.to/time.html and the pages linked from there.
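
For the curious, the external TAI64N format described on that page is just a 12-byte big-endian label. A rough sketch of encoding one, assuming the current fixed 37 s TAI-UTC offset (a real implementation would consult a leap-second table instead of hard-coding it):

    import struct
    import time

    TAI64_BASE = 2**62       # label value for the first second of 1970 TAI
    TAI_MINUS_UTC = 37       # assumption: valid since 2017; really needs a leap-second table

    def tai64n(unix_seconds: float) -> bytes:
        """Encode a Unix/UTC timestamp as a 12-byte TAI64N label (sketch only)."""
        secs = int(unix_seconds) + TAI_MINUS_UTC
        nanos = int((unix_seconds - int(unix_seconds)) * 1_000_000_000)
        return struct.pack(">QI", TAI64_BASE + secs, nanos)

    print(tai64n(time.time()).hex())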


Yep, and here it is one in Erlang – Taider https://github.com/secYOUre/taider


Arguably, being able to perform calendar calculations without leap-second information (which you don’t have for the future) is more important in applications than maintaining to-the-second accuracy over calendrical timescales.

What applications and date-time libraries should really do is differentiate between timestamps, calendar plus wall-clock time, and elapsed runtime. In most circumstances, only the latter would really need to be consistent with TAI.
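
A sketch of that three-way split using plain Python stdlib calls (nothing here handles leap seconds, which is exactly why keeping the concerns apart matters):

    import time
    from datetime import datetime
    from zoneinfo import ZoneInfo

    instant = time.time()                            # timestamp: a point on a shared timeline
    wall = datetime.now(ZoneInfo("Europe/Zurich"))   # calendar + wall-clock time, for humans

    t0 = time.monotonic()
    time.sleep(0.1)                                  # stand-in for real work
    elapsed = time.monotonic() - t0                  # elapsed runtime: unaffected by clock steps

    print(instant, wall.isoformat(), elapsed, sep="\n")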


You don't have dst offsets for the future either. Those keep changing. That's not a blocker.

The straight up truth is that we created something in between. Some parts of earth sun alignment are in the time zone abstraction layer and leap seconds are in the seconds count layer. There's no real cause for this and we should have moved to TAI to fix this blunder long ago.


You don't have DST offsets at all in UTC, which is probably how you are storing timestamps, so you can always compute a delta between timestamps if you don't have leap seconds.


That's the whole point I'm making here. UTC and leap seconds could always have been kept separate in the same way timezone+DST offsets are separate. In fact leap seconds could have just gone into the timezone offsets directly and it'd all work fine (we'd probably still call it UTC+10 rather than UTC+10:00:27, but that's not a problem, and to be honest we'd probably not bother at all until the offset was big anyway).

There's no reason this wasn't done many years ago except for a blunder on the part of CGPM which they are now working to correct.


Well, in principle there’s TAI for that. But civil time in most countries is defined in terms of UTC/GMT.


I agree actually. A lot of the blame is on unixtime trying to roughly align to UTC rather than TAI.

But we're here now and we have to fix this. So anything that moves us away from this problem is helpful.


Absolutely! A large part of the blame here rests on Unix time (which is meant to be a monotonic count of elapsed seconds) trying to use a time standard in which those seconds are not in fact monotonic. Unix time is basically "TAI done wrong", and the failure to correct this early on is what hurt: ideally it should have been aligned to TAI instead of UTC, since originally Unix time was not actually aligned to any particular time standard; then in the mid 70s it was decided to align it with the elapsed seconds of UTC as of a fixed date.

That decision just dominoed down through time, causing enough friction that we computer programmers outweighed the metrologists; they caved and abandoned properly keeping UTC in order to stop causing problems for everyone else, thanks to our continued failure to fix the root cause of these issues.


What would using TAI solve exactly? I'm unfamiliar.


TAI is a monotonic clock and isn't adjusted for solar time of day; it could be considered universal, since TAI would be the same between any two points, whereas UTC is adjusted for Earth's rotation, so a theoretical Mars UTC would end up out of sync with Earth UTC.

EDIT: info below is incorrect about UTC not being monotonic, as pointed out in thread but is useful for monotonic vs non-monotonic:

In UTC you can jump forward or back, so it's possible to do an operation after another operation but have a timestamp before it, which is bad for many reasons, top being auditing.

  do operation one at T0
  do operation two at T1
  do operation three at T-1

In TAI it would always be:

  do operation one at T0
  do operation two at T1
  do operation three at T2


Link for anyone interested: https://en.wikipedia.org/wiki/International_Atomic_Time

I'm not sure how this differs much in practice from UNIX time


I'm pretty sure UTC is monotonic too; it's unix timestamps that really are the true mess.


Yeah, you are right: when a leap second is introduced it becomes 23:59:60 (monotonic) in UTC, while with unix time a normal way to handle it is to repeat 23:59:59 twice (non-monotonic).


Leap seconds can be negative.


A negative leap second means that 23:59:59 is skipped, you go from 23:59:58 to 00:00:00, which is monotonically increasing, in both UTC and unixtime.

Positive leap seconds are monotonically increasing in UTC, where you get 23:59:60 between 23:59:59 and 00:00:00, but not in typical implementations of unix time, where 23:59:59 repeats, and with milliseconds you (briefly) go 58.999 -> 59.0 -> 59.999 -> 59.0 -> 59.999 -> 0.0

If you're only counting seconds, then both UTC and unixtime are always monotonically increasing.
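
To make the positive-leap-second case concrete, here is how the two scales lined up around the real leap second at the end of 2016, assuming the common "repeat the last second" POSIX behaviour described above (other implementations step or smear instead):

    UTC label                Unix time
    2016-12-31 23:59:59      1483228799
    2016-12-31 23:59:60      1483228799   (same value again: non-monotonic)
    2017-01-01 00:00:00      1483228800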


Wrong. The very topic of this discussion, leap seconds, are non-monotonic adjustments to UTC.


UTC is monotonic, even during leap seconds. (A positive leap second adds a 23:59:60; no timestamp ever repeats, the clock never moves backwards.)

(Negative leap seconds are similar, and do not affect the monotonic property.)

POSIX/Unix timestamps, however, are non-monotonic. But that's a different timescale.


In what way are leap seconds non-monotonic in UTC?


Datetime storage would consist of two explicit parts: one free from leap seconds (similar to the raw timestamp you get from a GPS receiver), and a description of when leap seconds happen, so that you can transform the leap-second-free timestamp into UTC. That feels like a more robust approach in principle to me.
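
A minimal sketch of that two-part scheme. All names and the table layout here are made up for illustration; a real table would be filled from the IERS announcements, and the conversion is simplified (it ignores how the inserted second itself gets labelled):

    from dataclasses import dataclass

    # (continuous-count threshold, cumulative leap seconds in effect from then on)
    LEAP_TABLE = [
        # e.g. (threshold_seconds, 1), (later_threshold_seconds, 2), ...
    ]

    @dataclass
    class StoredInstant:
        leap_free_seconds: int   # GPS-receiver style: counts every SI second, no leaps

    def to_unix_utc(instant: StoredInstant) -> int:
        """Convert the leap-second-free count to a Unix-style UTC count by
        subtracting the leap seconds accumulated so far (sketch only)."""
        offset = 0
        for threshold, cumulative in LEAP_TABLE:
            if instant.leap_free_seconds >= threshold:
                offset = cumulative
        return instant.leap_free_seconds - offset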


So like a CRDT for time


CRDT = Conflict-free replicated data type, https://en.wikipedia.org/wiki/Conflict-free_replicated_data_...


Hear hear. Computers should store TAI and convert to human time using leap seconds, time zones and DST offsets. I wonder if we could roll the leap second into the timezone and just remove UTC entirely (or rather make UTC an alias for TAI).


> we should be using TAI for all [...] computation

How do you add a full day, if you do not know whether a leap second occured or not?


The same way you add a full year without knowing if a country will change timezones or vote to drop daylight savings: you distinguish between timestamps and calendar dates, and keep updated your timezone/leap second database.


That doesn't answer my question. What I mean is that "a day" is not a fixed length, but depends on the existence of leap seconds. Most days have 24*60*60 seconds, but not all.

Say I have a timestamp in the future, and I want to add a day to it. The first approximation would be to add 24*60*60 seconds. However, if a leap second occurs within that day, the correct thing is to add one more second. But I do not necessarily know today whether there will be a leap second, so I cannot compute the correct timestamp.


You have to differentiate between calendrical and chronological calculations.

So to walk through your requested example…

You store a TAI timestamp of when you originally chose a specific date/time, and then you store either a chronological timestamp (the TAI value) or a calendrical timestamp, which is a struct/dict/etc. storing the desired year, month, day, time, and timezone, and notionally the calendar system (unless you want to assume the Gregorian calendar and risk confusion with Julian calendar dates, which are still in use for astronomical record keeping). With these two values you can calculate what to change in the struct to account for any time change; a day not being a fixed number of seconds is what allows accurate calendar calculations. You increase the day value and make sure you don't have to "carry" a month into the months column.

Yes, this is more calculation when modifying a value compared to just adding a bunch of seconds, but that's the issue at the heart of this problem: calendars and chronological elapsed seconds are fundamentally different, and we computer programmers tried to take a big shortcut by using a single seconds value for both… and thus… here we are with the present state of affairs.
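
The same distinction is easy to see with DST, since Python's standard library ships a timezone database but no leap-second table; the structure of the problem is identical. A sketch (the US timezone and dates are picked arbitrarily for illustration):

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo

    tz = ZoneInfo("America/New_York")
    before = datetime(2022, 11, 5, 12, 0, tzinfo=tz)      # the day before US clocks fall back

    calendrical = before + timedelta(days=1)              # "same wall-clock time, next day"
    chronological = (before.astimezone(timezone.utc)
                     + timedelta(seconds=86400)).astimezone(tz)

    print(calendrical)    # 2022-11-06 12:00:00-05:00
    print(chronological)  # 2022-11-06 11:00:00-05:00  (86400 s is not "one day" here)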


> Storing anything as UTC was a mistake and we should be using TAI

I had to look up TAI. I disagree. UTC exists for a reason. But I am here to tell a war story.

Before I arrived on the scene a customer said they wanted local time in the reports. (As in Wall Clock Time).

The Customer is Always Right. OK?

So the times went into the database as wall clock time.

The designers of the data schema made a simplifying decision to store everything as text.

No time zone went into the text string that described the time. (?? I do not know why. So easy it would have been)

I come along and have to code animations using that time series data.

Very few problems...

As you would expect there are a lot of other problems with that database.


> The CGPM — which also oversees the international system of units (SI) — has proposed that no leap second should be added for at least a century, allowing UT1 and UTC to slide out of sync by about 1 minute.

So, in one century, we'll get 1 minute's worth of drift.

Recall that we all share the same clock within timezones, and 1 minute of drift between atomic & solar clocks is the equivalent of traveling 1/60th of your timezone's width to the east or west ... something many people do every day as part of their commute. _Everyone's_ clock deviates from their local solar noon, and _nobody cares_.

Put another way: (at most) one north-south line in your timezone will have solar noon & clock noon line up. Over time the relative location of that line will move. Fine. Let's not screw with our clocks in an effort to keep the location of that line fixed.


The problem is mostly (or so I’ve heard) that the drift is relevant for astronomical applications, and they rely on time dissemination which is done in UTC. If UTC decides to start deviating from TAI by dropping leap seconds, those applications will be in trouble. I’m sure that the problem is overblown, but this is the reasoning that was put forward in the past.


People who are using time for astronomical applications will simply have to track the offset if they need to. I'm not sure how that's much more problematic than the current situation.


Agreed. _Someone_ has to deal with the drift between astronomical time and earth time, better the class of software that deals with space than every time critical system in the world.


But leap seconds hose up astronomical applications too. They all start with TAI and then apply corrections as needed. There are lots more corrections than leap seconds, but they're applied proportionally, not as leaps.

Leap seconds are a solution in search of a problem.


And of course there are things like Spain being in utc+1 despite sitting almost entirely west of Greenwich.


I can barely wrap my head around the idea of living in UTC when my friends and I already feel like the sun sets too soon these days. We're just so used to 9:30pm sunsets during summer!


> Although human timepieces have been calibrated with Earth’s rotation for millennia, most people will feel little effect from the loss of the leap second. “In most countries, there is a one hour step between summertime and winter time,” says Arias. “It is much more than one second, but it doesn't affect you.”

I find this off-hand comment dismissive and out of touch. The semi-annual switch from Daylight Saving Time to "normal" and back again is absurd, and far from not affecting anyone: studies show that productivity drops for about a week following the change [1], and there is a marked increase in road fatalities after the clocks are adjusted [2].

If anything, this leap second business is what's irrelevant to everybody except a handful of obscure boffins.

[1] https://www.healthline.com/health-news/daylight-saving-can-m...

[2] https://www.boston.com/news/jobs/2016/03/16/daylight-saving-...


agreed. i’m sick of being gaslit twice a year when i go to bed, set an alarm for 8 hours from now, and instead it wakes me after 7 or 9 hours while insisting it’s really been 8 hours.

it happened again to me just a few weeks ago with the DST switch. i was so mad when i learned that everything had felt off the day of the switch because it was off that i just set all my clocks to UTC. no more gaslighting: my clocks all accurately measure intervals with no tricks, which is exactly what i most want out of my clock.


It seems I've got an unpopular opinion reading the comments here, but having some leap seconds from time to time ensures we stay able to manage them. "No leap second should be added for at least a century" ensures there will be a y2k-style reckoning every century. Seems short-sighted to me, even if it is not a game-changing issue, let's be honest.


Why have them at all? How many people do things where they actually matter? At the current rate it'll be ~9000 AD before we hit an hour of offset, and we do hour offsets twice a year, so if an hour off is okay, why is 1 second off not?


> at the current rate it'll be ~9000ad before we hit an hour offset

And the leap year problems of the Julian calendar weren't a problem… until they were. And then good luck coördinating things:

* https://en.wikipedia.org/wiki/Gregorian_calendar#Adoption_by...

* https://en.wikipedia.org/wiki/Adoption_of_the_Gregorian_cale...


The Julian Calendar was good enough for centuries after the empire that invented it had fallen, which I think counts as good enough in general. That said, I'd love to read a story about tracking down some time bug a million layers of abstraction down from the state of tech in 9000AD.


The thing is, we've remade calendars many times in the past millennium and our current calendar is not gospel. Sidereal drift can and should be addressed with a calendar fix, not a clock fix. The clock's a clock, the calendar's a calendar. It doesn't matter if we can make our clocks calendar-like because they shouldn't be anyway.


And that timescale is called TAI. It's always been an option, but for whatever reason, people (and standards) (collectively) use UTC but want it to act like TAI.


> we do hour offsets twice a year, so if an hour off is okay why is 1 second off not?

The actual Unix time does not change when the clocks are switched between daylight saving time and standard time. Only the time zone changes.

When a leap second occurs, the actual Unix time changes, which can lead to bugs, e.g. when a positive time difference comes back as negative. To prevent such issues, a monotonic clock can be used to measure time intervals.
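
For example, in Python the distinction looks like this (a sketch; the wall-clock measurement only goes wrong if the clock happens to be stepped while you are measuring):

    import time

    start_wall = time.time()
    start_mono = time.monotonic()

    time.sleep(2)   # stand-in for real work

    wall_elapsed = time.time() - start_wall        # can be wrong if the clock was stepped meanwhile
    mono_elapsed = time.monotonic() - start_mono   # always a meaningful, non-negative duration
    print(wall_elapsed, mono_elapsed)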


Edit: OK, just re-read some docs, looks like POSIX chose the worst of all options and decrements the timestamp experiencing the same second twice...

Wrong info from before edit: Unix time doesn't change either, it's the number of seconds since 00:00:00 UTC on 1 January 1970, though systems may or may not ignore that and set it to match the UTC time.


Yes, it does. OP is correct. A leap second specifically is an adjustment because “how many seconds since 1970” changes, because the earth’s rotation isn’t constant speed. Unix time definitely changes. If it didn’t, then you wouldn’t hear anything about it and it would just be transparently folded into your time zone database.


By that time we should be an interplanetary species and have a bigger time coordination fish to fry.


Ok let future generations take the hit. Got ya. Selfish ...


How big of a hit is it on the future generation, they can just change their DST, which already happens every few years anyways?


The critical question is whether they'll have to leap forward or fall backward. We simply cannot abide stealing an hour of sleep from our cyborg/transhuman descendants. We'd never live it down.


When longtermism meets leap seconds.


Hopefully by the time they need to make an hour's adjustment, we'll have done away with the anachronism that is DST. So then they'll have to learn to do it all over again from scratch.


The difference should be handled at the same layer as time zones, which is already a mess and is already well-suited to deal with this type of thing.

I'd argue there should be no adjustment until it gets to be +- 15 minutes from UT1. And even then, NO clock skewing, just a time zone adjustment across the board.


It's not a leap-once-a-century thing; it's rejiggering the timezones by an hour every four to five thousand years (either define new zones one hour over or redefine existing ones).

We already rejigger the time zones from time to time, so that's already handled. It's still a source of issues, sure, but it's one source instead of two.

The timezone rejiggering could also be set hundreds of years in advance... rather than the very short notice we get for leap seconds.


Leap seconds are not useful. We do not need to use them anymore. While you are right that more leap seconds make it easier to handle leap seconds, it is just better to not have to worry about leap seconds altogether. It is unneeded complexity that helps no one.


> How, and whether, to keep atomic time in sync with Earth's rotation is still up for debate.

[…]

> The CGPM — which also oversees the international system of units (SI) — has proposed that no leap second should be added for at least a century, allowing UT1 and UTC to slide out of sync by about 1 minute. But it plans to consult with other international organizations and decide by 2026 on what upper limit, if any, to put on how much they be allowed to diverge.

So everything about this hasn't quite been sorted out yet.

At some point there may need to be a reckoning like was done with the calendar:

> Second, in the years since the First Council of Nicaea in AD 325,[b] the excess leap days introduced by the Julian algorithm had caused the calendar to drift such that the (Northern) spring equinox was occurring well before its nominal 21 March date. This date was important to the Christian churches because it is fundamental to the calculation of the date of Easter. To reinstate the association, the reform advanced the date by 10 days:[c] Thursday 4 October 1582 was followed by Friday 15 October 1582.[3]

* https://en.wikipedia.org/wiki/Gregorian_calendar

As annoying as handling a leap second can be, if it happens even somewhat regularly it can be tested more often. Deciding in the future to do a 'one-off' event may be more challenging, both from a coördination point of view and in terms of handling a rare event correctly in (e.g.) code.


The correct answer is obvious — to stabilize the Earth's rotation — but there are a few implementation details to work out.


It would certainly be easier than fixing all of the software


If we simply detonate the moon it would cease affecting the Earth's rotation, thus solving the problem.


You've created the plot for 'Space: 2099'. Don't let Hollywood steal your idea.


Wouldn't work afaict. You'd have to get rid of the mass, not just explode it.


If you detonated the moon, breaking it up into small pieces and smearing them out over its orbit, tidal friction would cease. Tidal friction is a major component of the slowdown of the earth's rotation.


> If you would detonate the moon, break it up into small pieces […]

Close to the plot of a recent Stephenson sci-fi novel:

* https://en.wikipedia.org/wiki/Seveneves


I hated the fact that you had to get 9/10 through the book to find out how it derived its name. The number 7 appears right at the start, but it's a misdirection.


I got about halfway through before I started skimming. Interminable.


> Wouldn't work afaict. You'd have to get rid of the mass, not just explode it.

I'm sure kibwen had in mind a Hollywood explosion, whereby a detonated object ceases to exist, rather than turning into many smaller objects.


“Sir, are you suggesting that we blow up the moon?”


Nope, just explosive rapid deconstruction.


But wouldn't the rotational speed of the earth become more erratic?


Indeed, but all life would be extinguished by the ensuing orbital bombardment, thus solving the problem.


Freeman Dyson figured out how to do that a long time ago; he was more interested in speeding up rotation to disassemble a planet, but the same method can be applied to set the rotation to a given speed.

Just build a planetary motor.

Encircle the earth with ferromagnetic coils, then fire asteroids (similarly banded) at the earth in a series of near misses; the magnetic drag steals or imparts angular momentum depending on the direction of the asteroid relative to the earth.


The drift is sufficiently slow that you can fix it with a "leap hour" when needed -- perhaps once every five thousand years. This could be handled for civil/local time via an adjustment in the timezone definitions (something which happens a few times a year today), leaving the underlying UTC timebase unchanged.


Then let's skip the leap hour too: just let UTC and UT1 diverge up to 86400 seconds and compensate that by changing the time zones when needed. Then, when the drift reaches 86400 seconds, the clock times are aligned again. At this point we just need to re-align the dates too, and this can be done by skipping February 29 on a leap year (which should be quite easy to implement in software).


Is that practical? Drift is slow and takes 1 min per century. Let’s say you can live with 10 minutes of drift. That leaves minutes [10-50] which is 4 centuries vs minutes [50-10] where it’s fine which is 2 centuries.

Also, if the problem were time zones, time zones already support non-whole-hour shifts, so why not just apply the leap second to all time zones? The reason is that that isn't what a leap second is. It's essentially “how much time has elapsed since midnight 1970”, and that number is what leap seconds correct. Time zone offsets don't help you here.


We actually live with about +/- 15 minutes of drift (of local solar noon vs. 12:00 local mean time) over the course of a year. See https://en.wikipedia.org/wiki/Equation_of_time

And timezones one hour wide impose about a +/- 30 minute fixed error (and sometimes larger) on top of that - the difference between local mean time and local civil time.


We have an implicit assumption for definitions of noon and midnight right now. I can't tell if that assumption will be there 1000 years later, but assuming that, UTC and UT1 can't really differ by much more than several hours.


This is the way.


> This could be handled for civil/local time via an adjustment in the timezone definitions (something which happens a few times a year today), leaving the underlying UTC timebase unchanged.

Like I said: it could become a huge coordination problem.

* https://en.wikipedia.org/wiki/Gregorian_calendar#Adoption_by...

* https://en.wikipedia.org/wiki/Adoption_of_the_Gregorian_cale...


it's really just a tzdata update for the vast majority of computer systems. That package updates several times a year. Get the revised rule in there a decade in advance and you're golden.


> it's really just a tzdata update for the vast majority of computer systems.

I don't think you understand how the tzdata works: the tzdata folks only update the contents after the civil authority for a region changes the law.

So first you have to get all the lawmakers to update what the legal statutes say, and then you "just" update the computer systems. The latter is the 'easy' part; it's the former that's the coordination problem.

And as someone who was around for the 2005 DST change, I can say that may also be quite the coordination problem. (Though I think code has gotten much better since then.)


Let them handle it in the future when it becomes an issue, in roughly 500 years (that's when the lack of leap seconds will sum to roughly 30 min). If they still care about having the sun above London at exactly 12 (or whenever it is in UT1), they can handle it.


That's a feature, not a bug. UTC adjustments require world-wide coordination.

Timezone adjustments do not. They are fundamentally a local matter, and just need to be communicated to the tireless maintainers of tzdata, and can happen incrementally any time an area feels local time is too far off solar time.


The main difficulty is convincing Greenwich to not be utc+0 anymore.


Are we actually compensating for continental drift of the area around London already?


My understanding is that clocks in Greenwich are set to utc+1 from the last Sunday of March until the last Sunday of October. (They call it British Summer Time).


And we finished that transition in 1917. Time is hard.


The full resolution can be found here https://www.bipm.org/documents/20126/64811223/Resolutions-20....

To quote the relevant section:

> [the CGPM] decides that the maximum value for the difference (UT1-UTC) will be increased in, or before, 2035

> [CGPM requests that the ITU] propose a new maximum value for the difference (UT1-UTC) that will ensure the continuity of UTC for at least a century

I think there are a few possible interpretations of this:

- We'll readjust UTC in a century (why would you do this to yourself, please no, nobody wants this) by setting a predicted maximum that'll last 100 years

- The maximum is now 1 hour, we'll adjust clocks the same way we adjust for DST

- The maximum is infinite, UTC is now TAI + the same integral offset forever

I'm hoping for the last one, but who knows. They've once again kicked the can down the road to the next 2026 meeting to decide what the increase in max UT1/UTC difference will look like.


Why use leap seconds at all?

Our current calendar was introduced in 1582, 440 years ago [0].

We add a leap second about every 1.5 years [1].

That means in the time since our calendar was invented, we've added less than 5 minutes to our time.

Would anyone notice if noon arrived 5 minutes earlier over the course of 500 years? Especially since the position of the sun varies orders of magnitude more than that simply based on season?

Maybe we could all just agree to add 10 minutes in 3022. If we haven't switched calendars again.

[0] https://en.wikipedia.org/wiki/Gregorian_calendar

[1] https://www.timeanddate.com/time/leapseconds.html


How is this supposed to work? The rotation of the earth is not a constant. So we either scrap UT1 ("the atomic clock") or UTC ("the calendar"). Neither sound like an actual option. The latter would imply that at some (far off) point in the future you wake up at like 10pm as UTC and UT1 are now entirely mismatched (you may as well get rid of all time zones at that point).


TAI - atomic clock, ignores earth rotation.

UT1 - based on Earth's rotation only, strictly 86400 seconds per day; length of each second varies; takes a heck of a lot of effort (and time) to measure accurately.

UTC - has same length of a second as TAI, but (for now) tracks UT1 to a precision of +/- 1 second. To achieve that, can have days that are 86399, or 86400, or 86401 seconds long.

None of 3 is planned for scrapping. The only change discussed in TFA is to fix UTC day to 86400 seconds (at cost of letting it drift further away from UT1).


Oh right, I confused the terminology, but the point still stands mostly.


It's a minor problem. One minute off in a hundred years is something we can all live with.


I'm not saying that we can't, but saying that we can seems quite brave (and by "brave" I mean foolishly reckless) without actually trying it. I, for one, am not so sure. There is a lot of stupid stuff about calendars and clocks (of which DST is worst by far), but at least nothing is really drifting.

Is it important that midnight is actually midnight? I don't know. I would prefer it to be, but perhaps I can just update my knowledge on when the midnight actually is once or twice a year. Perhaps it isn't a big deal at all. But I wouldn't dare to claim that "surely there can be no problems with it".

And, yes, that kinda means I don't support the leap second removal. It wasn't great, but eventually fixing the software written by incompetent people sounds more like a solution than breaking the existing one and making competent and incompetent people alike eventually come up with a new one (only to abolish it as well after 100 more years).


> Is it important that midnight is actually midnight?

Midnight is almost never at midnight.

Let's talk noon, specifically solar noon, as it is easier to visualize. Solar noon varies throughout the year, and based on the position in the timezone.

Solar noon for two cities in the Eastern Time Zone:

  11:29:16 - Boston
  12:22:36 - Atlanta

Tomorrow, solar noon in Atlanta will be 12:22:50, 14 seconds later than today. No one notices a 14 second shift from one day to the next. As you can see, solar noon is almost never at clock noon, yet no one complains about that. Who would notice a 1 minute shift over 100 years?

The programming problem is made vastly more difficult because the leap seconds are not known in advance. When a new leap second is announced, every dependent system has to be updated.


Can GPS handle this? Or is it under a different time standard?


GPS has its own time thing, kinda fixed offset from TAI iirc


The article mentioned GPS uses UT1


I propose waiting a thousand years, so that the difference builds up to an hour, and then handle the adjustment as part of the daylight savings reform, which should just about have worked its way through the U.S. Congress and the EU institutions by then.


No. You'd need to actually change all the calendars. For example, an event recorded as happening at UTC 0 doesn't get changed by a time zone change. And yet the hour you've accumulated is a “real” hour that has accumulated in the number of seconds you think have elapsed since 0. You could of course keep track of when those leap seconds applied, but now you need an almanac to compute the difference between two timestamps, to account for all the smearing.


In my opinion, leap seconds are accumulated so slowly that the impact of abolishing leap seconds won't be noticeable in daily life for a couple hundred years.

As the article said, many countries jump forwards and backwards an entire hour twice a year and after the adaptation period we don't notice


> The CGPM — which also oversees the international system of units (SI) — has proposed that no leap second should be added for at least a century, allowing UT1 and UTC to slide out of sync by about 1 minute.


So there are two things to note. First is that it takes a long time for the seconds to add up, and second is that when we're looking into the deep future the current system of leap seconds won't be enough even if we do keep them.


> The latter would imply that at some (far off) point in the future you wake up at like 10pm as UTC and UT1 are now entirely mismatched (you may as well get rid of all time zones at that point).

We (or our descendants) can try to fix the Earth's rotation. All that is needed is a big enough rotating mass with a variable speed of rotation.


UTC is the atomic clock (TAI minus 37 seconds currently); UT1 is "Earth time".

We would use UTC. UT1 would end up only being used by astronomers and whoever else cares about the Earth's rotation.

By the time UTC and UT1 diverge enough to matter, humanity will have either destroyed itself or come up with a new time standard.


If it gets too bad we can just leap-hour back onto schedule, way easier. (alter the tables that hold the existing offsets - this is done all the time)


I don't understand why this is necessary. We already have TAI (international atomic time) which is just UTC without leap seconds. It sounds like this committee voted to stop adding leap seconds to UTC, but not to "reset" the leap seconds that have already been added, effectively cementing a constant difference between UTC and TAI. What is the point?

Anybody who cares about leap seconds should have just been using TAI all along instead of UTC anyway.


It's effectively a way to migrate to TAI. We're going to end up with UTC and TAI having a consistent offset and with no expectation of UTC trying to account for the earths rotation. That's a good start. It makes the UTC and TAI conversion trivial and consistent. Much easier to migrate at that point.


The committee still wants UTC to be periodically adjusted to match the Earth's rotation. They just want to adjust it less often, where "less often" probably rounds to "never" in practical terms.


Huh, so they want to go from small, frequent anomalies to large, infrequent anomalies. I think "never" is going to be exactly accurate; once we reach 1 minute of offset from astronomical time, everyone will be too afraid to adjust UTC because we'll have had a couple centuries of complacent software written in the meantime. Got it!


> everyone will be too afraid to adjust UTC

Sounds like that is a benefit.

Lets just not adjust UTC. Problem solved.


That makes UTC just a shittier version of TAI. No problem has been solved, because TAI already exists.


A leap minute once per ~century probably amortizes to a lot less effort. A leap minute is a bigger deal to implement, but you do it a lot less often. And the sun being off by up to a minute isn't a huge deal for humans


Hot take: Leap seconds caused by astronomers refusing to modify their own software, instead getting the rest of the world to modify theirs. Too facile?


Yup, too facile.

Astronomy commonly uses TAI or raw GPS time. In fact if you look at video footage from NASA control rooms and such there's usually a GPS second clock up on the wall somewhere.

The motivation for leap seconds wasn't astronomers, but to just keep civil time tied to solar time long term. However, the unpredictability of these leap second additions has proven to be pretty annoying, causing bugs and such. This is why google and others actually "smear" the introduction of their leap seconds over half the day.

Considering the current difference is 37 seconds, its natural to wonder if this is worth it. Certainly most people wouldn't notice relative to dawn dusk for a very long time, long enough that the entire concept probably wouldn't even make sense anymore. So why not just stop? That's the basic argument here.


Doesn't astronomy already use their own time reference standard?

It seems like JPL (NASA) and the scientific community have defined two currently used sets of time standards for observations from Earth and from space near Earth. https://en.wikipedia.org/wiki/Time_standard#Time_standards_f... Barycentric Coordinate Time (TCB) and Geocentric Coordinate Time (TCG) with DE430 as the current revision of their standard https://en.wikipedia.org/wiki/Jet_Propulsion_Laboratory_Deve...


Very much so.

Look at astronomical software, e.g. xephem. There will be a "clocks" tab that displays a number of different clocks: TAI (atomic clock, no leap seconds), UTC (global civil time), mean solar time (GMT, which ISN'T UTC) and finally, derived from the aforementioned, "sidereal time", which is the one you really need to adjust your telescope. Sidereal time is derived from a year with basically one more day, because the earth moving around the sun adds one more rotation of the background stars, which comes to a drift of roughly 4 minutes per day.

https://en.wikipedia.org/wiki/Sidereal_time

Oh, and then there is stuff like the Julian date, which you need in order to look up the myriad catalogues and tables required for corrections, because everything "wobbles" even more than you'd think.

Yes, dropping leap seconds will remove 1 table lookup from the above. But astronomical time systems are so complex that that change is a drop in the ocean.
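
As a taste of that complexity, even the crudest sidereal-time estimate is its own little formula. This sketch uses the widely quoted USNO approximation and feeds it UTC instead of the UT1 it strictly wants, which is fine only while the two stay within 0.9 s of each other:

    from datetime import datetime, timezone

    def gmst_hours(dt_utc: datetime) -> float:
        """Approximate Greenwich Mean Sidereal Time in hours (USNO approximation)."""
        j2000 = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
        d = (dt_utc - j2000).total_seconds() / 86400.0   # days since J2000
        return (18.697374558 + 24.06570982441908 * d) % 24.0

    print(gmst_hours(datetime.now(timezone.utc)))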


(insert link to canonical "standards" xkcd)


Nah. It's common for astronomy to use GPS time and then apply the corrections from BIPM Circular T to get UT1.

For sophisticated astronomy systems, the elimination of leap seconds should simplify things and remove a source of errors. As it is, since Circular T is against UTC, you need to take your local source of time and worry whether leap seconds have been correctly applied (apply leap seconds to GPS time, or make sure you haven't been fed smeared NTP time or other horrors) before you can get UT1.

For fairly casual astronomy, no; but that isn't driving international standards, and there mechanical uncertainties dominate, so you'll end up doing a 1-star correction in any case, which immediately corrects the clock.


More like lazy programmers want everyone to change basic timekeeping practices just so they don't have to fix their code.


Any modern positioning system needs an accurate clock. Leap seconds aren’t completely predictable, so you can’t deploy something like a GPS unit with future leap seconds encoded. These computers are often spread across vast distances, so it’s the most difficult thing to possibly update.


Wouldn't a positioning system just need accurate time differences and synchronization between transmitters?

E.g., suppose you made a system where each satellite transmits messages of the form "Here comes beep #N. At the time of beep #N this satellite is at this position: <X>, <Y>, <Z>. Here is beep <N>: BEEP!".

The satellites are synced so that beep <N> occurs at the same time on all of them.

By looking at the time difference between the arrival of beep <N> from several satellites a receiver could tell the differences between its distances from those satellites. Combined with the position information for those satellites at the time they beeped the receiver would be able to tell its location.

If the satellite broadcasts use a high enough frequency, and that frequency is known and very stable, the receiver would not even need an accurate standalone timer. It could sync a counter to the frequency of the satellite radio.
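
A toy 2-D version of that idea, just to show that only arrival-time differences are needed. All the numbers, the transmitter layout, and the least-squares solver are illustrative assumptions; real GNSS is far more involved:

    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792_458.0  # m/s

    sats = np.array([[0.0, 20_000_000.0],
                     [15_000_000.0, 18_000_000.0],
                     [-12_000_000.0, 22_000_000.0]])   # transmitter positions (m)
    receiver = np.array([1_000.0, 2_000.0])            # the position we will recover

    # Arrival times of one synchronized beep; the common emission time cancels out.
    arrivals = np.linalg.norm(sats - receiver, axis=1) / C
    measured_tdoa = arrivals[1:] - arrivals[0]          # only differences are used

    def residuals(pos):
        d = np.linalg.norm(sats - pos, axis=1)
        return (d[1:] - d[0]) / C - measured_tdoa

    fit = least_squares(residuals, x0=np.zeros(2))
    print(fit.x)   # ~ [1000, 2000]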


Google does "leap smearing" which seems like the best human solution to this problem: https://googleblog.blogspot.com/2011/09/time-technology-and-...

standardizing leap smearing algos and constants could work

- Bottom layer is atomic clock seconds
- We define a targeted relationship between current UTC and the atomic counter that will occur on a given day and time X
- Time is interpolated to drift UTC into place by the given day and time X
- Standards body can adjust time on some regular basis by its relationship to the atomic clock and publish the algo to convert from atomic to UTC
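
A minimal sketch of the interpolation step, assuming a linear smear over roughly a 24-hour window around one positive leap second. The 2011 Google post linked above describes a slightly different (cosine-shaped) smear applied before the leap, so treat the ramp shape and window here as placeholder choices:

    def smear(elapsed_si_seconds: float, window: float = 86_401.0) -> float:
        """Map real (SI) seconds since the start of the smear window to smeared
        clock seconds: 86,401 real seconds are reported as 86,400 clock seconds,
        so the clock never shows 23:59:60 and never repeats a value."""
        if elapsed_si_seconds <= 0.0:
            return elapsed_si_seconds
        if elapsed_si_seconds >= window:
            return elapsed_si_seconds - 1.0
        return elapsed_si_seconds * (window - 1.0) / window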


Leap smearing is awful except for rough synchronization operations that don't care if you redefine a second. Even there it isn't great: even if you know the exact smearing formula, you can't easily tell whether the data you're getting is pre-smeared or post-smeared, or should be smeared but isn't, and then you end up double smearing or inverse smearing by accident.


Maybe it’s technologically difficult from a standards perspective, but it works well for humans. If the computer smears an extra second into my day, I won’t notice.


Hurrah!

Each leap second event causes hundreds of millions of dollars worth of disruption, and that's not including the disruption created by leap seconds even when they're not happening (e.g. the frequent false leap seconds), or the mini-disaster we're sure to experience should there be a negative leap second (which we are still trending towards).

The delays are unfortunate because it's harder to transition applications that need UT1 to use an offset from UTC while the available time sources are still unpredictably and unreliably leaping on you (since to apply a UT1 correction you need your UT1 offset and your UTC source to agree on if and how a leap second has been applied).

From a practical perspective it would be better to immediately discontinue leaping; then UTC would immediately become a stable timescale that adjustments could be applied against, for those few applications that need them. It would also save us from a negative leap second.


So we make things worse for humans in order to make it easier for computers?

Yah, one second doesn't matter, but it builds up.

This: "Or we could even decouple our sense of time from the Sun entirely, to create a single world time zone in which different countries see the Sun overhead at different times of day or night."

That shows they are completely disconnected from human reality: "Science already doesn’t use local times, we talk in UTC."

That's great for science, but people care about day vs night.


> Yah, one second doesn't matter, but it builds up.

I don't think it's that big of a deal; do you really care what the perception of "11 in the morning" is for someone 1,000 years ago? This kind of thing is pretty cultural anyway, and a slow drift over a thousands of years doesn't really matter.

The main reason we have the "new" Gregorian calendar is because of religious reasons, not because people were having huge practical problems with the old (slightly less accurate) Julian calendar.

Plus the current leap second system won't really deal with the long-term drift anyway because the earth's rotation keeps slowing, so in a few hundred years we'd need more leap seconds than the current system allows, and eventually we'd need a "leap second" every day because the day is a second longer (around the year 6000 IIRC).


Religion was only half the reason for the Gregorian calendar reform. Church and state being interlinked and intertwined, the civil calendar was gradually slipping away from correct calibration with the equinoxes and thus the seasons. So it was important for farmers and mariners and politicians to all have a reliable calendar that matched what the Earth was actually going through.

It just so happened that the religious authority in the West was able to make a significant change to the civil calendar which also happened to match the Church calendar, and it was adopted by all of Christendom, eventually even Asia and the four corners of the Earth have accepted it as universal and not so religious.


That's what I was told in history lessons too, but I'm not so sure it was all that important a reason; otherwise it wouldn't have taken hundreds of years for people to adopt the new calendar. The drift is so slow that a week more or less over a period of 1,000 years is easily something farmers, politicians, and the like can slowly adjust to (unlike, say, the Roman calendar before Caesar reformed it, which drifted by several days every year).


> one second doesn't matter, but it builds up.

Less than you think. A part of the leap second is correcting an unpredictable random walk. The random walk part does not add up.

The linear drift part does add up. We're still talking several thousand years to end up with just an hour offset.

Just switching to a new timezone every few thousand years ("As of Jan 1st 6022, fifty years from now, all usage of Eastern timezone will switch to New Eastern timezone, which is an hour ahead.") would handle your civil usage concern fine.

(and then in year 10022 they can switch to New New Eastern timezone, if there are any survivors of world war 5...)


I agree with you quite a bit here, although I'm of the persuasion that this one minute a century drift is nothing, you already deal with orders of magnitude worse when you go visit family in another time zone for example, and you're fine.

I think this goes along the lines of the great DST debate, the US is going to do away with the changes next year and make DST permanent. Not do away with it, keep it, forever.

Fundamentally we create these abstractions, begin to rely on them, and then decouple how we operate from the real world. We created clocks to measure the position of the sun in the sky; now we ignore the sun entirely, and the clocks are god. I don't think that's a good thing generally speaking, but on this particular issue I don't think people are going to really feel anything different happening. It will decrease the engineering burden of constantly maintaining and updating systems, which means decreased resource consumption, with no noticeable impact on the day-to-day lives of anyone except those engineers relying on exact time measurement.


Because we have to distinguish the two concepts. With timezones we already did it: computers save time (usually) in UTC format, and time is converted to the user's time zone only when displaying it (at a higher level).

This is obviously better: when you save a file, for example, you don't care about the region of the user, and thus the save timestamp of the file should really be an absolute number, so if you take your laptop from one country to another your whole system doesn't break because it sees timestamps in the future; same thing when daylight saving is applied and the clock is brought back/forward by an hour.

Leap seconds, if we decide we should keep them (to me, no, because the difference is so little that we would only start to notice it after centuries, and who knows which computer systems we will have by then, if humanity still exists), should really be handled at the time zone level, by shifting by an offset that includes leap seconds, and not by slowing down/speeding up clocks.


Solar noon, or the solar meridian, is the time of day in which the sun is highest in the sky. If our times were completely natural, solar noon would be at 12 PM exactly.

Today, the solar meridian in Madrid is at 12:59 PM, while the solar meridian in Belgrade is at 11:23 AM. This is because Madrid and Belgrade are within the same time zone, in order to more easily coordinate European commerce.

If that's an acceptable tradeoff, I don't really see the issue in drifting something on the order of dozens-to-hundreds of seconds per century.


I struggle to understand how we benefit that much from the same wall clock times mapping roughly to the same times of day across time zones. It seems like there's pros and cons to both ways and there's no clear "better" option when it comes to the human experience. Meanwhile, doing away with timezones also does away with a lot of complexity both in computing and other areas (e.g. scheduling across timezones)


This will help you understand: https://qntm.org/abolish


It's (less than) a minute a century -- why would we not simply say "no corrections except at the century mark, in the middle of the weekend closest to January 31st"? That way:

There's an exact time everyone knows the correction will happen, it's just a question of how big the correction will be.

The correction is always in the same direction -- no "one second forward, then later one second back" shenanigans.

It always happens over a weekend, so no one has to deal with real-time work-time issues.

While maybe some have to work a weekend, at least it's not near the year-end holidays.

It only happens once every 100 years.


> Or we could even decouple our sense of time from the Sun entirely, to create a single world time zone in which different countries see the Sun overhead at different times of day or night.

A very interesting idea, but probably much too progressive.


I mean, if chaos is the goal, then sure... Let me refer you to: https://qntm.org/abolish


> Do normal humans publish "waking hours"? Not typically.

I believe I just fell in love with that idea. If this became a social norm, people would maybe stop trying to reach me in the morning.

And I would know if it's too late to call others.


This can indeed happen once a significant portion of humanity lives in space, and it could trigger the biggest change yet to how we keep date and time.


No need to wait that long - plenty of incoming email has "My working days are: " in their footer. I'm sure time can be added there too


I'm for this. Get rid of timezones, AM, PM, DST, leap years, seconds, all of it! Imagine having to know just one time. It takes a bit to wrap your head around. But only because we've accepted all the confusion for so long. The sun is still going to be "up" in one part of the world and "down" in another. The little numbers on your clock don't change that. Why can't it be the same little numbers everywhere?


At that point you may as well go with beats: https://en.wikipedia.org/wiki/Swatch_Internet_Time


I'm for that as well. Some minor issues that I've thought of. Currently the day changes (Mon -> Tue) at midnight local time. If the day followed UTC, then would California change from Mon->Tues at midnight UTC, which is 4pm currently?


How do you know whether the sun is up or down in a place when all places have the same time?


I see people bringing this question up as a counter-argument for a unified time.

But how do you know whether the sun is up or down in any given place on Earth right now? You probably don't, if the place is a few thousand miles East or West from where you are. You'd have to look up its time zone. And doing that might not be that much different from looking up at what hour the solar noon is somewhere else.


You could do that, sure. Whenever I want to schedule a meeting with someone in Zurich, I could reason “15:00 UTC is about an hour before solar noon for me and — looking it up — about four hours past for him. Reasonable time for a meeting.” This would prove so useful in practice that people would rapidly start referring to times that way when talking to people from abroad: “sorry I couldn’t take your call, it was seven hours past solar noon for me; I was having dinner.” Everyone would memorize their location’s offset from UTC and use it colloquially all the time.

Oh wait, we’ve reinvented time zones!

Basically, the point of time measured in hours and minutes is to be useful to humans in daily life. And it’s useful to have a common language for things relevant to humans.


It’s fundamentally the same lookup.


And measure time in powers of ten while we’re at it. None of this 24 x 60 x 60 nonsense.


I've wanted to do this for years. Every time I mention this to someone I get a surprisingly visceral reaction against it even when they can't think of any real reasons why this would be bad.


I absolutely do not believe that nobody has given you a coherent list of reasons.


I didn't say no one ever has, just that there's a very strong negative reaction even when they don't have reasoning for it


Do they really not have a reason or do they not want to put in 15 minutes of effort to explain something that has already been explained a thousand times?


I've heard about the endless disruption that leap seconds cause for years, and only now have I thought to ask the question: what was so important that the entire world needed to add or subtract individual seconds from the calendar? Seems like you'd need a pretty big justification for something like that, but I only hear the horror stories, not what the leap second was supposed to actually solve.


Mostly due to tidal friction from the moon, Earth’s rotation is slowing down in the long term. If we don’t adjust our clocks, the difference between the solar day and our clocks will slowly grow, quadratically. See this site for more details and nice graphs: https://www.ucolick.org/~sla/leapsecs/

In 1972 it was decided that the best way to compensate for this is to insert (and sometimes remove) leap seconds, so that the difference between UTC and Earth's rotation is kept below one second. A certain subset of astronomical and navigational/satellite applications rely on that condition being true. If this is changed, some decades-old systems, some of which may be critical infrastructure, may have to be substantially modified to account for leap seconds in some other way. The mention of GLONASS in the article is one such example.

In the international standards body responsible for UTC (ITU-R), there was until now no sufficient majority in favor of abolishing leap seconds, due to those concerns. (Never change a running system, so to speak.) By now it has become apparent that the benefits of dropping leap seconds should vastly outweigh the potential drawbacks, at least for the next few decades. But it took some time for that realization, and probably also for some older participants whose minds couldn’t be changed to die off.


>If this is changed, some decades-old systems, some of which may be critical infrastructure, may have to be substantially modified to account for leap seconds in some other way. The mention of GLONASS in the article is one such example.

GLONASS and other GPS-like systems already have to deal with the temporal effects of special AND general relativity in their timekeeping. Pushing leap seconds off onto those already-special-case systems seems like a fair trade for making every other programmer's and sysadmin's life easier. Let the extreme corner cases deal with their problems, not everyone else.


Fair enough. I guess in those days, there weren't as many systems that depended on the assumption of time lurching forward one second at a time.


Few systems actually should need to depend on the clock tick matching actual elapsed seconds, and those which do depend on accurate elapsed time usually only do so short-term (while the program is executing) and don’t need to correlate that time with calendar dates and time-of-day. Most applications could very well live with an intermittently slowed-down time-of-day clock upon leap-second insertion, or a fast-forward upon leap-second removal, and use a different API when they’re interested in accurate elapsed time. But APIs have conflated those concerns, increasing the likelihood for something to go wrong when e.g. unix time diverges from accurate elapsed time.

I like to think that applications should be written under the assumption that the computer could be used by time-travellers, with the clock reflecting the respective changes to “local” time. That is, you would have calendar dates and wall-clock times reflecting the time travel, plus an elapsed-time clock reflecting the CPU’s subjective time. Then design your systems around that. As a consequence, they should also be immune to leap seconds and DST switches. But I guess that’s too much to ask. ;)
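
As a small illustration of the API split described above (just a sketch, not any particular system's design): Python already exposes both kinds of clock, and only one of them is meant for measuring elapsed time.

    import time

    # Wall-clock time: subject to NTP steps, leap-second handling, an admin
    # changing the date -- or the hypothetical time traveller above.
    wall_before = time.time()

    # Monotonic time: only ever moves forward and is unaffected by clock
    # adjustments; use it for "how long did this take", never "what time is it".
    mono_before = time.monotonic()

    time.sleep(0.1)                            # stand-in for the work being timed

    elapsed = time.monotonic() - mono_before   # robust elapsed seconds
    stamp = time.time()                        # calendar timestamp for humans/logs
    print(f"took {elapsed:.3f}s, finished at unix time {stamp:.0f}")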


...but UTC is still offset from TAI by 37 seconds. Any plans to do anything about that, I wonder?


Once leap seconds actually stop TAI-UTC will presumably just have a constant offset. Constants are kind of irrelevant since they are very easy to deal with. Really no different than GPS time being exactly 19 seconds off TAI.
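
A sketch of just how trivial constant offsets are to handle (GPS time is defined as TAI − 19 s; TAI − UTC is 37 s as of this writing, and that value would simply be frozen if leap seconds stop):

    from datetime import datetime, timedelta

    TAI_MINUS_GPS = timedelta(seconds=19)   # fixed by the GPS design
    TAI_MINUS_UTC = timedelta(seconds=37)   # value as of 2022; frozen if leap seconds end

    def gps_to_utc(gps: datetime) -> datetime:
        """Convert a GPS-scale timestamp to UTC using the constant offsets."""
        return gps + TAI_MINUS_GPS - TAI_MINUS_UTC

    # A GPS-clock reading maps to a UTC instant 18 seconds earlier.
    print(gps_to_utc(datetime(2022, 11, 18, 12, 0, 0)))   # 2022-11-18 11:59:42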


I wonder: Why is Daylight Savings/Normal Time not implemented using monotonically increasing hours? The day would go up to hour 23 one time of year and hour 25 the other, instead of doing a clock hour twice. It would be weird to have 23-hour and 25-hour days. But this seems to be how leap seconds are handled (one 61-second minute)?


That could probably work, but I'm glad that is handled by changing the timezone instead. Otherwise inventors of unix and ntp times would have got the bright idea to rewind or forward those times by a full hour twice a year.
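
You can see that design in e.g. Python's zoneinfo: during the autumn "fall back" the same local wall time occurs twice, but the underlying UTC instants (and hence unix time) keep marching forward; only the offset changes. A small sketch:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    tz = ZoneInfo("America/Los_Angeles")

    # 01:30 local happens twice on 2022-11-06; `fold` picks which occurrence.
    first = datetime(2022, 11, 6, 1, 30, tzinfo=tz, fold=0)   # still PDT (UTC-7)
    second = datetime(2022, 11, 6, 1, 30, tzinfo=tz, fold=1)  # now PST (UTC-8)

    print(first.astimezone(timezone.utc))    # 2022-11-06 08:30:00+00:00
    print(second.astimezone(timezone.utc))   # 2022-11-06 09:30:00+00:00
    # The UTC timeline never rewinds; only the local offset (the label) changes.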


The first real program I wrote was a forum in perl. But I didn't want to use "advanced" features like "use" (perl's word for importing a module). It just seemed too magical.

So, I had limited time manipulation capabilities. I tried to write all the date handling stuff by hand (with a lot of "if" statements for the special cases). I recall trying to make leap seconds work, but not sure if I actually did. It worked well enough for my purposes.

Also, no database, so I did it all with flat files. Worked better than you might expect (thanks to built-in flock()), but I wouldn't recommend it.


The same CGPM 2022 conference also resolved to give standard prefix names for 10 to the 27 and 30:

    power   prefix   symbol
    10^27   ronna    R
    10^−27  ronto    r
    10^30   quetta   Q
    10^−30  quecto   q
c.f. https://www.bipm.org/documents/20126/64811223/Resolutions-20... (Resolution 3, English version on page 23)


We have a GPS time server (so stratum one) that is set to GPS time, so it's off from UTC by almost twenty seconds because they don't ever apply leap seconds. I love it. Another part of the org pressed me to correct my time to match UTC, but I'm so happy to be done dealing with time changes that I just replied "monotonic time" until they got their own. A few grand to not have those problems is such a bargain!


I never really understood why we need leap seconds. Or better: why we need to bother with them in a computer system at lower level.

If we decide that we absolutely need to keep our time in sync with the rotation of the earth, really what should be done is define a timezone with all the leap seconds applied, and use that timezone to only display it to the end user. Not change the way we sync computer clocks for no reason! NTP shouldn't contemplate leap seconds, for example...
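
A minimal sketch of that "apply it only at display time" idea, assuming the machine clock runs on a uniform, leap-free scale and assuming a hard-coded offset (a real implementation would look the offset up per-instant from an IERS/tzdata table; the 27 here is just the number of leap seconds inserted since 1972, as of 2022):

    from datetime import datetime, timedelta

    # Hypothetical constant: the leap seconds the uniform machine clock ignored.
    ACCUMULATED_LEAP_SECONDS = timedelta(seconds=27)

    def display_utc(machine_time: datetime) -> str:
        """Render civil (leap-second-adjusted) time from a leap-free machine clock."""
        # The machine clock never repeated a second, so it has run ahead of UTC.
        return (machine_time - ACCUMULATED_LEAP_SECONDS).isoformat()

    print(display_utc(datetime(2022, 11, 18, 12, 0, 27)))   # -> 2022-11-18T12:00:00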


We don't. We were over achieving when they were established.

The cost/benefit at the time didn't look so bad because the world wasn't full of distributed synchronized computer systems, so the added 'cost' of leap seconds was just some make work for geeks in national timing labs.

The cost benefit is very different today.

We can keep civil time roughly aligned with the sun by moving timezones an hour every four to five thousand years.

Applications that want to give accurate sidereal time or very accurate sun-up sun-down can use predictions of UT1. Bonus: it's a lot easier to give an accurate UT1 when you don't have to worry that leapsecond (mis)handling has screwed up your underlying clock.

That's another part of the cost model that has changed: When leap seconds were created it would have been burdensome to carry around an additional offset in time transmissions for those few applications that want a more accurate mean solar time. But today it's fairly easy.


There was a proposal to the UN in the 1950s by a German mathematician to replace the current calendar with a decimal-based system, in which leap YEARS would not be needed either: everything would be divisible by 100. I can't remember where I heard this, but the anecdote goes that he got a reply saying thanks for the proposal, but it is not feasible to introduce such a massive change globally, irrespective of the proposed improvements.


That is mixing up quite a few different things. The World Calendar [1] was proposed in the League of Nations/United Nations, but it was created by a US person. There was a slightly earlier proposal, the International Fixed Calendar [2], from a British person, that also had some popularity. Neither of these was a decimal calendar though; that is something from the French revolution [3], and even they did not really manage to make it very decimal.

Afaik the most vocal opposition to reforms came from the US

[1] https://en.wikipedia.org/wiki/World_Calendar

[2] https://en.wikipedia.org/wiki/International_Fixed_Calendar

[3] https://en.wikipedia.org/wiki/French_Republican_calendar


I'm baffled by the section about GLONASS. Russia surely has the ability to decide whether they add or remove future leap seconds?


In prior advocacy for the elimination of leap seconds, I found that many of the officials appointed to talk about this stuff are fairly clueless about the engineering.

I wouldn't be too shocked if they asked "what uses leap seconds", got an answer, and are advocating on that basis ("We have systems that use leap seconds!"), without having actually asked better questions like "What would require expensive changes if no more leap seconds were issued?"


Of course they do. But government bodies and other institutions created these standards organizations in order to better coordinate with one another, and decided and committed to deferring to them a long time ago. Russia is a sovereign state; they can make any decision they want to, but they decided and agreed at some point to defer to this organization for a reason.


> Russia is a sovereign state, they can make any decision they want to, but they decided and agreed at some point to defer to this organization for a reason.

I think the commenter means that there isn't a technical reason that GLONASS can't be changed to not add leap seconds, not that Russia doesn't have to defer to an international body.


> Leap seconds aren’t predictable, because they depend on Earth’s natural rotation.

This is surprising to me. Is the Earth's rotation so arbitrary?


Yes it varies unpredictably based on changes in the distribution of mass within the Earth from magma flows, air currents, ice pack movements, ocean currents, and so on.

https://en.wikipedia.org/wiki/Day_length_fluctuations


It's more that the precision involved, ± 0.5 seconds per year, that is to say about 15 parts per billion, is what's ludicrous. For comparison, time acceleration in geostationary orbit due to general relativity is somewhat less than one part per billion. Household tools (other than clocks) rarely go beyond a percent, and if you need a particular quantity you don’t have a ready-made tool for, you're likely going to be hard-pressed to go beyond ten percent.

The more precise you want to be, the more complex and numerous physical phenomena you need to consider become; with a system as involved as the Earth, at some point you get to weather-like chaotic behaviour, so no practical amount of additional precision in input data helps anymore.
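
For what it's worth, the arithmetic behind that parts-per-billion figure:

    SECONDS_PER_YEAR = 365.25 * 86400   # ~31.6 million seconds
    precision = 0.5 / SECONDS_PER_YEAR  # the +/- 0.5 s per year figure from above
    print(precision * 1e9)              # ~15.8 parts per billion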


Nearly all things in nature spite our systems.

The earth even speeds up and slows down in response to stuff like earthquakes and volcanoes.


It doesn't help that the atomic second was defined using old data such that when it was defined we were already relatively far off from 86400 seconds/day. A relatively large part of the correction being applied by leap seconds comes from this initial offset, which is also why all leap seconds so far have been in one direction. (and why the handling of the other direction is essentially completely untested, making the prospects of a negative leap second in near-ish future pretty frightening).
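
A toy calculation of how quickly even a constant excess accumulates (the 1.5 ms/day figure is just an assumed illustrative value in the right ballpark, not a measured one):

    EXCESS_PER_DAY = 1.5e-3            # seconds by which the mean day exceeds 86400 s (assumed)
    per_year = EXCESS_PER_DAY * 365.25
    print(per_year)                    # ~0.55 s/year -> a leap second every couple of years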


Basically, things like the weather can influence the speed of rotation. Angular momentum changes when mass is closer to the center of gravity or farther out. That measurably includes mass like water in clouds and leaves on trees.


It would surprise me if we could not measure and detect any variation in the earth’s rotation. A leap second is on the order of less than 0.000001% variation, at least according to the leap seconds applied recently.


They suggest world wide UTC, saying that the hours of noon, sunset, and sunrise don't really matter. But the real problem of UTC around Australia isn't the hours but the date.

It is really convenient that the date changes whilst we all sleep. Having it change in the middle of the day would make 'today' and 'tomorrow' weird.


Man, between this and the Ronna and the Quetta, it feels like science is getting its admin done today!


That’s because both came from the 2022 General Conference on Weights and Measures of the BIPM.

https://www.bipm.org/documents/20126/64811223/Resolutions-20...


Well Quetta gets in the news for different reasons for once


There was talk of REMOVING a second soon because the Earth's rotation had sped up a bit. Physicists haven't figured out the cause of the speed up. The previous slowdowns were attributed to tidal friction and global warming (expanding seas and atmosphere).

https://www.forbes.com/sites/jamiecartereurope/2022/08/03/do...


Can anyone explain what the alternative is? Surely we won't just get more and more out of synch with the Earth's orbit?

Also why does it say the earth is slowing down but this year it sped up. Sounds quite impossible?


The slowdown is the long-term trend caused by drag of the Moon's gravity, but the speedups are sporadic and as a result of events such as massive earthquakes that cause the Earth's rotation to speed up as some significant mass falls further to the Earth's center. The conservation of angular momentum is the principle that leads to those brief speed ups, but they will not fully counteract the longterm drag caused by the Moon.

Edit: TLDR; Local maxima vs. overall trend.


>Surely we won't just get more and more out of synch with the Earth's orbit?

Why does being out of sync with the Earth's orbit matter?


Because humans use time as an indicator for day and night


We already accept an error of half an hour (most timezone offsets are multiples of a whole hour) or more (many timezone boundaries are widened to align with state or country boundaries). A couple of minutes is nothing next to that, and if it starts getting too far, we can simply redefine the timezone offsets (which we already do twice a year in many places, and any device which might be moved across timezone boundaries already has to deal with that).


Also, historically, we did care about the equinoxes and solstices falling on specific days of the calendar.

We may not care anymore, but emphasis on may.


And you can still use UTC as your indicator for day and night. In 100 years, it would only drift around 1 minute.


We don't really do that on a second accuracy though. We need to add a whole new day to the calendar every four years to make up the difference. Meanwhile we've added less than half a minute worth of leap seconds.


Most systems rely on network time from a canonical time source. Wouldn't it be enough to do the leap second smearing/adjustment there to free the end systems from dealing with that?
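
That's essentially what some public NTP services already do. A rough sketch of a 24-hour linear smear (the window length, its placement, and the example leap-second timestamp are assumptions here, not any particular provider's spec):

    # The reference timeline below counts seconds uniformly (no leap second).
    # Across a 24-hour window ending at the leap second's insertion point, the
    # served clock lags linearly by up to one second, so that afterwards it
    # agrees with UTC, which ends up one second behind the uniform count.
    LEAP_EPOCH = 1483228800.0      # example: 2017-01-01T00:00:00Z as a unix timestamp
    WINDOW = 86400.0               # smear the whole second over the preceding 24 h
    START = LEAP_EPOCH - WINDOW

    def smeared(uniform_time: float) -> float:
        """Timestamp served to clients for a given uniform (leap-free) time."""
        if uniform_time <= START:
            return uniform_time
        progress = min((uniform_time - START) / WINDOW, 1.0)
        return uniform_time - progress      # withhold up to one second, linearly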


I love how "the world" voted to end the leap second, before actually coming up with how to deal with scientific desync (Real world: irrelevant. Space science: kinda important)


I trust we will also be getting rid of leap years: that whole pausing the calendar for a day thing is very confusing.

Oh wait, that's not how it works. And neither is it how leap seconds work.


Those are very different scenarios, they aren't equivalent.

Leap years have to do directly with the sun, what day and time the equinoxes happen every year and the like. This time noticeably drifts every year, even every day, computers or not.

These leap seconds and what not have to do with very sensitive instruments measuring time very precisely. These time measures are entirely about machines.


That's simply not true at all, and that was my whole point. Leap seconds have nothing to do with precise measurement because they are nothing more than calendar adjustments, just like leap days. They have the same purpose too; leap seconds keep sunrise and sunset in the right place over long periods of time just like leap years keep equinox and solstice in the right place. Note that leap years, an invention millennia old, account for drift on millennial scales, with the 400 year rule for example. This disproves the theory that there is some modern obsession with exactitude which sets leap years and leap seconds apart, because leap seconds too deal with significant drift on the same scales.

Only the madness of "smearing seconds" and other workarounds for broken software with incorrect calendar implementations makes them seem different.


Note that what the article actually says is that there will be no more leap seconds starting in 2035. There could easily still be more before then.


Imagine how we would feel if we adjusted lat-long definitions instead of smearing or leaping time in order to adjust for the slowing down of the rotation of the earth.

Does that sound bananas to you?

Of course it is. That's what we're doing with leap seconds. Only some people smear and some people leap and they don't even do it over the same time or in perfect synchronicity.

Just ditch the leap seconds. They are not worth the cost.


Suggestion:

Move what is leap-second independent to TAI, the atomic clock basis -- google, finance, etc. now happy

Keep UTC etc. as is, since offsets happen to every other human clock all the time anyway for political or summer-time or country-boundary changes -- everyone else happy

And calculating UTC as offsets from the consistent base of TAI sounds like the way to do it


And yet we still have winter and summer time. Is this also a problem? With different rules for each country.


I didn’t vote for this.


Is this going to make dealing with time/time-zones/etc even more difficult in code now?


Well that didn't last long, did it? It was introduced in 1972.

In years to come historians will be having a good old chortle about how we managed to come up with two times that were 37 seconds apart.

In retrospect, fiddling with computer clocks like that was bound to be a nightmare.


I've never really understood why smearing the leap second is such a big deal. Surely if you have software that is going to be sensitive to small variation in the clock over 24h, you're already not using wall time.


I wonder if the idea of giving an irregular offset at regular intervals is in the cards. Having an unknown offset of x seconds every century, say, seems easier to implement than a known offset every unknown interval.


It seems odd to me that the decision was only not to solve it this way, without saying how it will be solved going forward. Like, doesn't that effectively make everything more complicated/difficult?


As another commenter said, we don't get enough leap seconds for it to have a noticeable effect in real life. The solar sync (in this time scale) is really only relevant for space science and they probably need a better time than "adjusting in discrete 1-second-steps" anyways and are much more limited in scope.


whoopie. Can we get rid of daylight savings time? That one actually kills people

https://www.newscientist.com/article/2344401-annual-us-clock....


The earth sped up slightly the last couple years after only slowing down for a long time.

The great leap-minute crisis of 2147 will be interesting.


Great, now if we could also stop using GMT in the UK and stick to British summer time year round that would be great


Does anyone find it weird that the General Conference of Weights and Measures has acronym CGWM instead of GCWM?


It's CGPM, from French "Conférence générale des poids et mesures"


This is also fairly common with international bodies' initialisms when there are multiple official languages, so as not to favour any one of them.

"ISO" is the International Organization for Standardization, in English, Organisation internationale de normalisation in French, and Международная организация по стандартизации in Russian, its three official languages, as one fairly well-known example.


Yay! Another condition to take into account when working with time series and sensitive data across timezones!


Let's fix the root of the problem and make Earth's rotation constant. How hard could it be? /s


Wait until you hear about calendrical variability on Venus.


Why would Russia need 17 years to modify their satellites?


Now collectively do the same with daylight saving


We're going to regret this in 10,000 years.


Just use variable speed seconds.


The problem only arose when the second was changed from being 1/86400th of a solar day to being defined by atomic clocks. The old definition worked well for civil purposes but was by the 1950s impossible to accurately co-ordinate between researchers.


We have to make adjustments when coordinating anyway since local time anywhere outside sea level needs relativistic corrections to derive global time, so might as well also periodically introduce another adjustment factor based on changes in the rotation period of the earth…


They're just doing this so Twitter won't break, aren't they?


I didn't get a vote. Did you?


wait, does this mean there will be no more leap days/years starting from 2035?


How does the world vote?


Did you read TFA? It's literally the first line of the second paragraph.

The decision was made by representatives from governments worldwide at the General Conference on Weights and Measures (CGPM) outside Paris on 18 November.


So not the world? Title is misleading.


Nobody is confused by this, except those who are trying to be.


I am not confused, nor trying to be. It is a factual statement.


I would only want qualified people and not laymen voting on this. It's technical, not social. It has little bearing on anyone's lived experiences apart from scientists and engineers.


It's a totally valid synecdoche. "The world votes..." doesn't mean "every human on earth votes...", it means "a worldwide body votes...".


Quite, although even “every human on earth votes” isn’t the most literal interpretation: the planet becoming sentient and itself voting would be.

Of course, that’s an absurd interpretation.


It is "a valid synecdoche" only as long as we accept that doublethink and newspeak are desirable. Sure, calling black — "white", dictatorship — "worldwide democracy" and so on… everyone will adjust. And by "will" I mean "already do and always did" — there's nothing new about this, it's just the "truths" we are supposed to believe in are what changes over the centuries, not the way societies work. Perhaps it's even true that there is no other way (which doesn't make me like it any more).

So it's totally true that no one is actually misled by this, but I absolutely understand those who try to pretend they are, and have a slight disdain for those who try to defend this bullshit.


I... uh... it's a normal phrase. Dunno what to tell you. You're not gonna get a lot of takers on this being "bullshit". Feels like you're mad about something big and taking it out on random irrelevant things.

There's maybe an argument to be made that using language which implies the legitimacy of organizations as presenting "all of us" is a sort of supremacist way to act, because it casually legitimizes whoever happens to be in power without questioning where they got that power or whether they earned or deserve it.

But, like, it's a standards body. If the world didn't have one it would want to go and make one and then be back where we started. This isn't the interesting battlefield for that kind of point.


>representatives from governments worldwide


> The decision was made by representatives from governments worldwide at the General Conference on Weights and Measures (CGPM) outside Paris on 18 November



