No leap second will be added tonight (obspm.fr)
136 points by ColinWright on June 30, 2014 | 76 comments



Poul-Henning Kamp's article "The One-second War (What Time Will You Die?)" [0] is a great read. I certainly was unaware of the background, and how it can affect actual systems.

[0] http://queue.acm.org/detail.cfm?id=1967009


Another great article on the topic, from American Scientist: "The Future of Time: UTC and the Leap Second", available either from the magazine (behind paywall): http://www.americanscientist.org/issues/feature/2011/4/the-f... or on arXiv.org: http://arxiv.org/ftp/arxiv/papers/1106/1106.3141.pdf .


I'm not sure about the etiquette of arXiv links, but I have always preferred being taken to the abstract, so that I can decide whether I want the paper: http://arxiv.org/abs/1106.3141. It's just a click from there to the PDF (or other formats, if there are any).


That's a great article. Thanks for linking it.


I'm getting a 500 server error.


I did not know that the speed of the Earth's rotation is affected by earthquakes[0] and the weather[1], hence the need for leap seconds.

0:http://en.wikipedia.org/wiki/Earth's_rotation#Changes_in_rot...

1:http://en.wikipedia.org/wiki/Fluctuations_in_the_length_of_d...


Another important effect is the transfer of angular momentum of the Earth to orbital momentum of the Moon by tidal forces.


A fun, related read on what Google does when they must introduce a leap second to their fleet of servers: http://googleblog.blogspot.com/2011/09/time-technology-and-l...
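For the curious, here's a minimal sketch (in Python) of how such a smear could work. The 24-hour window, the linear ramp, and the names here are illustrative assumptions, not Google's published parameters:

    # Linear "smear" of a positive leap second: instead of stepping the
    # clock (or showing 23:59:60), slew it over a window so that it ends
    # the window exactly one second behind a clock that ignored the leap.

    SMEAR_WINDOW = 24 * 3600.0      # length of the smear window, in seconds
    LEAP_EPOCH = 1341100800.0       # Unix time of 2012-07-01 00:00:00 UTC

    def smear_correction(t):
        """Seconds to subtract from a leap-ignorant clock at Unix time t."""
        if t <= LEAP_EPOCH - SMEAR_WINDOW:
            return 0.0
        if t >= LEAP_EPOCH:
            return 1.0
        return (t - (LEAP_EPOCH - SMEAR_WINDOW)) / SMEAR_WINDOW

    def smeared_time(t):
        return t - smear_correction(t)

An NTP server handing out smeared_time(t) never exposes a discontinuity to its clients; the price is that it is up to a second away from true UTC during the window.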


Or, you could just run your computer's wall-clock display on UT1, which is effectively UTC with the leap seconds "smeared" out over half a year. The deviation of the UT1 second from the UTC second is smaller than the precision of most computers' clocks.


Interesting that they don't keep them in TAI. I would expect having the hw clock of the PC be in TAI to be the most pain-free option. Or is that not possible?


There's very little OS- or application-level support for it. Adding support would probably take more effort than convincing governments to abolish leap seconds.


Abolishing leap seconds is a terrible option. In a few thousand years' time, official time will have shifted noticeably from solar time, which is not acceptable to society. It's just code. We should draw up a transition plan to using TAI in computers.


In software engineering, batching changes is almost always good: once you're already looking at some code, migrating a database, or profiling a section, it's very convenient (economically speaking) to knock out any similar task of reasonable size in the same area.

Leap seconds are the opposite; they spread the time changes out as much as possible, adding single seconds whenever needed. If they went for minutes instead of seconds, a leap minute might be added once per century (or less often); entire generations of computer scientists would never have to deal with it, and Google could come and go (for good) without ever having to shift their clocks.

Humans don't really care if you add a minute at midnight on New Year's Eve; in fact, it could even be fun to count down twice for the new year. But for computers it would be a massive saving of engineering effort.


You have valid points there. But in that case, I think using the existing system of time zones (i.e., leap hours) may be a better idea than introducing a somewhat new mechanism of leap minutes. Also, because leap seconds are accelerating, in a few thousand years' time we'd be adding leap minutes every few years, so it's better to postpone the problem even further into the future.


Time is already "wrong" by an hour or more in many places: http://poisson.phc.unipi.it/~maggiolo/index.php/2014/01/how-...

I'd rather have no leap seconds until the accumulated error reaches 30 minutes and then modify time zone offsets (which is already done twice a year anyway).


The "broadcast time" (and NTP time) doesn't need to have leap seconds at all, you could consider UTC as a timezone too. You already need a table to convert between UTC and localtime, so why not use a table to convert between machine time and UTC too? http://www.ucolick.org/~sla/leapsecs/right+gps.html


This just pushes the problem somewhere else. Smoothing leap seconds gives a clock skew of 500 ms and requires no changes to software. Handling leap seconds correctly in UTC requires checking millions of lines of code and adding a new way to query the clock. Using TAI causes interoperability problems with other systems; there are 35 seconds between TAI and UTC, and now you have to check millions of lines of code to make sure you correct for leap seconds when you display time to the user. Ugh. Smoothing is the second-best option from a software engineering perspective. Whining about it until the IERS stops using leap seconds is the first-best option; the astronomers will have to deal with it.


A great little bit of fiction, if you want to revel in the sense of how weird it is that our measures of time are effectively "by fiat": http://slatestarcodex.com/2013/11/03/the-witching-hour/


Dates are decided by fiat (always with a bias toward keeping astronomical events in sync); timespans are very well defined and do not change at all.


Elapsed time does not change, but the current time does. We measure elapsed time with a presumably stable reference -- the microwave radiation of a caesium transition -- but we want 'now' to be based on the Earth's rotation and position with respect to the sun, and that rotation is gradually slowing. Atomic vs. astronomical time.


Was hoping to read something that would blow my mind, but didn't :(

TL;DR for those who want to read it but don't really have the time: in the future, some civilization has only a few objects left from our present time, among them digital clocks. At some point they notice the clocks going forwards in spring and backwards in autumn (DST, presumably, but they don't know that). They think the previous civilization (us) modified time, because the clocks are out of sync with the sun clocks and the stars. So it must be that we created an hour of our own: nine thousand hours a year from gods and one hour from men. The gods noticed what we did, killed us all, and that's why we're no longer around.


It's a cool story. How did you manage to read it and come out jaded?


One thing I've never understood is why we have to shift computer time to match celestial bodies. We have a definition of the second (in terms of a Caesium atom) and we've agreed on the epoch, so why can't we just leave it be?

I understand that if we want to know the time in 'human language' (in terms of the earth's rotation around the sun, and the specific timezone that we're in) we then have to apply these corrections. Surely that's simpler than every timekeeping device on the planet having to adjust a second? Electronic devices are much more likely to be sensitive to a second's difference than us humans are.



I think more of the internet should be displayed in plain text like this!


One of the nicer things about RFCs.


Except for that indentation (it tries to make it look like a letter, I guess).


Is there a reason why a leap second wouldn't be added if the difference between TAI and UTC is 35s?


Leap seconds are added to UTC to keep it within one second of UT1; the drift from TAI is unbounded.

In TAI, each second is the same length, and each day lasts the same number of seconds; UT1 is (conceptually, approximately) the solar time at Greenwich, so the length of its seconds depends on the rotation of the Earth; UTC has seconds of fixed length, like TAI, but occasionally has more or (potentially) fewer seconds in a day, to keep in sync with UT1.

If you used TAI, eventually the sun would not be overhead at noon -- and after many millennia, the sun would be up during the "night" and down during the "day" (after around 70 000 years if leap seconds continue to be added at the same rate, but the whole point is that they aren't -- at the moment they seem to be accelerating).


Thank you for explaining the various time standards concisely. I think a lot of people don't know about TAI, but would prefer it to UTC if they did. Leap second handling has been responsible for some pretty serious software bugs. It's so bad that Google doesn't use them internally.

I think it makes the most sense for computers to use TAI internally, and have UTC as a time zone on top of that. Whenever there's a leap second, just push out a TZ update. Most applications don't have special code for time zone updates, which means a lower likelihood of bugs.

Using TAI internally would have another advantage: Computers with GPS receivers could just add 19 seconds to GPS time to get a very accurate TAI. (GPS time is defined to be 19 seconds behind TAI. That was UTC in 1980.)
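In code that relationship is fixed arithmetic plus the leap table (a sketch; the 35 s value is TAI - UTC as of mid-2014 and changes with every leap second):

    GPS_TO_TAI = 19     # seconds; fixed by the GPS specification
    TAI_TO_UTC = 35     # seconds as of mid-2014; must track leap seconds

    def gps_to_tai(gps_seconds):
        return gps_seconds + GPS_TO_TAI

    def gps_to_utc(gps_seconds):
        return gps_seconds + GPS_TO_TAI - TAI_TO_UTC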


Using TAI internally would be much better than what we have now. But another problem is that leap seconds are only announced 6 months in advance, which is not enough time to update all computing devices (including embedded ones, etc.). See Poul-Henning Kamp's article linked elsewhere in this discussion for an idea on how to address that.


Actually the most sensible thing would be to just use UT1 for wall-clock time. The rate deviation of the UT1 second from the TAI second (`d/dT [UT1(T) - T]`, where `T` is TAI) is small enough that your average time signal distribution system (NTP, radio time signals, etc.) cannot even resolve it (less than a microsecond). Yes, with GPS time distribution or by checking against an atomic clock you can resolve it. But it doesn't really matter (even if you go the Google way and "smear out" a leap second, the deviation is less than a millisecond, which is below the resolution of most operating systems' scheduler tick length).

Leap seconds are stupid, because they create discontinuities. Here's a nice picture: http://hpiers.obspm.fr/eop-pc/index.php?index=leapsecond&lan...

At home I hacked an OpenNTP-based time server to hand out UT1 instead of UTC. Right now it's based on a simple software patch.

I have yet to write a program to fetch the Bulletin B reports and parse them (ftp://hpiers.obspm.fr/iers/bul/bulb_new/bulletinb.dat). I don't know what they're smoking over there, but they give the UT1 deviation relative to UTC instead of TAI. So first you have to get from TAI to UTC (respecting the leap seconds, gah), just so that you can then go to UT1. Why not simply publish the value of UT1 - TAI instead? (The conversion itself is trivial; see the sketch below.)
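A sketch of that conversion, just to make the gripe concrete (35 s is TAI - UTC as of mid-2014 and has to track future leap seconds):

    def ut1_minus_tai(dut1, tai_minus_utc=35):
        """UT1 - TAI, given Bulletin B's UT1 - UTC (DUT1) in seconds."""
        # UT1 - TAI = (UT1 - UTC) - (TAI - UTC)
        return dut1 - tai_minus_utc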

Anyway, every time a report comes out, the clock rescaler in the UT1 time server gets adjusted appropriately (manually).

In the long run I plan to turn this into a small embedded project that uses GPS time to stabilize a high-precision temperature-compensated crystal oscillator (TCXO) [1]; maybe I'll add a rubidium clock too, just because. And this is then to act as a stratum 1 time server on my local network.

----

[1] From my work I know that those have far less short-term jitter than "naked" rubidium clocks; most precision frequency standards that come in a nice box with buttons and a display are actually TCXOs that are frequency-stabilized by a rubidium clock.


> after around 70 000 years if leap seconds continue to be added at the same rate

Apparently it's only 3000 years:

"Leap seconds are not a viable long-term solution because the earth's rotation is not constant: tides and internal friction cause the planet to lose momentum and slow down the rotation, leading to a quadratic difference between earth rotation and atomic time. In the next century we will need a leap second every year, often twice every year; and 2,500 years from now we will need a leap second every month.

On the other hand, if we stop plugging leap seconds into our time scale, noon on the clock will be midnight in the sky some 3,000 years from now"

http://queue.acm.org/detail.cfm?id=1967009


Defining time based on "sun directly overhead at a particular place at noon" is, itself, a holdover from navigation and seafaring. The time difference between when the sun is directly overhead where you are and when it is directly overhead at a particular reference place corresponds to the longitude difference between the two places.

That's the historical reason why we have a time definition that is locked to the rotation of the Earth. Importantly, anyone navigating on that principle will get skewed answers if the clock time of noon at the reference place changes.


Interesting; do you have a source for that? I'd have thought that it would be most useful for people on land, too, to have noon at a well-defined time.


It's not a question of moving noon around by hours, but moving noon around by minutes or seconds.

It's not sourced; it's what I remember of navigational history. Whether solar noon is at 11:45 or 12:15 makes little difference for your personal life, but observing solar noon at 11:45 vs. 12:15 is a difference of 1/48th of the way around the globe longitudinally.


35 s is the accumulated TAI - UTC offset so far: the 10 s initial offset when UTC got leap seconds in 1972, plus the 25 leap seconds added since.


Follow-up question: what's the condition for a leap second to be added?


Changes in the Earth's rotation speed. Have a look at Wikipedia's "Leap second" article; it's explained pretty well, with some historical data.


It sounds like the leap seconds all have the same sign. If corrections are always being applied in the same direction, that implies a second is not as close to 1/86400 of a solar day as it could be.

Maybe we should redefine the second. It is currently defined as "9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom."

9,192,631,770 * 35 s / 40 y ~= 255. If we adjusted the definition by 255 periods, TAI and UTC would stay in sync much better.
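For reference, the arithmetic behind that estimate (a back-of-the-envelope check; 40 Julian years is my rounding of the period over which the 35 s accumulated):

    CESIUM_PERIODS_PER_SECOND = 9192631770
    ACCUMULATED_OFFSET = 35                      # seconds of TAI - UTC drift
    SECONDS_PER_40_YEARS = 40 * 365.25 * 86400   # ~1.26e9 s

    adjustment = CESIUM_PERIODS_PER_SECOND * ACCUMULATED_OFFSET / SECONDS_PER_40_YEARS
    print(round(adjustment))                     # ~255 periods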


The basic problem is that the solar day keeps getting longer. The definition of the second was based on a length-of-day averaged over the 19th century, so now it's too short. We could keep redefining the second every so often, but that sounds even worse than the system we have now. :)


Scientists would hate you if you redefined the second by such a large amount. And since we've delegated the definition of a second to scientists (notably physicists), it's not going to happen.


While all leap seconds so far have been positive, that is not a requirement. Negative leap seconds are a possibility - one accounted for in any valid implementation.

More significant for this particular suggestion, leap seconds do not occur/become necessary on any nice linear basis. They are tragically bound to the rotation of the Earth, which does not care about what would be nice for us and changes in a non-linear fashion.


Do not upset the authorities responsible for the distribution of time.


Oh, this lot don't just take care of time.

Seriously, check the heading on that memo. You know how people say "money makes the world go round", or sometimes "love makes the world go round"? False. Why do you think they're called the International Earth Rotation Service?


Given the immense cost of leap-seconds, why not just wait for 60 of them to accumulate and add a leap-minute every 60ish years or so?

Being almost a minute behind would have far less impact than all of these unpredictable, tiny changes to UTC.


Can anyone explain why leap seconds exist vs. a gradual skewing of atomic time to match the earth's rotation. Like say, "from date X to Y, civil time will be atomic time slowed by Z parts-per-billion". It just seems like such a better solution than this discontinuity that occurs once every few years and _surely_ everyone would have prepared and tested for. Civil time just doesn't require the accuracy where it actually matters, and if it does matter to you, use a continuous timescale like TAI or compute the current skew offset using a simple formula. Really, I love leap seconds as geeky trivia, but they seem so impractical.
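To put a number on "Z parts-per-billion" (my arithmetic, assuming for round numbers that a full leap second has to be absorbed every year, which is somewhat faster than the recent rate):

    # Rate adjustment needed to absorb one leap second per year.
    SECONDS_PER_YEAR = 365.25 * 86400             # ~3.16e7 s

    rate_ppb = 1.0 / SECONDS_PER_YEAR * 1e9       # ~31.7 ppb
    drift_per_day_ms = 86400 / SECONDS_PER_YEAR * 1000   # ~2.7 ms per day

    print(round(rate_ppb, 1), round(drift_per_day_ms, 2))

That is comparable to the drift of an ordinary unsynchronized PC clock, which is part of the argument for smearing.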


Whose algorithm do we use? How much time has deviated is arguable, and ties in with things like rotation, earthquakes, inertia, tidal forces, etc. Would your proposal just be yet another buggy, heavily patched mechanism that constantly needs updating across the world's IT infrastructure? Incorporating a simple leap second by fiat seems to be a simple and sane solution.

Then what of high-accuracy computing? If my clock randomly adds or subtracts 2 ms a day, that could fuck up my calculations. Better to have things work in a classical sense and to just tack on or remove a second once in a while and be prepared for it in advance. 25 leap seconds have been added since 1972, according to Wikipedia. Why make this harder than it needs to be?

Atomic clocks are right and human calendaring is arbitrary/wrong. Trying to merge the two automatically can only cause more problems, as our arbitrary political and cultural nature could spill over into our scientific nature. See the attempt to legislate the value of pi in Indiana for an example. The current method works the other way: we push science into our calendaring, and we keep the two nicely separate.


You would continue to use the IERS as the authority on what the true solar time is. They would continue to declare when leap seconds are, but instead of inserting them as a discontinuity, you would simply amortize them slowly over a long period.

Like I said, if you demand high accuracy, use a timescale like TAI where seconds are really SI seconds. But very few people need that kind of accuracy. My computer clock probably already drifts more than 2ms a day. Anyone tracking time for civil purposes is just syncing from a more authoritative source anyway. My computer talks to upstream ntp servers. They could/should hide the fact that leap seconds exist from me entirely. Some of Google's ntp servers already do this.


Calculating the time delta between two events becomes very difficult when your standard unit (seconds) is shifting in size.

>that occurs once every few years and _surely_ everyone would have prepared and tested for.

yeah... like y2k, ipv6 and the US conversion to the metric system!


But don't we already have this problem? If, for example, I wanted to calculate exactly how many seconds old I am, I would probably do it the naive way, adding up the years, months, days, hours, etc. But to get the truly exact answer, I'd also have to look up how many leap seconds occurred during my life, making the calculation roughly just as complex. Or I could not bother and just report the result in "civil" seconds, which are slightly and variably longer than "atomic" seconds. If I cared enough about the difference, I probably would be using a true atomic timescale (without leap seconds) anyway.
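Counting the leap seconds in an interval is indeed just another table lookup; a sketch (with a truncated, hypothetical table):

    import datetime

    # Dates on which a leap second took effect (truncated; a real list
    # would go back to 1972 and be updated with each IERS announcement).
    LEAP_DATES = [datetime.date(2009, 1, 1), datetime.date(2012, 7, 1)]

    def leap_seconds_between(start, end):
        """Number of leap seconds inserted between two dates (approximate to the day)."""
        return sum(1 for d in LEAP_DATES if start < d <= end)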


If you were born before 1972, it's impossible because that's when UTC became reality. And really, do you think you know the time of your birth to within, say, 35 seconds (which is the number of leap seconds you have to add)? Do you think the clock at the hospital was accurate to within 35 seconds?


That was kind of my point -- that second-perfect accuracy over such long time spans is rarely needed or even possible.


The second is the physical unit of time; if you tried to skew seconds to be longer or shorter you would break physics (physical constants would change in value, it would be impossible to compare current experiments with previous experiments...).

Would it be smart for civil time to be based on another unit, decoupled from the physical second? Yes, but no-one thought of that earlier, in the same way that no-one bothers to use LVM on the first server they set up. This sort of thing wasn't thought of when governments were first setting up time laws back in 18xx, and many countries' laws included reasonable-sounding provisions like "The average difference between 12:00pm and solar noon shall be zero".

There's a proposal to abolish leap seconds, but it's been pushed back to 2015 to give national governments more time to amend such laws. I suspect and hope that we'll get rid of them sooner or later, but changing one nation's laws is a slow process; changing 190-odd much more so.


That's why I made a distinction between civil time and atomic time. If you're doing stuff with enough precision that differences of parts-per-billion matter, then use SI seconds, which aren't changing. But civil time is for stuff like "what is the current time right now?", a question whose answer hardly ever needs more than 3-4 digits of precision.

Getting rid of leap seconds entirely just seems lazy, kicking the can 600 years down the road with some nebulous "leap hour" concept. Especially so when there is already a timescale without leap seconds: TAI. I'm not surprised this proposal came from the US... "let the next generation deal with it" seems to be our solution to most problems.


What you are describing already exists: it is called UT1. However, most hardware does not support subtle changes in clock speed.


Leap seconds should be replaced with a leap hour or day or something. Then we could just ignore this forever and hopefully people eons in the future will realise having time sync'd with the sun is ... rather useless.


Au contraire, it is why civil time exists: to be able to talk about when something happens during the day, and everyone will be confused if 9 AM is not somewhere near the beginning of the work day. Leap hours or days are a cop-out, it's just shifting the problem and making it worse.


Clearly the proper solution is some space-elevator type device with a rocket on the end to speed up/slow down the rotation as needed and keep our pesky planet turning on schedule!

Would love to see this as a candidate for XKCD 'What If'.


It's already been done. https://what-if.xkcd.com/26/


Thank you for the service announcement. I will continue to go about my day expecting the clock not to be off by a second.

Edit: No need to get so worked up over a joke, people.


These things matter a lot for time libraries.


It's pretty "crazy" that all software that handles UTC needs to have an access to the internet (or software updates) every 6 months. You simply can't leave, say, a sensor somewhere for a year and expect it to record events in UTC.

Sure, you can record events in something else than UTC and convert it to UTC at a later time.


Unless you have some fancy hardware around running your clock, you've got bigger issues; as a general rule you can't run a computer and expect the time to stay within +/- 1 second of UTC.


NTP exists so we don't have to own this hardware. You can easily expect to be within milliseconds of UTC if this is working properly and you're syncing with stratum 1 servers.

And better yet, the hardware to do this yourself isn't really "fancy" anymore: http://www.newegg.com/Product/Product.aspx?Item=9SIA19G0A497...


Uh, NTP needs access to the internet (or a specialized hardware clock).

GPS also relies on a network.

Running a calibrated atomic clock yourself is definitely in 'fancy' territory.


The statement was that you need fancy hardware to get < 1 sec accuracy to UTC, which is patently false; network availability or not, you can easily (without an atomic clock) get that level of accuracy.


You need fancy hardware to get that level of accuracy in any way that doesn't also automatically give you leap second announcements, which was the problem we were talking about. NTP announces them, GPS announces them, the various longwave time stations announce them...


Eh. That hardware costs $300 and requires a network connection too. If that's reasonable to do, is network access to distribute leap-second information really a problem?

(There may be cases where the answer is yes, but... we digress.)


Yep. That's why unixtime is actually a pretty logical measure.


Yes, they do. However, adding the second is not the default, so an announcement that it's not going to be added is not as interesting. Also, I doubt "time library" people get their news from HN on a day it could happen.


On the other hand, the HN public is one that should be acutely aware of leap second issues, because it's urgently necessary that all programmers should have some basic knowledge of this issue. So it's a good idea to draw attention to this a few times per year.


But probably not at such short notice.


Doesn't seem short at all: "Paris, 16 January 2014 [...] NO leap second will be introduced at the end of June 2014."


The notice was posted 6 months ago.


It is standard practice to post such notices six months in advance. That's what the IERS normally does at least.



