Leap seconds take the opposite approach: they spread the corrections out as much as possible, adding single seconds whenever needed. If they went for minutes instead of seconds, a leap minute might be added once per century (or less); entire generations of computer scientists would never have to deal with it, and Google could come and go (for good) without ever having to shift their clocks.
Humans don't really care if you add a minute at midnight on New Year's Eve; in fact, it could even be fun to count down twice for the new year. For computers, though, it would be a massive saving of engineering effort.
I'd rather have no leap seconds until the accumulated error reaches 30 minutes and then modify time zone offsets (which is already done twice a year anyway).
TL;DR for those who want to read it but don't really have the time: in the future, some civilization has only a few objects left from our present time, one of which is a digital clock. At some point they notice it jumping forward in spring and backward in autumn (DST, presumably, but they don't know that). They conclude that the previous civilization (us) modified time, because the clocks are out of sync with the sundials and the stars. So we must have created an hour of our own: nine thousand hours a year from the gods and one hour from men. The gods noticed what we did, killed us all, and that's why we're no longer around.
I understand that if we want to know the time in 'human language' (in terms of the earth's rotation relative to the sun, and the specific timezone that we're in) we then have to apply these corrections. Surely that's simpler than every timekeeping device on the planet having to adjust by a second? Electronic devices are much more likely to be sensitive to a second's difference than us humans are.
In TAI, each second is the same length, and each day lasts the same number of seconds; UT1 is (conceptually, approximately) the solar time at Greenwich, so the length of its seconds depends on the rotation of the earth; UTC has seconds of fixed length, like TAI, but occasionally has more or (potentially) fewer seconds in a day, to keep in sync with UT1.
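The TAI/UTC relationship above can be sketched in a few lines: UTC ticks SI seconds like TAI, but is held near UT1 by inserting leap seconds, so TAI − UTC is a step function of the date. The table here is illustrative and deliberately truncated; a real implementation needs the full, regularly updated IERS list.

```python
# Illustrative, truncated leap-second table: (UTC date from which the
# offset applies, TAI - UTC in seconds). NOT a complete list.
LEAP_SECONDS = [
    ("1972-01-01", 10),
    ("2009-01-01", 34),
    ("2012-07-01", 35),
]

def tai_minus_utc(utc_date: str) -> int:
    """Return TAI - UTC (seconds) for a given ISO UTC date string."""
    offset = 0
    for start, secs in LEAP_SECONDS:
        if utc_date >= start:  # ISO dates compare correctly as strings
            offset = secs
    return offset

print(tai_minus_utc("2012-08-01"))  # 35
```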
If you used TAI, eventually the sun would not be overhead at noon -- and after many millennia, the sun would be up during the "night" and down during the "day" (after around 70 000 years if leap seconds continue to be added at the same rate, but the whole point is that they aren't -- at the moment they seem to be accelerating).
I think it makes the most sense for computers to use TAI internally, and have UTC as a time zone on top of that. Whenever there's a leap second, just push out a TZ update. Most applications don't have special code for time zone updates, which means a lower likelihood of bugs.
Using TAI internally would have another advantage: Computers with GPS receivers could just add 19 seconds to GPS time to get a very accurate TAI. (GPS time is defined to be 19 seconds behind TAI. That was UTC in 1980.)
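That conversion really is this trivial, which is the appeal: GPS time has no leap seconds and is pinned a constant 19 seconds behind TAI (the value of TAI − UTC at the GPS epoch in 1980).

```python
# GPS time -> TAI is a fixed offset, by definition: GPS coincided with
# UTC at the GPS epoch (January 1980), when TAI - UTC was 19 seconds,
# and neither GPS time nor TAI has leap seconds.
GPS_TO_TAI_OFFSET = 19  # seconds, constant

def tai_from_gps(gps_seconds: float) -> float:
    """Convert a GPS timestamp (in seconds) to TAI."""
    return gps_seconds + GPS_TO_TAI_OFFSET

print(tai_from_gps(100.0))  # 119.0
```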
Leap seconds are stupid, because they create discontinuities. Here's a nice picture: http://hpiers.obspm.fr/eop-pc/index.php?index=leapsecond&lan...
At home I hacked an OpenNTP-based time server to hand out UT1 instead of UTC. Right now it's based on a simple software patch; I still have to write a program to fetch and parse the Bulletin B reports (ftp://hpiers.obspm.fr/iers/bul/bulb_new/bulletinb.dat). I don't know what they're smoking over there, but they give the UT1 deviation relative to UTC instead of TAI. So first you have to get from TAI to UTC (respecting the leap seconds, gah), just so that you can go to UT1. Why not simply publish the value of UT1 - TAI instead?
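The complaint, in code: because Bulletin B publishes UT1 − UTC (DUT1), going from TAI to UT1 takes two steps. The leap-second count and DUT1 values below are made-up illustrative numbers, not live IERS data.

```python
# Two-step TAI -> UT1 conversion. Both constants must be kept current
# from IERS publications; the values here are only examples.
TAI_MINUS_UTC = 35    # leap-second offset valid for the chosen date
DUT1 = -0.4           # UT1 - UTC in seconds, from Bulletin B

def ut1_from_tai(tai_seconds: float) -> float:
    utc = tai_seconds - TAI_MINUS_UTC  # step 1: TAI -> UTC (leap seconds)
    return utc + DUT1                  # step 2: UTC -> UT1 (apply DUT1)
```

If the IERS published UT1 − TAI directly, the first step (and the leap-second table behind it) would disappear.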
Anyway, every time a report comes out, the clock rescaler in the UT1 time server gets adjusted appropriately (manually).
In the long run I plan to turn this into a small embedded project that uses GPS time to discipline a high-precision temperature-compensated crystal oscillator^1 (TCXO); maybe I'll add a rubidium clock too, just because. This would then act as a stratum 1 time server on my local network.
^1: From my work I know that those have far less short-term jitter than "naked" rubidium clocks; most precision frequency standards that come in a nice box with buttons and a display are actually TCXOs that are frequency-stabilized by a rubidium clock.
Apparently it's only 3000 years:
"Leap seconds are not a viable long-term solution because the earth's rotation is not constant: tides and internal friction cause the planet to lose momentum and slow down the rotation, leading to a quadratic difference between earth rotation and atomic time. In the next century we will need a leap second every year, often twice every year; and 2,500 years from now we will need a leap second every month.
On the other hand, if we stop plugging leap seconds into our time scale, noon on the clock will be midnight in the sky some 3,000 years from now."
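A back-of-envelope check of that 3,000-year figure. Assume the day lengthens by roughly 2 ms per century (a commonly quoted long-term average; the true rate varies, which is the whole problem). The clock-vs-sun offset then grows quadratically, since a linearly growing daily excess accumulates.

```python
# Quadratic drift model: if each day is longer than 86,400 SI seconds
# by an amount that grows linearly with time, the accumulated offset
# after t centuries is 0.5 * rate * days_per_century * t**2.
MS_PER_DAY_PER_CENTURY = 2.0   # assumed slowdown rate (illustrative)
DAYS_PER_CENTURY = 36525

def accumulated_offset_seconds(centuries: float) -> float:
    """Accumulated clock-vs-sun offset after `centuries`, in seconds."""
    rate = MS_PER_DAY_PER_CENTURY / 1000.0  # seconds/day per century
    return 0.5 * rate * DAYS_PER_CENTURY * centuries**2

print(round(accumulated_offset_seconds(30) / 3600, 1))  # 9.1 (hours)
```

With this assumed rate the drift after 3,000 years comes out around nine hours: the same order of magnitude as the quoted noon-to-midnight shift, with the exact figure depending on the rate you plug in.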
That's the historical reason why we have a time definition locked to the rotation of the Earth. Importantly, anyone navigating on that principle gets skewed answers if the clock time of noon at a particular place changes.
It's not sourced; it's what I remember from navigational history. Whether solar noon falls at 11:45 or 12:15 makes little difference to your personal life, but observing solar noon at 11:45 vs. 12:15 is a difference of 1/48th of the way around the globe longitudinally.
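The navigation arithmetic behind that claim: the Earth turns 360 degrees in 24 hours, so every 4 minutes of solar-noon offset corresponds to 1 degree of longitude. A 30-minute offset (11:45 vs. 12:15) is 7.5 degrees, i.e. 1/48 of a full circle.

```python
# Longitude from the offset between local solar noon and clock noon:
# 360 degrees per 1440 minutes = 0.25 degrees per minute.
def longitude_degrees(solar_noon_offset_minutes: float) -> float:
    return solar_noon_offset_minutes * 360.0 / (24 * 60)

print(longitude_degrees(30))  # 7.5
```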
Maybe we should redefine the second. It is currently defined as "9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom."
9,192,631,770 * 35 s / 40 y ~= 255. If we adjusted the definition by about 255 periods, TAI and UTC would stay in sync much better.
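Checking that arithmetic: roughly 35 seconds of TAI − UTC drift accumulated over roughly 40 years does work out to about 255 caesium periods per second.

```python
# Verify the ~255-period figure from the comment above.
PERIODS_PER_SECOND = 9_192_631_770   # current SI definition of the second
accumulated_drift = 35               # seconds of TAI - UTC since 1972
interval = 40 * 365.25 * 86400       # ~40 years, in seconds

adjustment = PERIODS_PER_SECOND * accumulated_drift / interval
print(round(adjustment))  # 255
```

Of course, as the replies point out, this only works if the drift were linear, which it isn't.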
More significant for this particular suggestion, leap seconds do not occur/become necessary on any nice linear basis. They are tragically bound to the rotation of the Earth, which does not care about what would be nice for us and changes in a non-linear fashion.
Seriously, check the heading on that memo. You know how people say "money makes the world go round", or sometimes "love makes the world go round"? False. Why do you think they're called the International Earth Rotation Service?
Being almost a minute behind would have far less impact than all of these unpredictable, tiny changes to UTC.
Then what of high-accuracy computing? If my clock randomly adds or subtracts 2ms a day, that could fuck up my calculations. Better to have things work in a classical sense and just tack on or remove a second once in a while, prepared for in advance. 25 leap seconds have been added since 1972, according to Wikipedia. Why make this harder than it needs to be?
Atomic clocks are right and human calendaring is arbitrary/wrong. Trying to merge the two automatically can only cause more problems, as our arbitrary political and cultural nature could spill over into our scientific nature. See the attempt to legislate the value of pi in Indiana for an example. The current method works the other way: we push science into our calendaring, and we keep them nice and separate.
Like I said, if you demand high accuracy, use a timescale like TAI where seconds are really SI seconds. But very few people need that kind of accuracy. My computer clock probably already drifts more than 2ms a day. Anyone tracking time for civil purposes is just syncing from a more authoritative source anyway. My computer talks to upstream ntp servers. They could/should hide the fact that leap seconds exist from me entirely. Some of Google's ntp servers already do this.
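For the curious, the trick those servers use is a "leap smear": instead of serving a discontinuous 23:59:60, the extra second is spread over a window around the leap so clients never see a jump. The window length and shape below (24 hours, linear) are illustrative implementation choices, not a description of any particular server.

```python
# Minimal sketch of a linear leap smear: how much of the one inserted
# leap second has been applied, as a function of time into the window.
SMEAR_WINDOW = 24 * 3600.0  # seconds over which to smear one leap second

def smeared_offset(seconds_into_window: float) -> float:
    """Fraction of the leap second applied so far, from 0.0 to 1.0 s."""
    t = min(max(seconds_into_window, 0.0), SMEAR_WINDOW)
    return t / SMEAR_WINDOW

print(smeared_offset(12 * 3600.0))  # 0.5
```

By the end of the window the served time has absorbed the full second, and clocks that only ever sync against the smearing server never observe a discontinuity.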
>that occurs once every few years and _surely_ everyone would have prepared and tested for.
yeah... like y2k, ipv6 and the US conversion to the metric system!
Would it be smart for civil time to be based on another unit, decoupled from the physical second? Yes, but no-one thought of that earlier, in the same way that no-one bothers to use LVM on the first server they set up. This sort of thing wasn't thought of when governments were first setting up time laws back in 18xx, and many countries' laws included reasonable-sounding provisions like "The average difference between 12:00pm and solar noon shall be zero".
There's a proposal to abolish leap seconds, but it's been pushed back to 2015 to give national governments more time to amend such laws. I suspect and hope that we'll get rid of them sooner or later, but changing one nation's laws is a slow process; changing 190-odd much more so.
Getting rid of leap seconds entirely just seems lazy, kicking the can 600 years down the road with some nebulous "leap hour" concept. Especially so when there is already a timescale without leap seconds: TAI. I'm not surprised this proposal came from the US... "let the next generation deal with it" seems to be our solution to most problems.
Would love to see this as a candidate for XKCD 'What If'.
Edit: No need to get so worked up over a joke, people.
Sure, you can record events in something other than UTC and convert it to UTC at a later time.
And better yet, the hardware to do this yourself isn't really "fancy" anymore:
GPS is also relying on a network.
Running a calibrated atomic clock yourself is definitely in 'fancy' territory.
(There may be cases where the answer is yes, but... we digress.)