Time is tricky but there is a simple fact: there exists such a thing as the number of actual seconds elapsed since 'time x' (typically the Unix epoch, but anything else will do too). No leap-second issues. No 25-hour-day issues. No 23-hour-day issues. No 59-second-minute issues. No 61-second-minute issues.

And all it takes to store a time like that is a 64-bit integer, which is very convenient. A lot of software does precisely that. Most timestamps are just that: they don't care about "real world" details like leap seconds, {23,24,25}-hour days, etc.

Because, in many cases, you really don't give a flying shit about the "real-world" time.

What is your point? That a server that needs to open trading precisely at 9am has to take leap seconds into account? Sure. The article ain't disputing that.

But a lot of applications basically only need timestamps, and everything becomes much simpler when one starts to reason in milliseconds since the epoch instead of "real world" time.
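To make that concrete, here's a minimal sketch (my illustration, not anything from the article) of what "reasoning in milliseconds since the epoch" looks like: integers all the way through, with "real world" time only at the display boundary.

    # Store and compare times as int64 milliseconds since the Unix epoch;
    # convert to a human-readable form only at the edge of the system.
    import time
    from datetime import datetime, timezone

    def now_ms() -> int:
        return int(time.time() * 1000)

    def elapsed_ms(start_ms: int, end_ms: int) -> int:
        # Durations are plain integer subtraction; no calendars involved.
        return end_ms - start_ms

    def display(ts_ms: int) -> str:
        # The only place "real world" time enters the picture.
        return datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc).isoformat()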

Btw... Many of your "real issues" are due to programmers not realizing that they could have built the exact same system, but far more resilient, had they known to use timestamps instead of "real world" time.

There was an amazing paper by Google on the subject, by the way, where they explained how they were reconciling the time across their hundreds of thousands of servers by allowing it to be "not correct" over x minutes / hours, dodging many issues.

And they clearly did emphasize that: a) these problems shouldn't have occurred in the first place and were due to poorly designed programs (poorly designed in that they relied on real-world time instead of simply using milliseconds since the epoch internally), and b) most programs on earth do not give a flying ^^^^ if a time displayed to the user is one third of a second off...




> Time is tricky but there is a simple fact: there exists such a thing as the number of actual seconds elapsed since 'time x' (typically the Unix epoch, but anything else will do too).

Yeah. Hmm. Clearly an opinion expressed without reading, comprehension, or experience.

Why would I say that? Well, consider your statement about "actual seconds elapsed since epoch". Leaving aside the fact that Unix Time enumerates no such thing (and it was Unix Time I was discussing), there are such things as reference frames and relativistic effects which come into play when doing fine scientific measurement, astronomical work, and ground-satellite communications.

There are levels to dealing with time, and the linked article makes a decent first-order stab at climbing the ladder. Even ground-based programmers need to be wary of the leap-second issue and the plethora of time standards with minute but significant (in some context or another) differences.

When you've cooled down a little, consider the title "What every programmer should know ...". Had it been "What most programmers can skate by on", I probably wouldn't have bothered to comment.

I agree there are many good papers on the subject, going back decades. I'm betting the Google paper you speak of references a classic one on timestamps and relativity, as sometimes local event order trumps universal order (in fact the paper pivots on the observation that universal order doesn't really exist, just effective order).


I can't interpret this as anything other than nitpicky. Are you suggesting that every programmer should know about relativistic effects of time dilation in controlled scientific experiments?

I don't doubt that you have valuable experience and information to lend to the discussion, but your frame seems aggressively negative towards the OP with no discernible reason.


I posted shortly before midnight my time so excuse the delay in replying.

The OP article suggests that UnixTime is the answer for every programmer. My take-home message to every programmer who uses UnixTime is to be aware that it's non-linear and has hiccups that will bite them every few years if not taken into account. This is not so much aggressively negative as a simple statement of fact garnered from years of experience.
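To make the hiccup concrete, here are illustrative values for the June 2012 leap second (exact behavior varies by implementation):

    # POSIX time around the 2012-06-30 leap second, under the common
    # convention where the value 1341100800 occurs twice:
    #   UTC 2012-06-30 23:59:59  ->  1341100799
    #   UTC 2012-06-30 23:59:60  ->  1341100800  (the leap second)
    #   UTC 2012-07-01 00:00:00  ->  1341100800  (repeated!)
    # So differences between timestamps taken across the leap can be off
    # by a second, and the clock can appear to run backwards.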

My message to every programmer that works in a distributed system is that they should read "Time, Clocks, and the Ordering of Events in a Distributed System" (Lamport 1978) which uses observations and arguments from relativity to comment on the manner in which events propagate outwards from sources.
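For anyone who hasn't read it, the paper's central rule fits in a few lines; a toy sketch (my own, not the paper's notation):

    # Toy Lamport clock: increment on local events; on receive, jump
    # past the sender's stamp so causal order is preserved.
    class LamportClock:
        def __init__(self):
            self.counter = 0

        def tick(self) -> int:
            # Local event: advance the logical clock.
            self.counter += 1
            return self.counter

        def send(self) -> int:
            # Stamp an outgoing message with the current logical time.
            return self.tick()

        def receive(self, msg_stamp: int) -> int:
            # Merge rule: take the max of local and received, then tick.
            self.counter = max(self.counter, msg_stamp)
            return self.tick()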

As for limiting awareness to "controlled scientific experiments", no, I'm not advocating that at all, as time slip (something that has many causes outside of dilation) pops up all over the place these days. For example, many things rely on GPS time, which is something else that is non-linear and periodically updated. I'd suggest that anyone writing software that relies on second / sub-second granularity should be aware of where their fiducial time marks come from and what hiccups there are in that system.
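For instance, converting GPS time to UTC needs the leap-second offset broadcast by the satellites; a sketch, where the constant is an illustrative value that goes stale whenever a new leap second is added:

    # GPS time contains no leap seconds, so it runs ahead of UTC by the
    # leap seconds accumulated since the GPS epoch (1980-01-06).
    GPS_UTC_OFFSET_S = 16  # illustrative; read the live value from the
                           # broadcast almanac, never hard-code it

    def gps_to_utc_seconds(gps_seconds: float) -> float:
        return gps_seconds - GPS_UTC_OFFSET_S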


> Are you suggesting that every programmer should know about relativistic effects of time dilation in controlled scientific experiments?

No, he's declaring that not every programmer can afford to ignore these.


Hi,

I find your opinion and experience on this matter very interesting. Could you describe your experience, or projects you've worked on, that could shed some light on the complicated issues time causes?


> And all it takes to store a time like that is a 64-bit integer, which is very convenient. A lot of software does precisely that. Most timestamps are just that: they don't care about "real world" details like leap seconds, {23,24,25}-hour days, etc.

Your claim that using Unix Time saves you from worrying about leap seconds is incorrect. Unix Time goes backwards when a leap second occurs, which can screw up a lot of software. Check out Google's solution to the problem, which is to "smear" the leap second over a period of time before it actually occurs: http://googleblog.blogspot.in/2011/09/time-technology-and-le...
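The idea, in a deliberately simplified linear sketch (Google's actual smear modulated NTP server responses; the 24-hour window here is my assumption):

    # Instead of repeating a second at the leap, slow the clock so the
    # extra second is absorbed gradually and time never runs backwards.
    def smear_offset(t: float, leap_at: float, window: float = 86400.0) -> float:
        """Seconds to subtract from a leap-free clock at time t."""
        if t <= leap_at - window:
            return 0.0
        if t >= leap_at:
            return 1.0
        return (t - (leap_at - window)) / window  # ramps smoothly 0 -> 1

    def smeared_time(t: float, leap_at: float) -> float:
        return t - smear_offset(t, leap_at)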

Practically no software uses true seconds since the epoch; if it did then simple operations like turning an epoch time into a calendar date would require consulting a table of leap seconds, and would give up the invariant that every day is exactly 86,400 seconds. Whether this was the right decision or not is debatable, but it is a mistake to think that using Unix Time saves you from all weirdness surrounding civil time.
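You can see that invariant in the arithmetic itself; a quick illustration:

    # POSIX pretends every day is exactly 86,400 seconds, which is why
    # epoch-to-date conversion is plain integer arithmetic with no
    # leap-second table anywhere in sight.
    from datetime import date, timedelta

    def epoch_to_date(epoch_seconds: int) -> date:
        days, _secs_into_day = divmod(epoch_seconds, 86400)
        return date(1970, 1, 1) + timedelta(days=days)

    print(epoch_to_date(1_000_000_000))  # 2001-09-09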


Things do indeed become much simpler when you reason in milliseconds since the epoch. So why is it so rare to do so? UNIX time is not milliseconds since the epoch. It's milliseconds since the epoch, plus the leap seconds that have accumulated over that period. The only widely used timebase that's pure milliseconds since the epoch that I'm aware of is GPS time, and basically everything adds in the leap seconds before actually using that number.


I think you mean that unix time is seconds since the epoch minus leap seconds.


All depends on what sign you assign to the leap seconds!


Ah, but negative numbers are not included in the set of counting numbers, and your initial comment referred to "the leap seconds" with no reference to "the leap second offset" or other verbiage that might imply anything other than a simple count of leap seconds.

Alas, I'm afraid you've no other option but to admit a minor error, as traumatic as that may be.


I'll never admit to an error, although perhaps a negative value for correctness.


> Time is tricky but there is a simple fact: there exists such a thing as the number of actual seconds elapsed since 'time x'

Yes and no: it might just exist theoretically, but we have no way to get at it. The closest we have is TAI, which is only an approximation of the time elapsed at mean sea level on the earthly geoid, because clocks fall victim to gravitational time dilation and compression.

To accurately measure time you'd need a clock sitting perfectly still in space, and all other clocks in the universe would slowly drift behind it.


Not even that; relativity doesn't allow a single special frame of reference. There's no such thing as a universal "time elapsed since x". Putting the clock in space would be as arbitrary a choice as putting it on my roof (albeit definitely more practical).


I think whoever solves this problem (at least for humanity) would get a Nobel.

Once humans wander through space (at some point, very probably), there will be no absolute time reference, only time intervals (e.g., a day on a spaceship is 24h, etc.). So in this case, you would measure 86,400 seconds and call it a new day. No more leap seconds, etc.

Now the UNIX approach makes sense: count seconds since a certain event in time and measure from there on, internally. Want to display it? Then use a special computation to render it in the right format (read the timezone, add relativistic skew, etc.).
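In sketch form (my illustration of that split; the relativistic term is a placeholder, since nothing off the shelf supplies it):

    # Count seconds internally; apply timezone (and, hypothetically,
    # relativistic corrections) only when rendering for display.
    from datetime import datetime, timedelta, timezone

    def render(internal_seconds: float, utc_offset_hours: int = 0,
               relativistic_skew_s: float = 0.0) -> str:
        tz = timezone(timedelta(hours=utc_offset_hours))
        return datetime.fromtimestamp(internal_seconds + relativistic_skew_s,
                                      tz=tz).isoformat()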


You're forgetting to factor in accuracy/resolution. TAI is completely perfect down to the picosecond level and further. That means you do have the number of actual seconds, milliseconds, microseconds, nanoseconds...

You only encounter issues with TAI once you get down to femtosecond or smaller levels.


> TAI is completely perfect down to the picosecond level and further.

I may have worded it badly, so let's try again: TAI is "completely perfect" for the approximation it is: time elapsed at the geoid, which is a theoretical construct. That's an approximation both for "experienced time" and for anything which could be called "absolute time".


Oh that's easy to solve. Just mandate everyone live at sea level.


And stop the Earth from spinning, so we can get rid of that pesky oblateness and gravitational delta between the equator and poles.


No need to get all riled up about it. It's just a heads-up: don't follow the advice in this article if you're working on something that needs to deal with time at the seconds level.

Obviously you don't need to go through all that trouble if you simply want to display the date you published a post on your blog.


> Because, in many cases, you really don't give a flying shit about the "real-world" time.

But there are loads of cases where your computer/programme needs to talk in "real-world" time. So you can't avoid the problem.



