> And all it takes to store a time like that is a 64-bit integer and it is very convenient. And a lot of software does precisely that. Most timestamps are just that: they don't care about "real world" details like leap seconds, {23,24,25}-hour days, etc.
Your claim that using Unix Time saves you from worrying about leap seconds is incorrect. Unix Time goes backwards when a leap second occurs, which can screw up a lot of software. Check out Google's solution to the problem, which is to "smear" the leap second over a period of time before it actually occurs: http://googleblog.blogspot.in/2011/09/time-technology-and-le...
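To make the smearing idea concrete, here is a rough sketch of a linear smear. The boundary, the 20-hour window, and the smeared_time function are all made up for illustration; Google's actual implementation adjusted its NTP servers' responses, not application code.

    # Rough sketch of a linear "leap smear", not Google's actual implementation.
    # `raw` is assumed to be a count of true elapsed seconds, so the (hypothetical)
    # leap second at LEAP_AT adds one extra second relative to Unix Time.
    LEAP_AT = 1341100800        # hypothetical leap-second boundary, in epoch seconds
    WINDOW = 20 * 3600          # hypothetical smear window: the 20 hours before it

    def smeared_time(raw: float) -> float:
        """Clock reading that absorbs the leap second gradually instead of stepping back."""
        start = LEAP_AT - WINDOW
        if raw < start:
            return raw                        # before the window: untouched
        if raw >= LEAP_AT:
            return raw - 1.0                  # after the leap: the whole second has been absorbed
        progress = (raw - start) / WINDOW     # 0.0 .. 1.0 across the window
        return raw - progress                 # clock runs slightly slow, never jumps back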
Practically no software uses true seconds since the epoch; if it did, simple operations like turning an epoch time into a calendar date would require consulting a table of leap seconds, and you would lose the invariant that every day is exactly 86,400 seconds long. Whether this was the right decision is debatable, but it is a mistake to think that using Unix Time saves you from all the weirdness surrounding civil time.
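For example, because Unix Time pretends every day is exactly 86,400 seconds, epoch-to-calendar conversion is plain integer arithmetic with no leap-second table. A toy breakdown (split_epoch is just an illustrative name, not a standard library call):

    # Toy illustration: with the 86,400-seconds-per-day fiction, breaking an epoch
    # timestamp into days + time-of-day needs only divmod, no leap-second table.
    SECONDS_PER_DAY = 86_400

    def split_epoch(epoch_seconds: int) -> tuple[int, int, int, int]:
        """Return (days since 1970-01-01, hour, minute, second) for a Unix timestamp."""
        days, rem = divmod(epoch_seconds, SECONDS_PER_DAY)
        hour, rem = divmod(rem, 3600)
        minute, second = divmod(rem, 60)
        return days, hour, minute, second

    print(split_epoch(1_000_000_000))   # (11574, 1, 46, 40) -> 2001-09-09 01:46:40 UTC

    # With *true* seconds since the epoch (leap seconds included), the divmod above
    # would drift by one second per accumulated leap second, so recovering the
    # calendar date would require a table of when leap seconds occurred.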