Hoping the 1,500M era is nothing but good memories of getting my hands dirty again.
1 400 000 000 was 5/13/2014, 11:53:20 AM
1 600 000 000 will be 9/13/2020, 7:26:40 AM
1 000 000 000 was 9/8/2001, 8:46:40 PM
2 000 000 000 will be 5/17/2033, 10:33:20 PM
2^32-1 (4 294 967 295) will be 2/7/2106, 12:28:15 AM
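For anyone who wants to verify these for themselves, `date` converts in both directions — a sketch assuming GNU date (the `-d @SECONDS` syntax is GNU-specific; BSD date uses `-r SECONDS` instead):

```shell
# Epoch seconds -> human-readable date (GNU date)
date -u -d @1000000000                  # Sun Sep  9 01:46:40 UTC 2001
date -u -d @2000000000                  # Wed May 18 03:33:20 UTC 2033

# Human-readable date -> epoch seconds
date -u -d '2106-02-07 06:28:15' +%s    # 4294967295, i.e. 2^32 - 1
```

Note the times above are UTC; the local times quoted in the thread are a few hours behind.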
100 million seconds is a little over 3 years
1 billion seconds is almost 32 years.
Although perhaps you have assumed that everyone already knows it!
Jan 13 19:13:30 2018 UTC will be 0x5A5A5A5A
0x66666666 will be Jun 10 2:35:18 2024 UTC
We just missed a good one:
Sep 22 2016 16:00:00
in base 10: 1474560000
in base 16: 0x57E40000
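Hunting for these hex milestones is easy from the shell, since arithmetic expansion accepts `0x` literals — again a sketch assuming GNU date:

```shell
# Current epoch time in hex
printf '0x%X\n' "$(date +%s)"

# Decode a hex timestamp back to a date
date -u -d @"$((0x5A5A5A5A))"    # Jan 13 19:13:30 UTC 2018

# Confirm the one we just missed
date -u -d @"$((0x57E40000))"    # Thu Sep 22 16:00:00 UTC 2016
```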
Obviously. Programmers celebrating 1500000000 just make me sad.
I had assumed there would be an error in there about forgetting that years don't have an integral number of days, but even at exactly 36500 days per century, a nanocentury is still over 3.15 seconds.
(nanocentury − π)/π ≈ 0.45%
So a nanocentury equals π seconds to within half a percent, which is more than good enough as an approximation for casual use.
But when you're having fun with pi, you generally measure similarity in terms of being accurate to some number of decimal places, and this statistic is accurate to an unimpressive one place (3.1).
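The arithmetic is easy to reproduce — a sketch using 365.25 days per year, which is where the 0.45% figure comes from (the 36 500-day century gives a slightly smaller 0.38%):

```shell
# Nanocentury in seconds vs. pi, using 365.25-day years
awk 'BEGIN {
    pi = 3.141592653589793
    nc = 365.25 * 86400 * 100 / 1e9    # one billionth of a century, in seconds
    printf "nanocentury = %.5f s, error vs pi = %.2f%%\n", nc, (nc - pi) / pi * 100
}'
# -> nanocentury = 3.15576 s, error vs pi = 0.45%
```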
Now, 500 million seconds later, I'll be watching the counter on an Ubuntu laptop. Not quite the same, as I'm in Washington DC on business, and I abandoned /. a couple of years ago.
It amazes me how much changes, but also how little things change.
while true; do echo $((1500000000 - $(date +%s))); sleep 1; done
$ watch -n1 'echo $((1500000000 - $(date +%s)))'
xclock -utime -update 1
while true; do date +%s; sleep 1; done
I don't know what's going on with some of the encodings there, but I expect people were simply sending pre-UTF-8 encodings.