
The Unix timestamp will begin with 16 this Sunday - dezmou
https://www.unixtimestamp.com/
======
iso1210
I remember staying up late (UK) at the Billenium (2:46 AM Sunday Sep 9th
2001), thinking that was bound to be the most newsworthy event of the week,
watching the seconds tick past on a "while(sleep 1); do date" loop, with
slashdot in one window, IRC in another, all running on an enlightenment window
manager.

500M seconds later I was in Washington DC in a hotel, watching it tick up on
an rxvt on my laptop, with HN in a window.

Who knows where I'll be in 2033, hopefully not in Europe as I'll be too old
for night shifts, but wherever I am, I suspect it will have a bash prompt.

~~~
dripton
A former employer's telephony software had a 9-character integer-string field for
the Unix timestamp, so there was a bug when it rolled over to 1000000000 in
2001. Pretty rare, I think. Unix timestamps back then were usually 32-bit
ints, so good until 2038. And hopefully they'll be 64 bits everywhere that
matters well before 2038.
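The failure mode is easy to reproduce. A hypothetical sketch (not the actual telephony code) of what a fixed-width character field does at the rollover:

```python
# Hypothetical sketch (not the actual telephony code) of the failure mode:
# a fixed 9-character field silently truncates a 10-digit timestamp.
def store_timestamp(ts: int, width: int = 9) -> str:
    """Format a timestamp into a fixed-width character field."""
    return str(ts)[:width]  # anything past the field width is lost

# 999999999 (2001-09-09 01:46:39 UTC) still fits...
assert store_timestamp(999_999_999) == "999999999"
# ...but one second later the stored value reads back as 100000000,
# i.e. a date in 1973.
assert store_timestamp(1_000_000_000) == "100000000"
```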

~~~
wongarsu
I was involved in building a system using 32-bit ints as timestamps ten years
ago. They are still sold, and since it's industrial equipment running on 16
bit microcontrollers I have every reason to believe most of them will still be
around in 2038.

I don't think I was at the only company doing this. Few people seem to care
about issues that will happen after their retirement. Expect lots of
industrial stuff to work just a bit worse around 2038 (and 2036, PIC
microcontrollers fail a bit sooner)

~~~
hazeii
Why will PICs fail sooner? (And which ones: 8-, 16-, or 32-bit?)

~~~
wongarsu
I worked with the 16 bit PIC24f. I don't have the source handy to check, but
according to a forum entry "The provided gmtime() actually fails earlier than
2038. The year wraps around when the time_t input goes beyond 0x7C55817F or
Thu Feb 7 06:28:15 2036." [1]

1:
[https://www.microchip.com/forums/m522929.aspx](https://www.microchip.com/forums/m522929.aspx)
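The quoted wrap point is easy to sanity-check: 0x7C55817F is 2085978495, and it does decode to the date the forum post gives.

```python
from datetime import datetime, timezone

# Sanity-checking the wrap point quoted from the Microchip forum:
# 0x7C55817F should decode to Thu Feb 7 06:28:15 2036 UTC.
wrap = 0x7C55817F
print(hex(wrap), datetime.fromtimestamp(wrap, tz=timezone.utc))
# 0x7c55817f 2036-02-07 06:28:15+00:00
```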

~~~
hazeii
Thanks - so it seems it's an issue with their library, rather than something to
do with PICs per se.

------
susam
It is going to happen at 2020-09-13 12:26:40 UTC.

Python:

    
    
      $ python3 -q
      >>> from datetime import datetime
      >>> datetime.utcfromtimestamp(1_600_000_000)
      datetime.datetime(2020, 9, 13, 12, 26, 40)
    

GNU date (Linux):

    
    
      $ date -ud @1600000000
      Sun Sep 13 12:26:40 UTC 2020
    

BSD date (macOS, FreeBSD, OpenBSD, etc.):

    
    
      $ date -ur 1600000000
      Sun Sep 13 12:26:40 UTC 2020
    

All such dates (in UTC) until the end of the current century:

    
    
      $ python3 -q
      >>> from datetime import datetime
      >>> for t in range(0, 4_200_000_000, 100_000_000): print(f'{t:13_d} - {datetime.utcfromtimestamp(t).strftime("%Y-%m-%d %H:%M:%S")}')
      ...
                  0 - 1970-01-01 00:00:00
        100_000_000 - 1973-03-03 09:46:40
        200_000_000 - 1976-05-03 19:33:20
        300_000_000 - 1979-07-05 05:20:00
        400_000_000 - 1982-09-04 15:06:40
        500_000_000 - 1985-11-05 00:53:20
        600_000_000 - 1989-01-05 10:40:00
        700_000_000 - 1992-03-07 20:26:40
        800_000_000 - 1995-05-09 06:13:20
        900_000_000 - 1998-07-09 16:00:00
      1_000_000_000 - 2001-09-09 01:46:40
      1_100_000_000 - 2004-11-09 11:33:20
      1_200_000_000 - 2008-01-10 21:20:00
      1_300_000_000 - 2011-03-13 07:06:40
      1_400_000_000 - 2014-05-13 16:53:20
      1_500_000_000 - 2017-07-14 02:40:00
      1_600_000_000 - 2020-09-13 12:26:40
      1_700_000_000 - 2023-11-14 22:13:20
      1_800_000_000 - 2027-01-15 08:00:00
      1_900_000_000 - 2030-03-17 17:46:40
      2_000_000_000 - 2033-05-18 03:33:20
      2_100_000_000 - 2036-07-18 13:20:00
      2_200_000_000 - 2039-09-18 23:06:40
      2_300_000_000 - 2042-11-19 08:53:20
      2_400_000_000 - 2046-01-19 18:40:00
      2_500_000_000 - 2049-03-22 04:26:40
      2_600_000_000 - 2052-05-22 14:13:20
      2_700_000_000 - 2055-07-24 00:00:00
      2_800_000_000 - 2058-09-23 09:46:40
      2_900_000_000 - 2061-11-23 19:33:20
      3_000_000_000 - 2065-01-24 05:20:00
      3_100_000_000 - 2068-03-26 15:06:40
      3_200_000_000 - 2071-05-28 00:53:20
      3_300_000_000 - 2074-07-28 10:40:00
      3_400_000_000 - 2077-09-27 20:26:40
      3_500_000_000 - 2080-11-28 06:13:20
      3_600_000_000 - 2084-01-29 16:00:00
      3_700_000_000 - 2087-04-01 01:46:40
      3_800_000_000 - 2090-06-01 11:33:20
      3_900_000_000 - 2093-08-01 21:20:00
      4_000_000_000 - 2096-10-02 07:06:40
      4_100_000_000 - 2099-12-03 16:53:20

~~~
ericbarrett
The datetime package is one of the best things about Python, and dare I say
one of the best general purpose calendar modules ever written. It’s just so
_practical_.

~~~
wwright
It still can’t parse ISO8601 :(

~~~
larkeith
`datetime.fromisoformat()`, as of 3.7 :D

Though `dateutil` is still recommended for most cases.
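For illustration, here is the round trip that is guaranteed (a sketch; note the 3.7/3.8 parser rejected some valid ISO 8601 spellings such as a trailing "Z", which later Python versions accept):

```python
from datetime import datetime

# The documented contract: fromisoformat() accepts what isoformat() emits.
d = datetime.fromisoformat("2020-09-13T12:26:40+00:00")
assert d.timestamp() == 1_600_000_000  # the rollover moment

# Round-tripping isoformat() output always works:
assert datetime.fromisoformat(d.isoformat()) == d
```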

~~~
wwright
That parses a subset and is only guaranteed to be compatible with Python's
`.isoformat()` output. (I imagine it would be backwards compatible to expand it
to cover all of ISO 8601, and I can't tell why they haven't.)

~~~
londt8
ISO 8601 is quite a large standard; they probably don't want to ship a standard
library module big enough to parse 100% of it.

------
jezze
When we one day become a space faring civilization we will probably stop using
the gregorian calendar because why would you use that on say Mars?

But there really is no reason to get rid of the unix timestamp as a measure of
time and this measure may stay around for a long time. This may mean that
people in the future might consider 1970 as year zero where modern
civilization began.

~~~
BoppreH
There is a very good reason to eventually abandon it: leap seconds. Unix time
goes back one second whenever there is a leap second on Earth.

It's extremely weird and IMHO completely ruins the purpose of a timestamp, but
it's a compromise for backwards compatibility, since Unix time was created
before leap seconds. This hack ensures that the number of seconds in a day
remains fixed, an assumption of many systems at the time.

~~~
dylan604
Isn't a second just how long light takes to travel 299,792,458 m?

Also, "since Unix time was created before leap seconds." I found this
interesting for the simple reason I've never spent any time thinking about
Epoch vs Leap Second histories.

~~~
sltkr
Yes, but the point is that the Unix timestamp doesn't count the number of
seconds elapsed since January 1, 1970. It counts 86400 seconds per day,
regardless of how many seconds the day actually has (which can vary due to
leap seconds).

~~~
bitpow
Is this accurate though? Isn’t the number of seconds since 1970 absolute, and
it’s up to the library generating the Gregorian date to take into account leap
seconds? I suppose if these libraries are not taking into account leap
seconds, then the actual rollover will be a few seconds earlier (or later?)
than what we think.

~~~
fanf2
See the definition of seconds since the epoch at
[https://pubs.opengroup.org/onlinepubs/9699919799.2018edition...](https://pubs.opengroup.org/onlinepubs/9699919799.2018edition/basedefs/V1_chap04.html#tag_04_16)

    
    
      tm_sec + tm_min*60 + tm_hour*3600 + tm_yday*86400 +
          (tm_year-70)*31536000 + ((tm_year-69)/4)*86400 -
          ((tm_year-1)/100)*86400 + ((tm_year+299)/400)*86400
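Transcribed into Python, the formula reproduces the rollover exactly (a sketch; `//` plays the role of the C integer division here, since all operands are non-negative for years >= 1970; `tm_year` counts years since 1900 and `tm_yday` is zero-based):

```python
# The POSIX "seconds since the Epoch" formula, transcribed to Python.
# Leap days are accounted for; leap seconds are not.
def posix_timestamp(tm_sec, tm_min, tm_hour, tm_yday, tm_year):
    return (tm_sec + tm_min*60 + tm_hour*3600 + tm_yday*86400
            + (tm_year - 70)*31536000 + ((tm_year - 69)//4)*86400
            - ((tm_year - 1)//100)*86400 + ((tm_year + 299)//400)*86400)

# 2020-09-13 12:26:40 UTC: year 120 since 1900, zero-based day 256 of the year.
assert posix_timestamp(40, 26, 12, 256, 120) == 1_600_000_000
# And the epoch itself:
assert posix_timestamp(0, 0, 0, 0, 70) == 0
```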

------
fred256
I hope you are running an up-to-date Splunk version:

Beginning on September 13, 2020 at 12:26:39 PM Coordinated Universal Time
(UTC), un-patched Splunk platform instances will be unable to recognize
timestamps from events with dates that are based on Unix time, due to
incorrect parsing of timestamp data.

[https://docs.splunk.com/Documentation/Splunk/latest/ReleaseN...](https://docs.splunk.com/Documentation/Splunk/latest/ReleaseNotes/FixDatetimexml2020)

~~~
moralestapia
I'm always left wondering how something like this could happen. I kind of get
Y2K and stuff like overflows ... but this one? Really? Did someone put a regex
like /^15... to "match" dates?

~~~
lukebennett
Yep, that's exactly what Splunk have done - scroll down the release notes
linked to by the grandparent and the faulty regex is shown.

What's super daft is the proposed fix is only a further sticking plaster,
adding support for the 16... range (and the 2020s decade) rather than all
future dates. So in a couple of years a further patch will be needed...
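A hypothetical illustration (not Splunk's actual regex) of why recognizing timestamps by their leading digits is a sticking plaster:

```python
import re

# Hypothetical patched pattern: accepts only the 15... and 16... ranges
# of 10-digit timestamps, so it needs patching again at the next prefix.
patched = re.compile(r'^1[56]\d{8}$')

assert patched.match('1599999999')      # fine before the rollover
assert patched.match('1600000000')      # fine after the patch
assert not patched.match('1700000000')  # breaks again in November 2023
```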

~~~
Drdrdrq
To put out the fire this is probably the best solution, because the risk of
unintended side effects is very low. I would just hope it is then followed by
a proper fix.

------
NGC404
All dates following the pattern:

code:

    #!/bin/bash
    for t in $(seq 0 100000000 $((2**31))); do date -u -d @$t +'%s -> %c'; done

output:

    0 -> Do 01 Jan 1970 00:00:00 UTC
    100000000 -> Sa 03 Mär 1973 09:46:40 UTC
    200000000 -> Mo 03 Mai 1976 19:33:20 UTC
    300000000 -> Do 05 Jul 1979 05:20:00 UTC
    400000000 -> Sa 04 Sep 1982 15:06:40 UTC
    500000000 -> Di 05 Nov 1985 00:53:20 UTC
    600000000 -> Do 05 Jan 1989 10:40:00 UTC
    700000000 -> Sa 07 Mär 1992 20:26:40 UTC
    800000000 -> Di 09 Mai 1995 06:13:20 UTC
    900000000 -> Do 09 Jul 1998 16:00:00 UTC
    1000000000 -> So 09 Sep 2001 01:46:40 UTC
    1100000000 -> Di 09 Nov 2004 11:33:20 UTC
    1200000000 -> Do 10 Jan 2008 21:20:00 UTC
    1300000000 -> So 13 Mär 2011 07:06:40 UTC
    1400000000 -> Di 13 Mai 2014 16:53:20 UTC
    1500000000 -> Fr 14 Jul 2017 02:40:00 UTC
    1600000000 -> So 13 Sep 2020 12:26:40 UTC
    1700000000 -> Di 14 Nov 2023 22:13:20 UTC
    1800000000 -> Fr 15 Jan 2027 08:00:00 UTC
    1900000000 -> So 17 Mär 2030 17:46:40 UTC
    2000000000 -> Mi 18 Mai 2033 03:33:20 UTC
    2100000000 -> Fr 18 Jul 2036 13:20:00 UTC

--------------------------------------------------------------

overflow of the currently used 32-bit signed datatype happens at:

code:

    #!/bin/bash
    date -u -d @2147483648 +'%s -> %c'

output:

    2147483648 -> Di 19 Jan 2038 03:14:08 UTC

info:

[https://en.wikipedia.org/wiki/Year_2038_problem](https://en.wikipedia.org/wiki/Year_2038_problem)

--------------------------------------------------------------

other interesting date:

pi day:

    date -u -d @3141592653 +'%s -> %c'
    3141592653 -> So 21 Jul 2069 00:37:33 UTC

(the 100th anniversary of the Moon landing falls near π billion seconds)

~~~
tzs
Day of week abbreviation translations:

    
    
      So Sunday
      Mo Monday
      Di Tuesday
      Mi Wednesday
      Do Thursday
      Fr Friday
      Sa Saturday
    

I think those are German.

~~~
NGC404
correct

------
roland35
How exciting! 1500000000 occurred on 07/14/2017 @ 2:40am (UTC), so it seems
we'll only have to wait about 3 years until the next one!

~~~
colejohnson66
For those wondering, it’s roughly 1157.4 days (38 months)[0]

[0]:
[https://www.wolframalpha.com/input/?i=1e8+seconds+in+months](https://www.wolframalpha.com/input/?i=1e8+seconds+in+months)

~~~
YesThatTom2
The fact that it happens so consistently indicates it is some kind of
conspiracy.

~~~
bigyikes
Big Gravity is conspiring to keep us all down

~~~
User23
At least it's cheap, clean, and reliable.

------
browserface
The crazy thing is there have _only_ been 1.6 billion seconds since 1970. There
have been more babies since 1970 than there have been seconds. Which is crazy.

Back then, Earth's population was around 3.7 billion. Now it's 7.8 billion. That
means that (ignoring deaths) 2.5 babies have been born for each Unix second.
Or roughly 1 baby every 400 milliseconds.

~~~
city41
I don't think that's too crazy. Seconds are serial, babies are parallel :)

~~~
iso1210
Yet while one woman can generate 1 baby in 9 months, 9 women can't do it in 1
month

~~~
browserface
Underrated comment. Reply anticipation: this is not reddit. Preemptive
response, I know, doesn't change the fact that this comment is underrated :);p
xx

Only thing I want to add: _not yet_ , maybe with CRISPR and artificial wombs
there could be some way.

------
Retr0spectrum
There's a nice countdown here:
[https://epochconverter.com/countdown](https://epochconverter.com/countdown)

~~~
wolfram74
I quite like the Mayan counting system, so I made a little module that does
the conversion and makes SVG's, this[1] is the only thing I've used it for so
far, though.

[1][https://wolfram74.github.io/ArabIntToMayaInt/countdown.html](https://wolfram74.github.io/ArabIntToMayaInt/countdown.html)

~~~
noisy_boy
Reminds me of the countdown clock on predator's hand-mounted display in
Predator[0]. Maybe they took inspiration from the Mayan counting system.

[0]:
[https://en.wikipedia.org/wiki/Predator_(film)](https://en.wikipedia.org/wiki/Predator_\(film\))

------
codingdave
I enjoyed the lunch party we threw at work when it hit 1234567890.

I worked in Corporate IT at the time, so most of the office had no idea what
we were celebrating. Somehow, neither did 80% of the tech staff. But the few
of us who appreciated it really enjoyed the cake.

------
mocar
And today is day 256 of 2020

~~~
tzs
For those who would like to be able to figure day of year from date in the
heads, here are a couple of reasonable ways.

1. Given month 1 <= m <= 12, day 0 of that month (i.e., the day before the
first of the month) in a non-leap year is day 30(m-1) + F[m] of the year,
where F[m] is from this array (1-based indexing!):

    
    
      0, 1, -1, 0, 0, 1, 1, 2, 3, 3, 4, 4
    

Add the day, and add 1 if it is a leap year and m >= 3.

E.g., September 12. m = 9, giving 240 + F[9] = 243 for Sep 0. Add 12 for the
day, and 1 for the leap year, giving 256.

The F table has enough patterns within it to make it fairly easy to memorize.
If you prefer memorizing formulas to memorizing tables, you can do months m >=
3 with F[m] = 6(m-4)//10, where // is the Python3 integer division operator.
Then you just need to memorize the Jan is 0, Feb is 1, and the rest are
6(m-4)//10.

2. If you have the month terms from the Doomsday algorithm for day-of-week
calculations already memorized, you can get F[m] from that instead of
memorizing the F table.

F[m] == -(2m + 2 - M[m]) mod 7, where M[m] is the Doomsday month term
(additive form): 4, 0, 0, 3, 5, 1, 3, 6, 2, 4, 0, 2.

You then just have to remember that -1 <= F <= 4, so adjust -(2m + 2 - M[m]) by
an appropriate multiple of 7 to get into that range.

BTW, you can use that formula relating F[] and M[] the other way: if you have
F[], then M[m] = (F[m] + 2m + 2) mod 7. Some people might find it easier to
memorize F and compute M from it when doing day of week with Doomsday, rather
than memorize M.
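Recipe 1 as a sketch in Python, for anyone who wants to check the arithmetic (F uses 1-based month indexing, as above):

```python
# F[0] is a placeholder so that F[m] uses 1-based month indexing.
F = [None, 0, 1, -1, 0, 0, 1, 1, 2, 3, 3, 4, 4]

def day_of_year(m, d, leap=False):
    """Day of year for month m, day d, using the 30(m-1) + F[m] + d recipe."""
    doy = 30*(m - 1) + F[m] + d
    if leap and m >= 3:
        doy += 1
    return doy

# The worked example: September 12 in the leap year 2020 is day 256.
assert day_of_year(9, 12, leap=True) == 256

# The closed form for m >= 3 (Python floor division):
assert all(F[m] == 6*(m - 4)//10 for m in range(3, 13))
```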

------
fsckboy
I'll be waiting to C-elebrate for 1,610,612,736* which is 122 days later or
January 13th plus or minus a day or two (I was lazy).

* 0x60000000

~~~
JdeBP

        % printf "@40000000%08x%08x %#xs SI since the Unix v4 Epoch\n" \
          0x6000000A 0 0x60000000 |
          TZ=right/UTC tai64nlocal
        2021-01-14 08:25:09.000000000 0x60000000s SI since the Unix v4 Epoch
        %

------
fanf2
I have a twitter bot called time_t_emit that tweets palindromic seconds since
the epoch. (somewhat inspired by the now defunct @megasecond)

[https://twitter.com/time_t_emit](https://twitter.com/time_t_emit)

It's going to tweet twice tomorrow at lunch time (in my time zone) either side
of the rollover.

I fondly remember the gigasecond party I went to with a load of friends - it
happened in the small hours on Sunday morning, ideal party time for a bunch of
geeks in their 20s, like an extra new year's eve!

------
judge2020
The exact time this happens is Sun, 13 Sep 2020 12:26:40 +0000 (you can paste
1600000000 in 'make another conversion')

~~~
PostPlummer
For those as oblivious to +0000 as I am, and wondering if that is GMT or UTC:

\- GMT is a time zone officially used in some European and African countries.
The time can be displayed using both the 24-hour format (0 - 24) or the
12-hour format (1 - 12 am/pm).

\- UTC is not a time zone, but a time standard that is the basis for civil
time and time zones worldwide. This means that no country or territory
officially uses UTC as a local time.

Source: [https://www.timeanddate.com/time/gmt-utc-time.html](https://www.timeanddate.com/time/gmt-utc-time.html)

------
numpad0
`watch -n 1 date +%s`

Wish I knew when 1234567890 happened, I was too dumb back then.

~~~
sprash
I still remember the huge 1234567890 party. It was one of the nerdiest
gatherings ever, and the bar owner had no clue what was going on.

Coincidentally I consider this the end of "golden age" of the internet which
ended as soon as the iPhone and smartphones in general got mainstream and
everything moved to centralized services.

------
aljgz
Gif recording of this
[https://media.giphy.com/media/MDxNo8cSBDHYTGLdC2/giphy.gif](https://media.giphy.com/media/MDxNo8cSBDHYTGLdC2/giphy.gif)

------
qwertox
[https://epochconverter.com](https://epochconverter.com) is the first bookmark
in my bookmark bar, just the icon and no text

I can only recommend using
[https://epochconverter.com](https://epochconverter.com) instead of Google's
preferred result [https://unixtimestamp.com](https://unixtimestamp.com) as it
automatically detects if milliseconds are included, and it displays the local
time more prominently.

------
willvarfar
Tangentially related, last week I made a tiny one-file Java lib for working
with Unix timestamps in milliseconds (as returned by
System.currentTimeMillis() etc). It’s several orders of magnitude faster than
Java’s time API!

[https://github.com/williame/TimeMillis](https://github.com/williame/TimeMillis)

It’s probably not as fast as it could be: all speed ups and improvements
welcome!

------
harikb
Oh god! Don’t tell me I have to upgrade Splunk again!

------
compsciphd
Weird question I was wondering about regarding the epoch (perhaps a stupid
one).

What would be the problems with choosing a new epoch, i.e. one where Jan 1
falls on the same day of the week as Jan 1, 1970 and is 2 years before a leap
year (perhaps such a year doesn't exist)?

The worst case I see is that apps that calculate years internally (instead of
via a shared library or the like) would calculate them incorrectly. I'm
wondering if this couldn't be massaged around. Of course, it just pushes the
problem down the road.

It also makes timestamps like this incompatible between systems that have
different epochs, which could be an issue.

As I said, a naive (possibly stupid) question; just wondering if people have
actually talked about it.
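Part of the answer is that a different epoch is just a constant offset away, as protocols with other epochs already demonstrate. A sketch using NTP's 1900-01-01 epoch as the example:

```python
# Epochs differ only by a constant offset. NTP counts seconds from
# 1900-01-01, exactly 2208988800 seconds (70 years including 17 leap
# days, with no leap seconds) before the Unix epoch of 1970-01-01.
NTP_TO_UNIX = 2_208_988_800

def ntp_to_unix(ntp_seconds):
    """Convert an NTP-era second count to a Unix timestamp."""
    return ntp_seconds - NTP_TO_UNIX

assert ntp_to_unix(NTP_TO_UNIX) == 0  # NTP's 1970-01-01 is Unix time 0
assert NTP_TO_UNIX == (70*365 + 17) * 86400
```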

~~~
nitrogen
I'm guessing these would be good starting points to find out:

[https://en.wikipedia.org/wiki/Year_2038_problem#Possible_sol...](https://en.wikipedia.org/wiki/Year_2038_problem#Possible_solutions)

[https://en.wikipedia.org/wiki/Epoch_(computing)#Notable_epoc...](https://en.wikipedia.org/wiki/Epoch_\(computing\)#Notable_epoch_dates_in_computing)

------
westurner
It's gonna be so fun. In local time (UTC−4 here; note `fromtimestamp` converts
to the local zone):

    
    
      >>> import datetime
      >>> datetime.datetime.now().timestamp()
      1599923432.252943
      >>> datetime.datetime.fromtimestamp(16e8)
      datetime.datetime(2020, 9, 13, 8, 26, 40)

------
xwdv
Any hackernews Zoom parties happening to commemorate the event?

~~~
iso1210
It's not a round number, not like Sat Jan 10 13:37:04 UTC 2004

~~~
xwdv
What were those parties like?

------
eatbitseveryday
Sorry for sounding ignorant, but why is this noteworthy?

------
tus88
Avoir 15, you were my favorite time prefix so far :(

~~~
teddyh
> _Avoir_

You probably meant “Au revoir”. Then again, “Au revoir” means roughly “Until
we meet again”, and we won’t be seeing the “15” prefix again.

------
Donckele
That's Sunday, 13-Sep-20 12:26:40 UTC in RFC 2822

------
joshxyz
What a ride fellas

------
ezekiel68
From the, "I don't know who needs to desperately update their automated
scripts..." dept. :rofl_emoticon:[0]

[0] Yes, I know this doesn't get transformed here.

------
zaroth
Only the 9th time that’s ever happened!

------
rektide
for my tribe counting in the best epoch this event commemorates the passing of
0x5F5E1000

~~~
rektide
saved you two place digits to keep track of

------
sc_
Crazy how nature do that

------
tester89
Going from 1.5 Gs to 1.6 Gs

------
Andrew_nenakhov
I make a bold prediction that we'll run out of digits rather soon.

~~~
Bnshsysjab
2038 is a potential overflow, I believe. I imagine only pre-2000 systems would
likely be affected.

~~~
xyzzy_plugh
Only pre 2000 systems? I'd love to live in your dream world.

Last I looked, plenty of fixes were trickling into the kernel in 2014. I
wonder how many of those made the long backport to stable.

Let alone all those 2.6 kernels (and older!) in the wild. And all your 32-bit
devices are probably gonna have a bad time.

~~~
marvy
Kernel will hopefully be ready; user space is another story.

