Interesting that Norton seems to use Unix timestamps. I’ve never developed for Windows, is it common for Windows devs to use them too? Or just some niche feature causing a more widespread problem?
Unix timestamps are common in Windows software, though the standard timestamp used by the operating system is the number of ticks (100 ns intervals) since 1601-01-01 UTC.
I vaguely remember a few different timestamp formats in use in different places, but the 100ns-tick is very common in Microsoft APIs.
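For anyone curious, converting between the two is just an epoch shift plus a scale factor. A rough sketch (in Python for brevity; the 11,644,473,600-second offset between 1601-01-01 and 1970-01-01 is the standard constant used for this):

```python
# Convert between Unix time (seconds since 1970-01-01 UTC) and the
# Windows-style tick count (100 ns intervals since 1601-01-01 UTC).
EPOCH_DIFF_SECONDS = 11_644_473_600   # seconds from 1601-01-01 to 1970-01-01
TICKS_PER_SECOND = 10_000_000         # one tick = 100 ns

def ticks_to_unix(ticks: int) -> float:
    """Windows tick count -> Unix seconds."""
    return ticks / TICKS_PER_SECOND - EPOCH_DIFF_SECONDS

def unix_to_ticks(seconds: int) -> int:
    """Unix seconds -> Windows tick count."""
    return (seconds + EPOCH_DIFF_SECONDS) * TICKS_PER_SECOND

# The Unix epoch itself lands at 116444736000000000 ticks.
print(unix_to_ticks(0))  # 116444736000000000
```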
Chuckling at the thought of people living around the year 60040 getting nervous... nobody knows how to deal with Windows or C any more, but much essential software has run for 58,000+ years, and it's about to crash.