Around 1997 I put date fields that accepted 5 digits into a data entry form, by request: the customer was very concerned about Y2K.
They asked for that to be changed back to 4 digits within 2 weeks of deployment. The people actually using the software demanded the ability to enter 2 digit dates and have the system fill in "19" or "20" as appropriate too.
Of course that was what I'd written in the first place, so the actual work required was going and changing the STOOPIT_DATES flag back to 0.
I was honestly surprised, because limits on binary fields don't normally coincide with round decimal limits like the year 9999. DateTime.MaxValue is 0x2BCA2875F4373FFF.
Looking in the source for DateTime, internally it's a 64-bit unsigned integer. The lower 62 bits are used for a date, and the upper 2 are used to flag local/utc/dst.
The 62 bits technically allow for an extra ~81 days into the year 10,000.
(Makes me wonder if, in 7950 years or so, someone's going to recompile a bunch of C# code to use an updated version of DateTime.)
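Here's a quick standalone sanity check of that layout, I think (it doesn't peek at any private fields, it just confirms that the maximum tick count fits in 62 bits and that the public DateTimeKind values would fit in the two spare bits):

using System;

class TickBitsCheck
{
    static void Main()
    {
        // DateTime.MaxValue.Ticks is the largest tick count a DateTime can hold.
        long maxTicks = DateTime.MaxValue.Ticks;

        // It fits comfortably under 2^62, so 62 bits are enough for the date part.
        Console.WriteLine(maxTicks);                  // 3155378975999999999
        Console.WriteLine(maxTicks < (1L << 62));     // True

        // The public kind values (Unspecified = 0, Utc = 1, Local = 2) need only
        // two bits, which is what the top of the 64-bit field is used for.
        foreach (DateTimeKind kind in Enum.GetValues(typeof(DateTimeKind)))
            Console.WriteLine($"{kind} = {(int)kind}");
    }
}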
> DateTime.MaxValue is 0x2BCA2875F4373FFF.
> The 62 bits technically allow for an extra ~81 days into the year 10,000.
Just eyeballing it: the value starts with 0x2B..., which is about 2/3 of 0x3F..., which would be the maximum value of 62 bits, so I'm pretty sure you've miscalculated!
Looking at the docs, 0x0 represents 1/1/1 AD, so that means the 62 bits can suffice for almost 5000 years beyond DateTime.MaxValue. So people's code will still work for a little while afterwards, even if the dates aren't printed correctly!
But anyway, it's all kind of a moot point. 8000 years ago is about the limit of known human civilisation, so it's unlikely that any of the technology we create today will still be considered important the same distance into the future.
Every country on earth has gone through at least one entire calendar reform in the last 8,000 years, and many countries that use the concept of DST feel the need to change the dates when it occurs every few years, so the chance of this being an issue for anyone is approximately zero.
Yeah, it was a throwaway Linqpad script on my work computer, so I can't go back and see what screwed up. I rewrote it on my Mac, which doesn't have Linqpad:
using System;

namespace DateTime_Headroom
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("DateTime.MaxValue in hex: 0x" + DateTime.MaxValue.Ticks.ToString("X"));

            // The top two bits of the internal 64-bit field are reserved for the kind flags.
            const ulong DATETIME_RESERVED_BITS = 0xC000000000000000;
            Console.WriteLine("DateTime reserved bits in hex: 0x" + DATETIME_RESERVED_BITS.ToString("X"));

            // The remaining 62 bits are available for the tick count.
            const long DATETIME_TICKS_BITS = (long)~DATETIME_RESERVED_BITS;
            Console.WriteLine("DateTime ticks bits in hex: 0x" + DATETIME_TICKS_BITS.ToString("X"));

            // Headroom between the largest tick count the 62 bits can hold and MaxValue.
            var extraTicks = DATETIME_TICKS_BITS - DateTime.MaxValue.Ticks;
            Console.WriteLine("DateTime extra ticks in hex: 0x" + extraTicks.ToString("X"));
            Console.WriteLine($"DateTime extra days: {extraTicks / TimeSpan.TicksPerDay}");
        }
    }
}
----
DateTime.MaxValue in hex: 0x2BCA2875F4373FFF
DateTime reserved bits in hex: 0xC000000000000000
DateTime ticks bits in hex: 0x3FFFFFFFFFFFFFFF
DateTime extra ticks in hex: 0x1435D78A0BC8C000
DateTime extra days: 1685540
Edit: Sorry about the formatting.
Also, kinda makes me wonder about the tradeoffs. I suspect that the performance advantage of 64-bit math over "bigint" math is much more useful than the risk of recompiling a program in 10,000 years to use a 128-bit date.
I stopped at Visual Basic 6. I inherited a little bit of VB.Net back when C# was new and I honestly don't get the point of the language.
A lot of the simplifications that were in classic Visual Basic just aren't in the .Net version. It was kind of useful if you really needed weak typing, but that's a niche usage that duck typing in C# now handles.
I was a C/x86 developer who switched to VB for an ASP job. Then I became a VB.NET devotee. I kept up with C# the whole time because 99% of sample code was in C# and I needed to understand it. I actually still believe VB.NET is the better language: I think it reads a lot easier and is therefore less susceptible to errors. I've given up, though, and switched to C# just because the support is there.
> so it's unlikely that any of the technology we create today will still be considered important the same distance into the future.
Let's say the wheel, or reproducible fire, or written language, probably happened around 8,000 years ago - I'd say that's pretty danged important and should be memorialized.
Allows for 8000 future years, but does not allow for negative years. Although once you get into negative years, I guess you'd need a range of almost 14 billion just for the year value.
If we assume that in 8,000 years our religious landscape and scientific knowledge will be substantially different (hint: they will), we're going to need a few more digits.
Already today we can use the Holocene calendar https://en.m.wikipedia.org/wiki/Holocene_calendar but in 8k years we might have a more accurate and useful measure, anchored to reference points that are easier to work with / more valuable.
The Unix epoch might even turn out to be the most significant start time.
Dream on; it'll be the birth year of some random new figure, or a large event like the arrival of the aliens or the nuclear bombing of Taipei, anything other than the "UNIX" epoch. 8000 years, you imagine; that's 4 times my language's age, ffs. I barely know what Unix is, and I don't expect to ever tell my children there was something before Linux lol.
I feel like humanity has gotten over novelty calendars. They were popular in the past, but now that we’re all connected over the internet people prefer compatibility. The only thing wrong with Unix time is that by default it ticks backwards when a leap second occurs. Smearing makes that bearable.
Maybe things will change when the max distance between humans is >>0 light years.
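For a sense of what smearing means in practice, here's a rough sketch of a 24-hour linear smear of the kind Google describes; the window boundaries and the leap instant here are just illustrative assumptions, not any standard:

using System;

class LeapSmearDemo
{
    // Fraction of the extra leap second absorbed so far, for a 24-hour linear
    // smear centered on the leap instant (noon-to-noon). Window and leap
    // instant are illustrative.
    static double SmearOffsetSeconds(DateTime utcNow, DateTime leapUtc)
    {
        DateTime start = leapUtc.AddHours(-12);   // smear begins 12h before the leap
        DateTime end = leapUtc.AddHours(12);      // and finishes 12h after it

        if (utcNow <= start) return 0.0;          // smear not started yet
        if (utcNow >= end) return 1.0;            // full extra second absorbed

        return (utcNow - start).TotalSeconds / (end - start).TotalSeconds;
    }

    static void Main()
    {
        var leap = new DateTime(2017, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        Console.WriteLine(SmearOffsetSeconds(leap.AddHours(-12), leap)); // 0
        Console.WriteLine(SmearOffsetSeconds(leap, leap));               // 0.5
        Console.WriteLine(SmearOffsetSeconds(leap.AddHours(12), leap));  // 1
    }
}

The point is just that the adjustment is spread out and monotonic, so nothing ever observes time going backwards.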
Leap seconds are bizarre. It matters little that the sun is so precisely on the Greenwich meridian at 12:00; anything that cares about that second can already add or subtract potentially milliseconds, so why is one second significant and one millisecond not?
Shifting days off summer is one thing, but shifting seconds off noon has no practical benefit and a huge cost.
The new epoch will probably have time 0 as being the instant when we accidentally blow up the earth, adjusted for wherever we are in space by however long it takes the light from earth to get there. Assuming we have some lucky ones who do manage to escape the earth before then.
Obligatory reference to 'A Fire Upon the Deep' [0] by Vernor Vinge, in which programmers-at-arms on board starships do indeed trace their ship clocks back to ancient computer systems, and Unix in particular.
People may not expect to deploy CODE from 2000 CE in 10000 CE.
But hardware designs and long-term ideas from the current era might survive, maybe?
For example, the space probes we send out over the next few millennia -- or any SETI schemes that involve instructions to an alien species about how to communicate back to us -- had better be robust enough to handle dates beyond 4-digit years.
And thinking on those time-scales might open up our minds to other aspects of design that might be otherwise overlooked.
This is one of those ideas that is nerdy fun and I love it in that regard.
But putting a zero in front of isolated year fields is kind of silly when we don't do the same for days or months. Nobody writes “January 01, 2022”, so it would be nonsensical to start writing “January 1, 02022”.
Zero-padded numerics are often used where lexical sorting is expected.
Eg, "2021-12-30_author-of-file_title-text" (or similar) is a naming convention I use with some frequency.
That is more useful if you use "2022-01-01" than "2022-1-1" for sorting.
(Or if we're keeping in the Long Now spirit, "02022-01-01".)
A directory full of such entries sorts naturally by the date in the filename itself. Given the vagaries of Linux access / modify / change / birth datestamps, this has some utility.
(There are those who argue against putting metadata in filenames. I've come to favour putting relevant and robust metadata in filenames.)
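For what it's worth, .NET's custom date formats handle both the four- and five-digit padding directly ("yyyy" pads the year to at least four digits, "yyyyy" to at least five), so a sortable or Long Now style prefix is a one-liner; the filename at the end is just the convention described above, nothing standardized:

using System;

class SortableNameDemo
{
    static void Main()
    {
        var date = new DateTime(2022, 1, 1);

        // "yyyy-MM-dd": zero-padded year, month and day, so names sort lexically.
        Console.WriteLine(date.ToString("yyyy-MM-dd"));    // 2022-01-01

        // "yyyyy" pads the year to at least five digits: the Long Now form.
        Console.WriteLine(date.ToString("yyyyy-MM-dd"));   // 02022-01-01

        // Hypothetical filename in the convention described above.
        Console.WriteLine($"{date:yyyy-MM-dd}_author-of-file_title-text.txt");
    }
}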
Many systems have code in place to set the range when a partial (i.e. 2-digit) year is entered. Call it a two-digit year cutoff or a century pivot; it works, as long as it gets updated by someone every couple of decades or so (there's a minimal sketch of one below).
Only having two digits for the year can be problematic for some organizations that need to keep DECADES of data/records, but I can't imagine almost any will need to keep millennia worth of records preserved... So I'd say a century or millennium cutoff/pivot will suffice.
How long do we need to keep track of a bunch of dinosaurs' credit card transactions, anyhow?
With apologies to geologists, archaeologists and astronomers. 5-digit years won't be enough for them, anyhow.
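Here's that minimal sketch, using the pivot mechanism .NET already ships (GregorianCalendar.TwoDigitYearMax plus Calendar.ToFourDigitYear); the 2049 cutoff is just an illustrative choice:

using System;
using System.Globalization;

class CenturyPivotDemo
{
    static void Main()
    {
        // Two-digit years at or below the last two digits of TwoDigitYearMax
        // land in that century; everything above lands in the century before.
        var calendar = new GregorianCalendar();
        calendar.TwoDigitYearMax = 2049;   // the "pivot": 00-49 => 20xx, 50-99 => 19xx

        Console.WriteLine(calendar.ToFourDigitYear(22));   // 2022
        Console.WriteLine(calendar.ToFourDigitYear(49));   // 2049
        Console.WriteLine(calendar.ToFourDigitYear(50));   // 1950
        Console.WriteLine(calendar.ToFourDigitYear(97));   // 1997
    }
}

Parsing with a "yy" pattern consults the same TwoDigitYearMax on the calendar attached to the format provider, so the only maintenance is nudging the cutoff forward every few decades, as noted above.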
"Australian man ‘cannot leave Israel for 8,000 years’ over unpaid child support. Noam Huppert says he is subject to travel ban until the year 9999 because he owes £1.8m to ex-wife. ... It appears the year 9999 was arbitrarily set because it was the highest possible date allowed by the online system."
The Long Now approach of padding the present pattern with an additional zero tends to buy an increasingly large margin for adaptation at a proportionately smaller cost over time.
Following the year 09999, prepend an additional 0.
Reminds me of the Holocene Calendar, which argues for setting year 0 ten thousand years earlier to include all of major human history. So instead of writing 02022 AD it would become 12022 HE, for Human Era. It would mean having to update all these date libraries fast, though...
The premise of this seems excessively optimistic -- and/or excessively egoistic. It's plausible that humans will still exist on Earth 8,000 years from now, but that requires fantastic optimism (as in, it requires fantasy to predict not only the future you can foresee/imagine in your likely lifetime, but many, many millennia and lifetimes beyond your own). On the other hand, assuming humans (in whatever new form) are still inhabiting this earth, starting to timestamp your works (writings, etc.) with 5-digit years says you really expect your outputs to still exist then. And though they very well might, does one think that without your foresight to use 5-digit years they would somehow be lost, or difficult to archive?
Unix's time limit is based on the capacity of 32- or 64-bit integers (hardware). For denoting years, we lost nothing because the ancients didn't write dates as 4 character years -- we, and that info, survived adding digits.
I'm optimistic that we'll be around in 8k years, but less so that anything written now will survive regardless of how we record timestamps. Posts regularly come up here of sites shuttering with little to no notice and we're increasingly seeing a migration to app walled gardens which are actively hostile towards open standards and by extension archiving. The ancients were on the right track carving their histories into stone.
I'm even more optimistic; the Sumerian language is now 5,000 years old and we can still read it. I used to love stone runes, but after reading Snow Crash, the clay tablets the Sumerians used seem to be the best path: so much faster and more practical. A bit like the internet vs. floppies.
An absolute system of tracking years needs a datum point, which is arbitrary. Using a traditional one at least confers the benefit of being able to relatively easily read dates as written in the last few hundred years.
Religious people understand the value of sticking to the now arbitrary choice since Jesus wasn't born on 0001-01-01. I assume areligious people can, too.
"Areligious", "atheist", etc. presuppose that religion, dogmas, and sects are the default state of being. They aren't. "Urban" is, I think, a more accurate description when it is defined as anti-sectarian.
I agree with you that the year's got to be arbitrary in practice; however, I don't understand how your statement answers my concern about using mythology. Can you elaborate on that?
True, but people don't commit crimes against humanity in the name of any arbitrary mythologies, only certain ones. We shouldn't propagate those kind of myths any further.
Technically, many of them are named after famous scientists, so the possibility is certainly there. Who knows what people will think of our Lord Kelvin 10,000 years later. Maybe eventually we will have to change those names, too, if they become dirty, rightfully or not.