Would love to know if and why it is hard to implement this kind of feature for 1P. Any takes?
Also there is a heated debate that the UK does dates "the right way" and the US does them wrong. Other than that people from Europe seem to think all of their practices are better than everyone else's, what possible reasoning is there that the order of numbers in a date string could have a right and a wrong answer?
> Other than that people from Europe seem to think all of their practices are better than everyone else's, what possible reasoning is there that the order of numbers in a date string could have a right and a wrong answer?
The argument is that ordering the numbers by unit size makes things easier and more useful. Both DD-MM-YYYY and YYYY-MM-DD follow this pattern, and both have their advantages and disadvantages. The latter is especially useful because it sorts correctly as plain text.
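A quick Python sketch of that sorting property (the date values here are made up for illustration):

```python
# ISO 8601 strings sort chronologically as plain text;
# DD-MM-YYYY strings do not.
dates_iso = ["2019-03-24", "2019-11-02", "2020-01-15"]
dates_dmy = ["24-03-2019", "02-11-2019", "15-01-2020"]

print(sorted(dates_iso))  # ['2019-03-24', '2019-11-02', '2020-01-15'] -- chronological
print(sorted(dates_dmy))  # ['02-11-2019', '15-01-2020', '24-03-2019'] -- jumbled
```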
Apart from the sorting advantage, it makes the format less confusing, since it's just about remembering ascending vs. descending rather than three individual positions. If I see a DD-MM-YYYY or YYYY-MM-DD date, I know which one it has to be. If I see MM-DD-YYYY, I have to be told, or I will misread it whenever the date is ambiguous.
Thank you for this comment and many others in this thread. I feel I've been converted to the YYYY-MM-DD structure, because it's in order of specificity. It feels like it saves time and potentially compute.
However, I think a big problem with this is the way people say dates colloquially. Such as, March 24th 2019. Or even the 24th of March, 2019.
I can't imagine saying to someone, oh can you come over to my party, it's 2024 September 5th.
In many ways digital is designed to be a facsimile of analog, so sometimes what "makes sense" to people hyperfixated on the topic is simply not amenable to real society.
So that's where DD-MM-YYYY comes in. In my language people always say dates in that order, and even when speaking English I'll refer to "the 24th of March 2019". YYYY-MM-DD is, strangely enough, kind of just "a bunch of numbers" to me. If someone tells me a date in that format it'll take a second to realise it's a date. But for digital storage it's highly useful! I'm fine switching between the two formats based on the situation. The length of the year specifier makes this unambiguous.
There is no correct answer. Having the segments go from smallest to largest or vice versa seems more logical than mid-small-large, and frankly ISO 8601 is clearly the most logical format we have, but we're never going to get everyone to use it outside of computer programming contexts.
Lexical sort ordering without parsing the date string is a great property.
But clearly in human terms it’s never going to be a solved issue! Nobody is looking for a solution, and everyone likes their own convention. Mutual understanding is what’s important.
The fact that we have divergent date string formats is most frustrating because for many dates (up to the 12th of every month) it is ambiguous which format is in play.
> Lexical sort ordering without parsing the date string is a great property.
Why? Why is the date even a string in the first place? It ought to be internally represented as 3 numbers. Or a Unix timestamp. Convert it to months, days, and/or years as appropriate right before printing.
Never use a Unix timestamp as a date. You don't know the time zone, and you need to account for leap days, leap seconds, summer time, etc. Such a "date" would immediately morph into a useless random number.
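To make the time-zone trap concrete, here's a small Python sketch (the timestamp is an arbitrary value picked for illustration). The same instant lands on different calendar days depending on the zone you render it in:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

ts = 1696118400  # midnight UTC, 2023-10-01

# One Unix timestamp, two different calendar dates.
print(datetime.fromtimestamp(ts, tz=timezone.utc).date())                     # 2023-10-01
print(datetime.fromtimestamp(ts, tz=ZoneInfo("America/Los_Angeles")).date())  # 2023-09-30
```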
While I am a strong supporter of ISO 8601 (e.g. 2023-07-12), there is an argument that decreasing specificity (i.e. DD-MM-YYYY rather than ISO 8601) is preferable for most day-to-day usage: the year is somewhat implicit, hence it going at the end.
Personally, I write the month out if I'm not using 8601 formatting to avoid any ambiguity.
For any sort of recordkeeping, though, I think it's preferable to go full on YYYY-MM-DD. It's more thorough, precise, and sorts properly on a computer.
In most contexts I would never use the cardinal number "3", it'd always be "the 3rd of May" or "May (the) 3rd", but in a country like Australia (with a high number of speakers from different cultural backgrounds) you hear all sorts of conventions used in casual conversation.
Doing a bunch of slashes like this is ambiguous. But the footgun that really worries me is the international divide over whether the decimal separator should be a comma or a period: 10,000.00 vs. 10.000,00 [1]! It hasn't footgunned me yet, but I'm just waiting for someone to complain about a 100x unexpected financial impact of whatever they bought, because it was only supposed to be 10.00, not 10,000. Yes, I know there's an extra zero there to rescue it, but still.
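A minimal Python sketch of how that separator footgun plays out, assuming a naive parser that only knows the US convention (the price string is invented for the example):

```python
price = "10.000"  # written under the convention where '.' groups thousands

# A US-convention parser reads '.' as the decimal separator: ten.
us_value = float(price)                                     # 10.0

# Normalizing under the thousands-separator convention: ten thousand.
eu_value = float(price.replace(".", "").replace(",", "."))  # 10000.0

print(us_value, eu_value)  # 10.0 10000.0 -- a 1000x disagreement
```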
Tech debt: a bunch of fundamental code that is hardcoded to process dates in one format and would completely break if you passed it a date formatted otherwise. Dates are hard.
If for some reason you had to handle dates as strings everywhere and parse them yourself, then sure. But you should be able to pass and store Date objects or Unix timestamps internally, and then call established helpers to display them.
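A minimal sketch of that separation of concerns in Python: keep a real date object internally and only commit to a textual format at the display edge.

```python
from datetime import date

d = date(2019, 3, 24)  # internal representation: three numbers, no format

print(d.isoformat())           # 2019-03-24 -- ISO 8601, sorts lexically
print(d.strftime("%d/%m/%Y"))  # 24/03/2019 -- UK-style display
print(d.strftime("%m/%d/%Y"))  # 03/24/2019 -- US-style display
```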
Imagine having middle-endian values in a processor, reading "1,337" as 3317. It's nonsensical.
Big-endian YYYY-MM-DD is the most rationally consistent, since it's how we write numbers normally. 1,337 is big-endian: the 1 is the most significant (biggest) digit, and so on.
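For comparison, a small Python sketch of the three byte orders for the number 1337 (the PDP-11 layout is what "middle-endian" usually refers to; MM-DD-YYYY is the date-format analogue of that kind of mixed ordering):

```python
import struct

n = 1337  # 0x00000539

print(struct.pack(">I", n).hex())  # '00000539' -- big-endian: most significant byte first
print(struct.pack("<I", n).hex())  # '39050000' -- little-endian: least significant byte first

# PDP-11-style middle-endian: 16-bit words most-significant-first,
# but the bytes within each word are little-endian.
be = struct.pack(">I", n)
print((be[1::-1] + be[3:1:-1]).hex())  # '00003905'
```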