Everyone has already realised that 32-ish-bit values are too small to represent time with sufficient range or precision. I wouldn't blame the issue on 31-bit integers specifically (that one extra bit hardly buys you much time), but on the short-sightedness of whoever chose to use them for that purpose.
A 63-bit integer would give you roughly 146 years on either side of the epoch at nanosecond resolution; if you chose millisecond precision instead, you could accurately timestamp the death of the last mammoth.
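A minimal sketch of the arithmetic behind those figures, assuming a 63-bit signed integer (one bit of a 64-bit word lost to a tag, mirroring the 31-bit case) counting ticks either side of an epoch:

```cpp
#include <cstdio>

int main() {
    // Assumption: 63-bit signed integer, so usable magnitude is
    // 2^62 ticks either side of the epoch.
    const long double ticks = 4611686018427387904.0L;  // 2^62
    const long double seconds_per_year = 365.25L * 24 * 60 * 60;

    // At nanosecond resolution: 2^62 ns is roughly 146 years each way.
    std::printf("ns ticks: ~%.0Lf years either side of the epoch\n",
                ticks / 1e9L / seconds_per_year);

    // At millisecond resolution: 2^62 ms is roughly 146 million years
    // each way, comfortably covering the last mammoths (~4,000 years ago).
    std::printf("ms ticks: ~%.0Lf million years either side of the epoch\n",
                ticks / 1e3L / seconds_per_year / 1e6L);
    return 0;
}
```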
Timestamps were just one place where 31-bit ints bit you in NewtonScript; there were many more, especially when interoperating with the OS, which was written in C++.