Computers can be made to represent vagueness rather precisely.
> You should be able to represent in your data model, date-time library etc. vague dates and relative dates.
If it is needed. Often it is not. Or rather, most applications have an implicit, hard-coded resolution or error interval.
The library case is special, and then why not just add an extra byte specifying the confidence interval? A small enumerated type of about 20 distinct values is enough to encode confidence intervals from a nanosecond up to a millennium, one order of magnitude per step. Heck, you can do it in five bits if you want to.
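As a rough sketch of that idea (all the names here, `ApproxDate`, `confidence_seconds`, `level_for`, are hypothetical, not any existing library's API): store a best-guess instant plus a small integer where level k means an error interval on the order of 10^(k-9) seconds, so levels 0..19 run from a nanosecond to about 10^10 seconds, and the whole thing fits in five bits.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch: 20 confidence levels, one decade of seconds each.
# Level 0 = 1e-9 s (nanosecond), level 9 = 1 s, level 19 = 1e10 s
# (~317 years); anything wider, e.g. a millennium, clamps to 19.

def confidence_seconds(level: int) -> float:
    """Order-of-magnitude width of the error interval, in seconds."""
    if not 0 <= level <= 19:
        raise ValueError("level must fit the 20-value enumeration")
    return 10.0 ** (level - 9)

def level_for(seconds: float) -> int:
    """Nearest decade level for a given interval width, clamped to 0..19."""
    if seconds <= 0:
        raise ValueError("interval width must be positive")
    return min(19, max(0, round(math.log10(seconds)) + 9))

@dataclass(frozen=True)
class ApproxDate:
    """A point-in-time guess plus five bits' worth of vagueness."""
    instant: datetime
    confidence: int  # 0..19

    @property
    def interval_seconds(self) -> float:
        return confidence_seconds(self.confidence)

# A date known only to the day: ~86400 s of slack rounds to level 14.
landing = ApproxDate(datetime(1969, 7, 20, tzinfo=timezone.utc),
                     level_for(86400))
```

Whether you spend a byte or pack it into spare bits next to the timestamp is a storage detail; the point is that the vagueness itself is a tiny, cheap field.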