
> Computers tend not to.

Computers can be made to represent vagueness rather precisely.

> You should be able to represent in your data model, date-time library etc. vague dates and relative dates.

If it's needed; often it isn't. Or rather, for most applications there is an implicit, hard-coded resolution or error interval (a Unix time_t, for example, bakes in one-second resolution).

The library case is special, and then why not just have an additional byte specifying the confidence interval? You only need a small enumerated type of 20 distinct values to encode confidence intervals from a nanosecond up to a millennium. Heck, you can do it with bits if you want to.
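A minimal sketch of that idea in Python (the `Precision` and `VagueInstant` names are made up for this example): the enum stores the interval's order of magnitude as a power of ten in seconds, and the 20 exponents from nanosecond up to millennium would indeed fit in a single byte, or even in 5 bits.

  from dataclasses import dataclass
  from enum import IntEnum


  class Precision(IntEnum):
      """Order-of-magnitude width of a timestamp's confidence interval,
      stored as a power of ten in seconds. The 20 exponents from -9 to 10
      span a nanosecond (1e-9 s) up to a millennium (~3.2e10 s)."""
      NANOSECOND = -9
      MICROSECOND = -6
      MILLISECOND = -3
      SECOND = 0
      MINUTE = 2       # 60 s; nearest power of ten is 10^2
      DAY = 5          # 86,400 s; nearest power of ten is 10^5
      YEAR = 7         # ~3.15e7 s (order of magnitude, rounded down)
      CENTURY = 9      # ~3.15e9 s (rounded down)
      MILLENNIUM = 10  # ~3.15e10 s


  @dataclass(frozen=True)
  class VagueInstant:
      """A point estimate plus one extra byte saying how fuzzy it is."""
      unix_nanos: int        # best-guess instant, nanoseconds since the epoch
      precision: Precision   # interval is roughly +/- 10**precision seconds

      def interval_seconds(self) -> float:
          return 10.0 ** int(self.precision)


  # "Around the start of 2000, give or take a year or so."
  t = VagueInstant(unix_nanos=946_684_800 * 10**9, precision=Precision.YEAR)
  print(t.interval_seconds())  # 10000000.0 s, within half an order of magnitude of a year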

I'm not saying you can't do it. I'm saying that a lot of the software one writes against (date-time libraries, databases, data formats etc.) doesn't make it easy. And programmers don't think about it very hard.
