
Good catch, yeah that is pretty terrible.

It just seems silly that I'm building some PoC, and three years later all the assumptions I baked in initially become something I have to actively design around, or a reason to refactor my data schema. How many layers of abstraction would I need to add to get a more flexible type that just knows to expand if I add more significant figures, or contract (on a per-row basis) depending on the entry?

Like, I was building a Django app two weeks ago and pulled in django-money to deal with currencies without reinventing the wheel. Then, for another field on a model (meant to simulate a crypto asset), I had to arbitrarily decide the precision and "max digits" for an entire class of potential instances of this model. I get that this might be premature optimization, but really, this is silly. By specifically asserting a max length for some field, aren't you wasting space? If not, then my entire point is moot.
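For what it's worth, a per-value "expand or contract" numeric type already exists at the Python level: `decimal.Decimal` stores exactly the digits each value carries, and the context precision is a cap on arithmetic results, not a pre-allocated width. (As I understand it, Postgres's NUMERIC is similar on disk: it's stored variable-length, so declaring max_digits acts as a constraint rather than reserved space.) A minimal sketch, plain Python with made-up variable names:

```python
from decimal import Decimal, getcontext

# Each Decimal carries only the digits you give it -- a short value
# doesn't pay for the declared width of a long one.
price = Decimal("19.99")         # 4 significant digits
satoshi = Decimal("0.00000001")  # 1 significant digit, 8 decimal places

# The context precision (default 28) bounds *arithmetic results*,
# so it behaves like max_digits at compute time, not at storage time.
getcontext().prec = 28
total = price + satoshi

print(total)             # 19.99000001
print(price.as_tuple())  # DecimalTuple(sign=0, digits=(1, 9, 9, 9), exponent=-2)
```

So the fixed width in a DecimalField/MoneyField is really a schema-level contract about what the column may hold, not a per-row allocation.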

Not super quantitative, but just thinking out loud a bit.



