In finance, US dollars are generally stored to four decimal places, because you need to deal with stuff like compounding interest or stock splits.
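
A toy illustration of why the extra places matter (all the figures below are made up for the example): accrue daily interest for a year, rounding the running balance to two places versus four after each day, and the totals drift apart.

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class DailyAccrual {
        public static void main(String[] args) {
            // Hypothetical figures: $1,000.00 at a 5% annual rate, accrued daily.
            BigDecimal principal = new BigDecimal("1000.00");
            BigDecimal dailyRate = new BigDecimal("0.05")
                    .divide(new BigDecimal("365"), 10, RoundingMode.HALF_UP);

            System.out.println("2 places: " + accrue(principal, dailyRate, 2));
            System.out.println("4 places: " + accrue(principal, dailyRate, 4));
        }

        // Accrue simple daily interest for a year, rounding the running
        // balance to `scale` decimal places after every day.
        static BigDecimal accrue(BigDecimal balance, BigDecimal rate, int scale) {
            for (int day = 0; day < 365; day++) {
                balance = balance.add(balance.multiply(rate))
                                 .setScale(scale, RoundingMode.HALF_UP);
            }
            return balance;
        }
    }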

COBOL has a built-in fixed-point decimal type, which makes defining a four-decimal-place field and doing math on it easy. (It was designed from the ground up to cater to people with a lot of money, who spend a lot of money, to work with lots of money, i.e., banks.) Java has BigDecimal, which is a class in the standard library, which means you need to import it. And because Java lacks operator overloading, doing calculations with it is tedious.
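
For a sense of the difference (the field names and the COBOL picture clause below are just for illustration), here's roughly what a one-line money calculation looks like on each side:

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class Tedium {
        public static void main(String[] args) {
            // In COBOL the four-decimal-place field is just a declaration,
            // something like:  05 TOTAL PIC S9(9)V9(4) COMP-3.
            // and the math reads like math:
            //   COMPUTE TOTAL ROUNDED = PRINCIPAL * RATE + FEE
            //
            // In Java every value is an object and every operation a method call.
            BigDecimal principal = new BigDecimal("1234.5600");
            BigDecimal rate      = new BigDecimal("0.0425");
            BigDecimal fee       = new BigDecimal("12.0000");

            // total = principal * rate + fee, kept at four decimal places
            BigDecimal total = principal.multiply(rate)
                                        .add(fee)
                                        .setScale(4, RoundingMode.HALF_UP);

            System.out.println(total); // 64.4688
        }
    }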

In the 90s, there was a huge push to replace COBOL with <something else>, and Java was the Rust of its day, so that's what everyone got behind. However, four-decimal-place COBOL fields apparently round differently than four-decimal-place Java BigDecimals, so all the tests failed. And all the stuff like a*x + b had to be written like a.multiply(x).add(b), so development was taking forever.
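
The comment doesn't say exactly which rounding rule bit them, but one plausible mismatch: COBOL's plain ROUNDED clause rounds halves away from zero (Java's HALF_UP), while a lot of Java money code reaches for banker's rounding (HALF_EVEN), and the two disagree on every half-way case:

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class RoundingMismatch {
        public static void main(String[] args) {
            // A half-way case at the fourth decimal place.
            BigDecimal value = new BigDecimal("2.00005");

            // Ties away from zero, as COBOL's plain ROUNDED does.
            System.out.println(value.setScale(4, RoundingMode.HALF_UP));   // 2.0001

            // Banker's rounding, a common choice in Java code.
            System.out.println(value.setScale(4, RoundingMode.HALF_EVEN)); // 2.0000
        }
    }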

Eventually they said "fuck it" and 20 years later we're still stuck with COBOL and everyone who remembers the original death march says "never again".

I have a feeling a lot of the problems came down to computer science people thinking money has two decimal digits but domain knowledge people knowing it has four. We programmers, as a group, make a lot of assumptions about other peoples' domains and we're wrong a lot.

I've had the thought that programmers should note assumptions in flagged comments, those comments should be automatically collected, and the list reviewed occasionally. Some assumptions are sustainable, so to speak, but others turn into a kind of technical debt.
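
One low-tech version of that (the ASSUMPTION: tag and the src default below are inventions for this sketch): agree on a comment marker and have a small tool sweep the source tree for it, so the collected list can be skimmed now and then.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.stream.Stream;

    public class AssumptionReport {
        private static final String TAG = "ASSUMPTION:"; // team convention, made up here

        public static void main(String[] args) throws IOException {
            Path root = Path.of(args.length > 0 ? args[0] : "src");
            try (Stream<Path> files = Files.walk(root)) {
                files.filter(p -> p.toString().endsWith(".java"))
                     .forEach(AssumptionReport::report);
            }
        }

        // Print every line containing the tag, with file and line number,
        // so the assumptions can be reviewed in one place.
        private static void report(Path file) {
            try {
                var lines = Files.readAllLines(file);
                for (int i = 0; i < lines.size(); i++) {
                    if (lines.get(i).contains(TAG)) {
                        System.out.printf("%s:%d: %s%n", file, i + 1, lines.get(i).trim());
                    }
                }
            } catch (IOException e) {
                System.err.println("Could not read " + file + ": " + e.getMessage());
            }
        }
    }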

> make a lot of assumptions about other peoples' domains and we're wrong a lot

What do you mean this person has no surname? That's unpossible, surname is never null, error error.
