
> (...) until we stop pushing floats as the standard solution for dealing with decimals.

I'm curious: in that sentence, who is "we"? I can't imagine a scenario where someone is technically involved enough to be aware of the difference between floating-point and fixed-point decimals, yet still decides to use floating point in an accounting environment.

Are you talking about project managers? Architects?




If you Google how to manipulate decimal values, or you type `rate = 1.23`, you're almost always going to get floating point. `1.1 + 2.2 == 3.3` evaluates to false in most commonly used programming languages, with the notable exception of SQL, where such literals default to exact decimals.

That's what it means that we've established floats as the standard. You have to specifically choose not to use floats, look up your language's entirely non-standardized approach to representing exact decimals, and then maintain constant vigilance against perfectly intuitive things like decimal literals silently producing floats.
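To make that concrete, here's a minimal Python sketch (Python is just an arbitrary example; most mainstream languages behave the same way):

    from decimal import Decimal

    # The intuitive literal syntax silently gives you binary floats:
    print(1.1 + 2.2 == 3.3)   # False
    print(1.1 + 2.2)          # 3.3000000000000003

    # Exact decimals are opt-in, and must be built from strings;
    # Decimal(3.3) would inherit the float's representation error.
    print(Decimal("1.1") + Decimal("2.2") == Decimal("3.3"))  # True

Note that even the fix demands you remember to construct from strings, which is exactly the kind of constant vigilance I mean.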


Sorry, that wasn't very clear.

I knew well enough, but by then the system was already approaching a million lines of semi-working floating-point workarounds. And since it was all my crazy idea, I would have been held responsible for every single rounding error from that point on, had I managed to get the ball rolling at all. Life is too short and too precious.

I was thinking of language designers, mainly; but user preferences are shaped by habits, which means anything but floats will take more effort to communicate and popularize.

edit: Observe how popular this comment is for merely floating (take that!) the idea. Plenty of people have serious issues when it comes to floats, that much is obvious. I'm guessing it's mostly avoiding change at any cost, so good luck to us all.


Language designers, I would guess.



