Hacker News

I've met plenty of coders who aren't even aware of the problem, and I spent 13 years maintaining and extending an accounting system written by people who couldn't tell a floating point issue from their ass on a good day.

It's not going to happen much until we stop pushing floats as the standard solution for dealing with decimals. I'm pretty sure they are more commonly abused than used at this point, and most cases would be better off with fixpoints, rationals, or bignums, in increasing order of flexibility and complexity.

Lisp and Perl6 get it more or less right. My own baby, Snigl [0], only supports integers and fixpoints so far. I might well add floats eventually, but I intend to hide them well enough not to encourage anyone who isn't motivated.

[0] https://gitlab.com/sifoo/snigl

> (...) until we stop pushing floats as the standard solution for dealing with decimals.

I'm curious, in that sentence, who is "we"? I cannot imagine a scenario where someone would be technically involved enough to be aware of the difference between floating-point and fixed-point decimals, but still decide to use floating point in an accounting environment.

Are you talking about project managers? Architects?

If you Google how to manipulate decimal values, or you type `rate = 1.23`, you're almost always going to get floating point. `1.1 + 2.2 == 3.3` yields false in most commonly used programming languages, with the notable exception of SQL.

That's what it means that we've established floats as the standard. You have to specifically choose not to use floats, look up your language's entirely non-standardized approach for representing exact decimals, and then maintain constant vigilance against entirely intuitive things like decimal literals.
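A quick sketch of both halves of that point, using Python as one concrete example (its `decimal` module is one of the per-language exact-decimal libraries being alluded to; other languages spell this differently):

```python
from decimal import Decimal

# Binary floats cannot represent 1.1, 2.2, or 3.3 exactly,
# so the sum picks up a tiny representation error.
print(1.1 + 2.2 == 3.3)   # False
print(1.1 + 2.2)          # 3.3000000000000003

# The exact-decimal alternative: construct from strings.
print(Decimal("1.1") + Decimal("2.2") == Decimal("3.3"))  # True

# The "constant vigilance" problem: an intuitive decimal literal
# smuggles the float rounding error right back in.
print(Decimal(1.1))  # 1.100000000000000088817841970012523233890533447265625
```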

Sorry, that wasn't very clear.

I knew well enough, but the system was already approaching a million lines of semi-working floating point workarounds by that time. And since it was all my crazy idea, I would have been held responsible for every single rounding error from that point on, had I managed to get the ball rolling at all. Life is too short and too precious.

I was thinking language designers mainly; but user preferences are shaped by habits, which means that anything but floats will take more effort to communicate and popularize.

edit: Observe how popular this comment is for even floating (take that!) the idea. Plenty of people have serious issues when it comes to floats, that much is obvious. I'm guessing it's mostly avoiding change at any cost, so good luck to us all.

Language designers, I would guess.

Do you mean binary fixed point, or decimal? Because for money matters I'll take decimal floating point over binary fixed point.

I strongly prefer decimal fixpoints as well, even though they're slower; that's what I'm doing in Snigl.

Crikey, floating point in accounting software :(

Integers with an implied decimal point are the way to go.
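A minimal sketch of the implied-decimals approach, in Python for illustration: store money as integer cents and only introduce a decimal point when formatting at the edges. The names (`to_cents`, `format_cents`) and the fixed two-place scale are assumptions for this sketch, not from any particular system:

```python
def to_cents(s: str) -> int:
    """Parse a decimal string like '19.99' into integer cents."""
    whole, _, frac = s.partition(".")
    frac = (frac + "00")[:2]                 # pad/truncate to two places
    sign = -1 if whole.startswith("-") else 1
    return int(whole) * 100 + sign * int(frac)

def format_cents(c: int) -> str:
    """Render integer cents back as a decimal string."""
    sign = "-" if c < 0 else ""
    c = abs(c)
    return f"{sign}{c // 100}.{c % 100:02d}"

# Addition is exact integer arithmetic; no float ever appears.
total = to_cents("1.10") + to_cents("2.20")
print(format_cents(total))        # 3.30
print(total == to_cents("3.30"))  # True
```

Multiplication (interest rates, tax) is where the rounding policy has to be decided explicitly, which is arguably the point: the error is visible in the code instead of hidden in the representation.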

I used to tell myself that our company was the exception.

But in 30 years, I have yet to come across a piece of corporate software that's not crap. Established companies, consultants, startups; all crap. Different kinds of crap, it's not all rounding errors; but none of it would survive in most open source projects.

Because quality is only an issue insofar as it increases short-term profits, and that's not a very successful heuristic for writing good software.

