
That's cool. I didn't realize that those languages used arbitrary-precision integers by default. I know that many languages offer a bigint type, but to me the difference between having bigints and having them be the default seems significant. For instance, in JavaScript the `n` notation and the fact that they're called `bigint` (and not just `int`) mean they will get used very rarely.



Yeah, I absolutely agree. The saddest cases for me are languages like Java, which comes with working arbitrary-precision integers in its standard library, but using them feels so second-class compared to the "primitive" types like `int`. You need to import the class; initializing a value looks like `var a = new BigInteger("39")`; they don't support normal math operators, so you need to do `a.add(new BigInteger("3"))` just for a simple addition; and so on.
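
For contrast, here's a minimal sketch of the same addition with `int` versus `BigInteger` (the class and variable names are just for illustration):

    import java.math.BigInteger;

    public class AdditionDemo {
        public static void main(String[] args) {
            // Primitive int: literal syntax and built-in operators.
            int a = 39;
            int sum = a + 3;

            // BigInteger: explicit construction and method calls for arithmetic.
            BigInteger b = new BigInteger("39");
            BigInteger bigSum = b.add(BigInteger.valueOf(3));

            System.out.println(sum);     // 42
            System.out.println(bigSum);  // 42
        }
    }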

It's difficult to argue that arbitrary-precision integers are "simpler" (because they don't have an overflow edge case on almost every operation) when they are so much more inconvenient to use.
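
To make that overflow edge case concrete, a small sketch (class name again illustrative) of how `int` silently wraps around where `BigInteger` just keeps going:

    import java.math.BigInteger;

    public class OverflowDemo {
        public static void main(String[] args) {
            // Primitive int has a fixed width and silently wraps on overflow.
            int x = Integer.MAX_VALUE;                  // 2147483647
            System.out.println(x + 1);                  // -2147483648 (wrapped)

            // BigInteger has no fixed width, so the same addition just works.
            BigInteger y = BigInteger.valueOf(Integer.MAX_VALUE);
            System.out.println(y.add(BigInteger.ONE));  // 2147483648
        }
    }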



