Rust is an interesting case. It's intended as a systems programming language, so performance is considered important. I guess I can't say flatly that it should have arbitrary-precision integers by default, though maybe it could. But at least it reminds you, every time you write an integer type, that it's bounded, the length being right there in the type name ("i32", "u64", etc.)
If overflow can be avoided, it makes no sense to use slow, wasteful bigints.
Yes, this is slower than ordinary fixed-width integer arithmetic (a plain add is a single instruction, e.g. `add %rax, %rbx`), since a couple of conditional branches are required, but it's nowhere near the price of heap-allocating every integer.
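A minimal sketch of what those branches look like in Rust; the helper name `add_checked` and the choice to panic on overflow (rather than wrap or saturate) are my assumptions, but `i64::checked_add` is the standard-library primitive, and it compiles to the plain add plus a branch on the overflow flag:

```rust
// Hypothetical helper: overflow-checked addition without heap allocation.
// checked_add returns None on overflow, so this is just an add plus a
// conditional branch, not a bigint.
fn add_checked(a: i64, b: i64) -> i64 {
    a.checked_add(b).expect("integer overflow")
}

fn main() {
    println!("{}", add_checked(2, 3)); // 5
    // add_checked(i64::MAX, 1) would panic instead of silently wrapping.
}
```

In release builds plain `+` wraps silently by default, which is exactly why the explicit checked form, cheap as it is, has to be opted into.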
This article makes a false comparison. When software is part of safety, such as in aircraft, it gets the same level of scrutiny.
Safety? For the most part, yes. Security? Not so much.
There's no one-size-fits-all. Speed and accuracy tend to be inversely correlated.
His criticism still applies to today's projects: recently I've seen fixes in ImageMagick done as minimal "patches".
I just inherited a system that uses trim everywhere on everything because originally they didn't sanitise the inputs.
Rather than hunt down all the inputs, they just did `trim(foo) == trim(bar)`. Making it worse, it's PHP, so not only do they trim things, they type-coerce them too; good luck trying to reason about that 7 layers deep :|
Speed without accuracy is of limited use, but the bigger problem is when you get neither.