
Computer scientists need to learn about significant digits - joeyespo
http://lemire.me/blog/archives/2012/04/20/computer-scientists-need-to-learn-about-significant-digits/
======
spullara
I thought this article was going to be about a number type that supports
significant figures and propagates them through all calculations
appropriately. That would be pretty useful. Instead it seems to be a complaint
about benchmarks, which is valid, but not obviously in the domain of all
"computer scientists" so much as of people who run micro-benchmarks and report
the results without care.

------
Yarnage
Not sure what the point here is; this is all relative.

Sure, .14 MB isn't a big deal, but if you're uploading 2.14 TB of data, I need
to know about that .14.

~~~
pan69
I think he's trying to explain that .14 on 300.14 is negligible, whereas on
2.14 it might mean a lot.

However, I do not fully agree with his statement either. If something takes
300.14 ms, the .14 is a pretty useless detail on its own. But if I run this
operation in a loop a million times, it suddenly adds up to a lot (quick
arithmetic below). Knowing it was 300.14 was definitely worthwhile.

I guess it's all about context.
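
Spelling that out (the million iterations is the figure from the comment; the
rest is arithmetic):

    0.14 ms x 1,000,000 iterations = 140,000 ms = 140 seconds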

~~~
wtcurtis
It's not a question of whether that .14 is negligible. His point is that it's
literally meaningless.

Say you're benchmarking some routine. One particular measurement may be 300.14
ms. The next may be, say, 302.56 ms. The one after that, 297.12 ms. You have
no way of knowing whether that .14 ms is due to the implementation of the
routine, or whether it's simply noise in the measurement.
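
A minimal sketch of measuring with that noise in mind (the workload and the
repeat count here are placeholders of my own): collect many samples and report
only the digits the spread supports.

    import statistics
    import timeit

    def routine():
        # Hypothetical workload standing in for the code under test.
        sum(i * i for i in range(100_000))

    # Take many independent timings rather than a single number.
    samples = [timeit.timeit(routine, number=1) * 1000  # milliseconds
               for _ in range(30)]

    mean = statistics.mean(samples)
    spread = statistics.stdev(samples)

    # If the spread is a few ms, digits below the millisecond (like
    # that .14) are inside the noise and should not be reported.
    print(f"{mean:.0f} +/- {spread:.0f} ms")

Reporting something like 300 +/- 3 ms tells you more than 300.14 ms ever
could.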

