The Reinhart-Rogoff issue was technically an error in Excel, but also an error by the authors for not actually verifying the results before publishing. It didn't hurt that their particular biases were in line with the results.
The technical problem can be addressed with more warnings and safeguards, but they are meaningless if no one uses them.
I hadn't previously read up on the RR issue. But after some surface-level research, I wouldn't say Excel was the issue. It sounds like the tool did exactly what they programmed it to do. It looks like human error, or choices they made to arrive at the conclusion they wanted; whether that was deliberate seems to be speculation (or true, I only scratched the surface).
> While using RR's working spreadsheet, we identified coding errors, selective exclusion of available data, and unconventional weighting of summary statistics. [0]
I'm not a fan of tools giving warnings for these types of "coding errors". A warning I do think would be nice is where the math just doesn't work as expected. The recent floating point discussion [1] seems relevant: floating point is just not very intuitive, and as a programmer you need a fairly deep level of understanding to know that the resulting math is likely not exact. But it also affects nearly every programming language and is not a quirk of one specific tool.
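To illustrate the kind of unintuitive behavior I mean (a minimal sketch in Python, though the same thing happens in basically any language using IEEE-754 doubles):

```python
import math

# 0.1 and 0.2 have no exact binary representation, so their sum
# is not exactly 0.3 in double-precision floating point.
total = 0.1 + 0.2
print(total)           # prints 0.30000000000000004
print(total == 0.3)    # prints False

# The usual workaround is to compare with a tolerance
# rather than exact equality.
print(math.isclose(total, 0.3))  # prints True
```

No tool warned the programmer here; the hardware did exactly what it was asked to, which is my point about "Excel errors" generally.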
I'd be interested to read more if you have info outlining the actual error within Excel. If there is some 2+2=5 situation, I'd like to learn about it. I feel like every time someone says "Excel error", it's actually "human error". It would be like calling every car accident a vehicle malfunction when we all know it's most likely an operator issue.