This is something I find tremendously useful in programming, but at the same time find a lot of other developers amazed when it's used.
I don't care if the dataset in memory is 553MB or 632MB - what I really need to know is whether it's "a few tens of MB", "a few hundreds of MB", or "a few thousand MB".
I don't care if the API server can service 7321 simultaneous requests or 6578 - I just need to know if it's "a few hundred", "a few thousand", or "a few tens of thousands".
You can solve an enormous number of engineering and architecture problems with a reliable order-of-magnitude estimate - at the very least you can quickly exclude solutions that are vastly under (or over) provisioned for the problem you're trying to solve.
A good order-of-magnitude estimate is also a great error check for a more detailed calculation: if my quick estimate said "5000-ish plus or minus 50%", and your calculation says "24,152", one of us has got something wrong.
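That sanity check is trivial to express in code. A minimal sketch (the function names and the 50% tolerance are my own choices, not any standard library's):

```python
import math

def order_of_magnitude(x):
    """Nearest power of ten below x: 553 and 632 both map to 2 (hundreds)."""
    return math.floor(math.log10(x))

def roughly_agrees(estimate, detailed, tolerance=0.5):
    """True if the detailed figure sits within ±tolerance of the estimate."""
    return abs(detailed - estimate) <= tolerance * estimate

# 553MB and 632MB are the same answer at this resolution: "a few hundred MB"
assert order_of_magnitude(553) == order_of_magnitude(632) == 2

# "5000-ish plus or minus 50%" vs a detailed answer of 24,152:
# the check fails, so one of the two calculations has a mistake in it.
assert not roughly_agrees(5000, 24152)
```

The point isn't the code itself, it's that an estimate coarse enough to fit in your head is still precise enough to catch a 5x error in a careful calculation.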