
> What if your manager asked you for an estimate of how long it would take to load a million new records into the database and index them? It would be nice to know whether you'd have to take the system down for an hour or for a day or for a week.

I don't think I really buy this as a practical use case. Unless I was running on a system I understood very well (e.g. an embedded platform), I wouldn't trust estimates calculated from Big-O to be reliable on atypical workloads. I'd pretty much expect unforeseen caches, optimizations, or swapping to disk to confound my estimates somehow. If at all possible I'd test it. If testing takes too long to be practical, there's a more serious problem with the system. I can see cases where testing is impossible for good, practical reasons, just not enough of them to justify the obsession with Big-O.
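To make the "just test it" point concrete, here's a minimal sketch of what I mean: time a small sample load against an indexed table and extrapolate, rather than reasoning from Big-O alone. This uses an in-memory SQLite table as a stand-in for the real database (the table name, schema, and sample size are all made up for illustration):

```python
import sqlite3
import time

def estimate_total_seconds(total_rows, sample_rows=10_000):
    """Time a small sample load and extrapolate linearly to total_rows.

    Linear extrapolation is itself a simplification: index maintenance
    is typically O(n log n), and caches or spills to disk can bend the
    curve. That is exactly why a larger pilot run on the real system
    beats a pencil-and-paper Big-O estimate.
    """
    conn = sqlite3.connect(":memory:")  # stand-in for the real database
    conn.execute("CREATE TABLE records (id INTEGER, payload TEXT)")
    conn.execute("CREATE INDEX idx_payload ON records (payload)")

    rows = [(i, f"payload-{i}") for i in range(sample_rows)]
    start = time.perf_counter()
    conn.executemany("INSERT INTO records VALUES (?, ?)", rows)
    conn.commit()
    elapsed = time.perf_counter() - start
    conn.close()

    return elapsed * (total_rows / sample_rows)

print(f"rough estimate for 1M rows: {estimate_total_seconds(1_000_000):.1f}s")
```

Even a crude measurement like this bakes in the constant factors and platform quirks that a pure asymptotic argument throws away.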
