Of course; I'm just not clear how Big-O helps with that. To me, the required insights seem to be:
* Things are slower when the computer has to do more work
* A feel for what strategies result in the computer doing more work
I can't think of many cases where saying "This algorithm is O(N^2)" is significantly more useful than saying "This algorithm is going to be really slow".
In a web dev context, load testing. I appreciate that that's not so easy if your program takes a week to run.
> It is very useful to be able to time some code on a small input set, then extrapolate based on knowledge of time complexity to find out roughly how long the full input will take. If you don't know the time complexity, this is impossible.
I suppose my confusion is that with code I've written, I think I generally understand the time complexity without having to use Big-O notation. I can sort of see that there are times you might want to tell a colleague something like "careful, this code I wrote is O(n^3)", but I'm wondering what kind of job entails doing that on a regular basis.
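For what it's worth, here's a rough sketch of the extrapolation the quoted comment describes. The workload, the input sizes, and the assumption that the routine is O(n^2) are all made up purely for illustration:

```python
import time

def quadratic_work(n):
    # Stand-in for some O(n^2) routine (hypothetical workload).
    total = 0
    for i in range(n):
        for j in range(n):
            total += i ^ j
    return total

def measure(f, n):
    # Wall-clock one call of f on an input of size n.
    start = time.perf_counter()
    f(n)
    return time.perf_counter() - start

small_n, full_n = 2_000, 100_000
t_small = measure(quadratic_work, small_n)

# Trusting the O(n^2) shape, scale the measured time by (full_n / small_n)^2.
estimate = t_small * (full_n / small_n) ** 2
print(f"n={small_n}: measured {t_small:.2f}s")
print(f"n={full_n}: estimated ~{estimate / 60:.0f} minutes")
```

The point is just that once you trust the complexity class, one cheap measurement is enough to predict whether the full run takes minutes or days.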
But to know that it will be slow, you have to know how it scales with the input size. The list example may work just fine while the website has little data (oftentimes an O(N^2) algorithm is even faster than an O(N) algorithm on small inputs) and then bite you when you scale up. O-notation gives you a tool to estimate how the work scales. Any alternative to O-notation is either more complex (look up Analytic Combinatorics http://ac.cs.princeton.edu/home/) or just an informal treatment of O-notation.
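You can see that crossover for yourself with something as mundane as membership testing, a linear scan through a list vs. a hash set lookup (the sizes and probe counts below are arbitrary):

```python
import random
import timeit

def count_hits_list(items, probes):
    # Linear scan per probe: O(len(items)) each.
    return sum(p in items for p in probes)

def count_hits_set(item_set, probes):
    # Hash lookup per probe: O(1) on average.
    return sum(p in item_set for p in probes)

probes = [random.randrange(1_000_000) for _ in range(500)]
for n in (10, 1_000, 50_000):
    items = list(range(n))
    item_set = set(items)
    t_list = timeit.timeit(lambda: count_hits_list(items, probes), number=5)
    t_set = timeit.timeit(lambda: count_hits_set(item_set, probes), number=5)
    print(f"n={n:>6}  list scan: {t_list:.4f}s  set lookup: {t_set:.4f}s")
```

On tiny collections the two are in the same ballpark (the scan can even win, since there's no hashing overhead), but the list version grows with n while the set version stays flat, which is exactly what the O-notation is telling you.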