*incompetent - those who think they know something and are not willing to learn anything new
And then what about worst case? Best case? Average case? Which applies to random input? Nearly-sorted input? What's the overhead for a "short sort"? Can (or should) I use this to sort small sequences in a tight-ish loop, or is it only for large sequences?
I was under the illusion that Big-O was always asymptotic complexity (ie worst case) and that other notations (little-o, big-omega, big-theta etc) were used for best/average/etc case. Perhaps I'm wrong, however.
> Which applies to random input? Nearly-sorted input? What's the overhead for a "short sort", can/should I use this to sort small sequences in a tight-ish loop or is it only for large sequences?
Indeed. The details matter.
My favourite example is how a naive linear search can outperform a non-linear search with a better (in O() terms) algorithm when, for example, the linear search can do most of its work in cache: small input sizes, or otherwise regular access patterns that are predictable for the prefetcher.
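To make that concrete, here's a toy sketch of my own (not from any library; Python for brevity, though the cache/prefetch effect really shows up in compiled code). The point is only the shape of the two access patterns:

```python
import bisect

def linear_search(seq, target):
    # Sequential scan: O(n) comparisons, but the access pattern is
    # perfectly predictable, so on real hardware it is cache- and
    # prefetch-friendly.
    for i, item in enumerate(seq):
        if item == target:
            return i
    return -1

def binary_search(sorted_seq, target):
    # O(log n) comparisons, but the probes jump around in memory,
    # which defeats the prefetcher; for small n the constant factors
    # dominate the asymptotic advantage.
    i = bisect.bisect_left(sorted_seq, target)
    if i < len(sorted_seq) and sorted_seq[i] == target:
        return i
    return -1

data = list(range(0, 64, 2))  # small, sorted input
assert linear_search(data, 10) == binary_search(data, 10) == 5
```

Both return the same answer; which one is faster on a 32-element array is a question for a profiler, not for the O() notation.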
No, asymptotic notation only refers to the behaviour of the function (or algorithm) as you take the limit N -> inf, ie as the size of the input grows. Worst/best/average case is a separate axis: each case is its own function of N, and any of them can be described with big-O, big-omega or big-theta. And the nature of the input often changes the behaviour as well.
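A quick way to see the two axes separately (my own toy example): insertion sort is O(n^2) in the worst case but O(n) in the best case, and which one you get depends entirely on the nature of the input, not its size.

```python
def insertion_sort_comparisons(seq):
    """Sort a copy of seq; return (sorted list, number of comparisons)."""
    a = list(seq)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] > a[j]:
                # Element still out of place: swap it one slot left.
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comparisons

n = 100
_, best = insertion_sort_comparisons(list(range(n)))       # already sorted
_, worst = insertion_sort_comparisons(list(range(n, 0, -1)))  # reversed

assert best == n - 1                # best case: ~n comparisons
assert worst == n * (n - 1) // 2    # worst case: ~n^2/2 comparisons
```

Same algorithm, same N, a 50x difference in work done; the asymptotic class of each case is fixed, but the input decides which case you're in.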
I was responding to a comment that I felt was needlessly dismissive. The default sort algorithm of any major language's standard library is usually not worth trying to beat, and calling someone "incompetent" for not reinventing the wheel each time they encounter data that needs to be sorted is absurd. Note: I said "usually", not "never". A competent programmer doesn't fix problems that don't exist, and you would absolutely have to profile code before convincing me that the choice of sorting algorithm was the source of a performance problem.
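And "profile first" is cheap to act on. A minimal sketch of the kind of check I'd want to see (my own hypothetical quicksort standing in for the reinvented wheel; Python's built-in sort is Timsort, a heavily tuned adaptive mergesort):

```python
import random
import timeit

def quicksort(seq):
    # A straightforward recursive quicksort: correct, but unlikely to
    # beat the tuned stdlib sort on typical inputs.
    if len(seq) <= 1:
        return list(seq)
    pivot = seq[len(seq) // 2]
    return (quicksort([x for x in seq if x < pivot])
            + [x for x in seq if x == pivot]
            + quicksort([x for x in seq if x > pivot]))

data = [random.randrange(10_000) for _ in range(5_000)]
assert quicksort(data) == sorted(data)  # at least it's correct

# Numbers vary by machine and input shape; measure before claiming a win.
t_builtin = timeit.timeit(lambda: sorted(data), number=20)
t_custom = timeit.timeit(lambda: quicksort(data), number=20)
```

If the custom sort doesn't win that measurement on your real data, the sorting algorithm wasn't your performance problem.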