
What is O(n log n)? Time complexity? What about the memory requirement? Last month I reduced the memory requirement from 16 GB to 100 MB in one implementation of a "stable&fast" algorithm. Really, using a function just because it's O(n log n) without understanding its characteristics is like a blind surgeon randomly amputating limbs because maybe it will help. But that's fine by me, it means more work for me: fixing software bugs made by incompetent* programmers is quite nice.

*incompetent - those who think they know something and are not willing to learn anything new




> What is O(n log n)? Time complexity?

And then worst case? Best case? Average case? Which applies to random input? Nearly-sorted input? What's the overhead for a "short sort"? Can/should I use this to sort small sequences in a tight-ish loop, or is it only for large sequences?


> And then worst case? Best case? Average case?

I was under the illusion that Big-O was always asymptotic complexity (i.e. worst case) and that other notations (little-o, big-omega, big-theta etc.) were used for best/average/etc case. Perhaps I'm wrong, however.

> Which applies to random input? Nearly-sorted input? What's the overhead for a "short sort"? Can/should I use this to sort small sequences in a tight-ish loop, or is it only for large sequences?

Indeed. The details matter.

My favourite example is how a naive linear search can perform better than a search with better O() complexity if, for example, the linear search can do most of its work in cache (e.g. small input sizes, or otherwise regular access patterns that the prefetcher can predict).
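
Roughly what I mean, as a toy C sketch (the names and the crossover point where linear wins are illustrative; where it actually sits depends entirely on your data and hardware):

    #include <stddef.h>

    /* O(n) comparisons, but the access pattern is sequential, so the
       hardware prefetcher keeps the data streaming through cache. */
    int linear_search(const int *a, size_t n, int key) {
        for (size_t i = 0; i < n; i++)
            if (a[i] == key)
                return (int)i;
        return -1;
    }

    /* O(log n) comparisons on a sorted array, but the accesses jump
       around and the branches are hard to predict; for small n the
       "worse" linear version often wins in wall-clock time. */
    int binary_search(const int *a, size_t n, int key) {
        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid;
            else
                return (int)mid;
        }
        return -1;
    }

    int main(void) {
        int a[] = {2, 3, 5, 7, 11, 13};
        size_t n = sizeof a / sizeof a[0];
        return linear_search(a, n, 7) == binary_search(a, n, 7) ? 0 : 1;
    }

The only honest answer to "which is faster" is to benchmark both on the sizes you actually care about.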


> I was under the illusion that Big-O was always asymptotic complexity (i.e. worst case) and that other notations (little-o, big-omega, big-theta etc.) were used for best/average/etc case. Perhaps I'm wrong, however.

No, asymptotic notation only describes the behaviour of the function (or algorithm) as you take the limit N -> inf, i.e. as the size of the input grows. But the nature of the input often changes the behaviour as well.


Thanks for the correction!


You can have Big-O of the worst case and Big-O of the average case. "Asymptotic complexity" itself does not imply worst case; the assumptions you make when constructing the cost function determine whether you are bounding the worst case or the average case.
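
A concrete illustration (a toy C sketch, not tied to anything upthread): insertion sort has one Big-O for its worst case and a different one for its best case, and both are asymptotic statements about the same code.

    #include <stddef.h>

    /* The outer loop always runs n-1 times; the inner loop's work depends
       on the input:
       - reverse-sorted input: the inner loop runs i times  -> O(n^2) worst case
       - already-sorted input: the inner loop exits at once -> O(n)    best case
       Same function, different case being bounded. */
    void insertion_sort(int *a, size_t n) {
        for (size_t i = 1; i < n; i++) {
            int key = a[i];
            size_t j = i;
            while (j > 0 && a[j - 1] > key) {
                a[j] = a[j - 1];
                j--;
            }
            a[j] = key;
        }
    }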


Why do you assume the commenter doesn't understand the time and memory characteristics of quicksort?


Nobody mentioned quicksort. You're assuming quicksort, which is what the issue is. Many languages don't actually use quicksort, fyi. The practical performance considerations of just quicksort itself also change depending on how you've implemented the small details - are you moving memory around, or just changing pointers? Huge difference in the real world, no difference in the algorithm.
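
For example (a C sketch; the Record type and the 4 KB payload are made up for illustration): both versions make the same comparisons, but one shuffles whole records around while the other only shuffles pointers.

    #include <stdlib.h>

    typedef struct {
        int  key;
        char payload[4096];   /* arbitrary large payload */
    } Record;

    static int cmp_record(const void *a, const void *b) {
        const Record *ra = a, *rb = b;
        return (ra->key > rb->key) - (ra->key < rb->key);
    }

    static int cmp_record_ptr(const void *a, const void *b) {
        const Record *ra = *(const Record *const *)a;
        const Record *rb = *(const Record *const *)b;
        return (ra->key > rb->key) - (ra->key < rb->key);
    }

    /* Same asymptotic complexity, but every element swap moves ~4 KB. */
    void sort_by_value(Record *recs, size_t n) {
        qsort(recs, n, sizeof(Record), cmp_record);
    }

    /* Same comparisons, but only pointer-sized moves; the records stay put. */
    void sort_by_pointer(Record **ptrs, size_t n) {
        qsort(ptrs, n, sizeof(Record *), cmp_record_ptr);
    }

Identical in O() terms, very different in memory traffic.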


I oversimplified my response and it lost the point I was making. You are certainly correct that many languages don't use quicksort. My bad.

I was responding to a comment that I felt was needlessly dismissive. The default sort algorithm of any major language's standard library is usually not worth trying to beat, and calling someone "incompetent" for not reinventing the wheel each time they encounter data that needs to be sorted is absurd. Note: I said "usually", not "never". A competent programmer doesn't fix problems that don't exist, and you would absolutely have to profile code before convincing me that the choice of sorting algorithm was the source of a performance problem.

As far as the details go... In only a few languages (e.g. C) might you really even consider moving memory around rather than pointers. The semantics of the standard library sort in Java, C#, Ruby, Python, JavaScript... are to sort an array of primitives by value or an array of objects by reference.


There is no quicksort mentioned.



