
I'm a bit sceptical because I don't see any mathematical proof, only benchmarks, which don't prove much. It may be faster only by a constant factor on a particular machine; asymptotically, for sufficiently large n, it would be no faster than mergesort.

1,000,000 elements is peanuts. We need to see how it behaves on billions of numbers or more.

No one is claiming that this sort algorithm (or any other) is asymptotically faster than O(n log(n)). It can be mathematically proved that no sort algorithm can improve on this. The object with sort algorithms is to find one with good constant-factor performance on typical inputs.

Frankly, I am more persuaded by the arguments in favor of an algorithm like Timsort, which doesn't claim to micro-optimize hardware more efficiently (that's basically what "better swapping" is claiming), but instead claims to be particularly efficient on some commonly-seen patterns (like "partially sorted", or "pieces of the list are reverse-sorted"). Of course, any kind of argument about why one sort is better than another is inferior to actual benchmarks, as in the sketch below.
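For what it's worth, this kind of effect is easy to see with Python's built-in sort, which is Timsort. A minimal benchmark sketch (the sizes and the 1-in-10,000 perturbation are just made-up illustration values):

  import random
  import time

  def time_sort(data):
      start = time.perf_counter()
      sorted(data)
      return time.perf_counter() - start

  n = 1_000_000
  random_data = [random.random() for _ in range(n)]

  # Mostly sorted input: sort it, then perturb one element in every 10,000.
  mostly_sorted = sorted(random_data)
  num_perturbed = len(mostly_sorted[::10_000])
  mostly_sorted[::10_000] = [random.random() for _ in range(num_perturbed)]

  print("random input:       ", time_sort(random_data))
  print("mostly sorted input:", time_sort(mostly_sorted))

On the mostly-sorted input, Timsort finds long already-sorted runs and typically finishes much faster than on the random input, even though both are O(n log n) in the worst case.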


To be precise, we can prove that no comparison-based sort algorithm can do better than O(n log(n)) comparisons. There are sorts that do better than that, but they are not based on comparisons (e.g. counting sort, radix sort).
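A minimal counting sort sketch, assuming integer keys in the range [0, k): it runs in O(n + k) time and never compares two elements against each other, so the comparison lower bound doesn't apply.

  def counting_sort(values, k):
      counts = [0] * k
      for v in values:              # count occurrences of each key
          counts[v] += 1
      out = []
      for key, count in enumerate(counts):
          out.extend([key] * count)  # emit each key as often as it appeared
      return out

  print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], k=10))
  # [1, 1, 2, 3, 4, 5, 6, 9]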

I'm quite a noob, and this is a bit off-topic, but if it's mathematically proven that no sorting algorithm can be faster than O(n*log(n)), and we know for sure that checking whether an array is sorted is O(n), doesn't that prove that P ≠ NP?

No it doesn't, since O(n*log(n)) is in P too. P = NP wouldn't require that finding a solution has exactly the same complexity as verifying it, just that both are polynomial.
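To make the gap concrete, here is a small sketch: verifying sortedness is a linear scan, while producing a sorted order takes O(n log n); both are comfortably polynomial, so nothing here separates P from NP.

  def is_sorted(values):
      # O(n): one pass comparing each adjacent pair
      return all(a <= b for a, b in zip(values, values[1:]))

  data = [5, 2, 8, 1]
  print(is_sorted(data))          # False  (linear check)
  print(is_sorted(sorted(data)))  # True   (sorting itself was O(n log n))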


