It's O(k * n) where k is the number of digits. It takes log_b(N) digits to represent N distinct integers in base b, so O(k * N) reduces to O(N log N) when there's no bound on the range of keys. It's still pretty useful for sorting data where you know the keys are small integers (say, less than a machine word).
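To make the O(k * n) bound concrete, here's a minimal LSD radix sort sketch (my own illustration, not from the thread): each of the k digit positions gets one stable counting-sort pass over all n elements.

```python
def radix_sort(nums, base=10):
    """Sort non-negative integers with LSD radix sort: one stable
    bucketing pass per digit, so O(k * n) total for k digits."""
    if not nums:
        return nums
    max_val = max(nums)
    exp = 1  # place value of the current digit: 1, base, base^2, ...
    while max_val // exp > 0:
        # Stable counting sort keyed on the digit at place 'exp'.
        buckets = [[] for _ in range(base)]
        for x in nums:
            buckets[(x // exp) % base].append(x)
        nums = [x for bucket in buckets for x in bucket]
        exp *= base
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# -> [2, 24, 45, 66, 75, 90, 170, 802]
```

The loop runs once per digit of the largest key, which is exactly where the k in O(k * n) comes from.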

 Thank you! So if you put a bound on the size of the keys, say 32-bit, does it become linear? Obviously it would be cheating to put a giant number here :)
 Right. Basically, if the key size is bounded, then k becomes a constant and it reduces to O(N).
 Actually, it's useful when 'N' (the number of elements) is much larger than 'k', the number of significant digits in the largest number in the set. If k ∈ Θ(log N), then O(Nk) becomes O(N log N), which is asymptotically no better than any optimal comparison-based sort. However, if k ∈ o(log N), then we get an asymptotically better algorithm.
 Quicksort is O(k*n log n) though since it does O(n log n) comparisons and a comparison is going to be O(k), so radix sort is still asymptotically faster.
 See response to Retric: https://news.ycombinator.com/item?id=5656534 A comparison isn't always O(k). It's O(k) for strings. It's O(1) for machine-word integers. It can be much more than O(k) for user-defined types, e.g. it can be O(k^3) if you have to multiply a matrix out and take the determinant of the product.
 For word-sized integers, k = O(1) so radix sort is O(n) and quicksort is O(n log n). If you have a comparison function that takes O(k^3) then you most likely can't even implement a radix sort for it, so the comparison can't be made. So the fact is that quicksort is a factor O(log n) slower than radix sort. The flip side is that quicksort is more generally applicable to any comparison function whereas radix sort only works for lexicographic sorts. In almost all cases that is exactly what you want, but for example I'm not aware of any way to efficiently sort rational numbers with radix sort.
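The word-sized case above can be sketched directly (my own illustration, assuming 32-bit unsigned keys): splitting each key into four bytes fixes the pass count at k = 4, independent of n, so the whole sort is O(n).

```python
def radix_sort_u32(nums):
    """Sort 32-bit unsigned integers with radix-256 LSD radix sort.
    Exactly 4 stable bucketing passes (one per byte), so k is a
    constant and the running time is O(n)."""
    for shift in (0, 8, 16, 24):  # 4 fixed passes: k = 4
        buckets = [[] for _ in range(256)]
        for x in nums:
            buckets[(x >> shift) & 0xFF].append(x)
        nums = [x for bucket in buckets for x in bucket]
    return nums

print(radix_sort_u32([3, 4294967295, 0, 65536, 256]))
# -> [0, 3, 256, 65536, 4294967295]
```

Note this relies on the keys having a lexicographic byte decomposition, which is exactly the restriction mentioned above; a general comparison function gives radix sort nothing to bucket on.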
