Hacker News

log^2/3 might be the weirdest component I’ve ever seen in a complexity formula.




I'm continually amazed by the asymptotic complexity of union-find, which is O(alpha(n)) amortized per operation, where alpha is the inverse of the Ackermann function and n is the number of elements. In other words, effectively constant: at most 4 or so as long as your inputs fit into the observable universe.
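The structure behind that bound is simple to write down. Here's a minimal sketch of union-find with the two standard optimizations (path compression and union by rank) that together give the O(alpha(n)) amortized cost; the class name and method shapes are my own illustration, not from the thread:

```python
class UnionFind:
    """Disjoint-set forest with path compression and union by rank."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Walk up to the root, then point every visited node directly
        # at it (path compression).
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True


uf = UnionFind(10)
uf.union(0, 1)
uf.union(1, 2)
print(uf.find(2) == uf.find(0))  # True: 0, 1, 2 are now one set
print(uf.find(3) == uf.find(0))  # False: 3 is still on its own
```

Either optimization alone already gets you O(log n); it's the combination that drives the amortized cost down to inverse-Ackermann.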

There's definitely a divide in who encounters which sorts of algorithms. The subject of this article is in graph-theory space, which a lot of us pick up even without trying (I dabbled in TSP for a while because it's a difficult distributed-programming problem and I wanted to explore the space for that reason).

But if you're not implementing AI or game engines, some of the linear algebra space may be a road less traveled.


I still think matrix multiplication's O(n^2.371339) is super weird.

Matrix multiplication definitely should be O(n^(2+o(1))).

For about a decade, integer multiplication stood at n·log(n)·4^(log*(n)), where log* is the iterated logarithm.
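For a sense of how slowly the iterated logarithm grows, here's a quick sketch (the helper name `log_star` is my own): log* counts how many times you can take the log before the value drops to 1 or below.

```python
import math


def log_star(x):
    """Iterated base-2 logarithm: number of times log2 must be
    applied before the value drops to <= 1."""
    count = 0
    while x > 1:
        x = math.log2(x)
        count += 1
    return count


print(log_star(2**16))     # 4  (65536 -> 16 -> 4 -> 2 -> 1)
print(log_star(2**65536))  # 5  (a 20,000-digit number, and log* is still 5)
```

So the 4^(log*(n)) factor is bounded by a small constant for any input you could ever materialize, which is why it took serious effort to remove it from the bound at all.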

Also, the currently best factorization algorithm (GNFS) runs in exp(k·(log n)^(1/3)·(log log n)^(2/3)) for a constant k.
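A back-of-the-envelope sketch of what that exponent looks like in practice (the function name `gnfs_cost_bits` is my own, and this ignores the o(1) term in the exponent; the heuristic constant is k = (64/9)^(1/3) ≈ 1.923):

```python
import math


def gnfs_cost_bits(bits):
    """Exponent of e in the heuristic GNFS running time
    exp(k * (ln n)^(1/3) * (ln ln n)^(2/3)) for n = 2^bits,
    with the o(1) term dropped."""
    ln_n = bits * math.log(2)  # ln(2^bits)
    k = (64 / 9) ** (1 / 3)
    return k * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)


# Doubling the modulus size from 512 to 1024 bits grows the exponent
# by only ~1.36x, not 2x: sub-exponential, but far from polynomial.
print(gnfs_cost_bits(512))
print(gnfs_cost_bits(1024))
print(gnfs_cost_bits(1024) / gnfs_cost_bits(512))
```

That cube-root-of-log shape is exactly where the weird (log n)^(1/3)·(log log n)^(2/3) combination comes from: it's the optimal balance point in the L-notation trade-off between sieving and linear algebra.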

Intro algorithms classes just tend to stay away from the really cursed runtimes since the normal ones are enough to traumatize the undergrads.


Are BigInteger multiplications at n·log(n)² now, or do they still have weird terms in them?

Down to n·log n (Harvey and van der Hoeven, 2019).

Hey, the asterisks in your reply got read as formatting, so it ended up messed up.

oops. fixed.


