
Graph Isomorphism Algorithm Breaks 30-Year Impasse - kercker
http://www.quantamagazine.org/20151214-graph-isomorphism-algorithm/
======
gre
Graph Isomorphism in Quasipolynomial Time:
[http://arxiv.org/abs/1512.03547](http://arxiv.org/abs/1512.03547)
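
For readers new to the problem: graph isomorphism asks whether two graphs are the same up to a relabeling of vertices. A minimal brute-force sketch (hypothetical helper name, O(n!) time, nothing like the paper's method) makes the question concrete:

```python
from itertools import permutations

def are_isomorphic(adj_a, adj_b):
    """Brute-force isomorphism test: try every relabeling of A's
    vertices onto B's and compare edge sets. Factorial time -- only
    usable on tiny graphs, and nothing like Babai's algorithm."""
    if len(adj_a) != len(adj_b):
        return False
    nodes = list(adj_a)
    edges_b = {frozenset((u, v)) for u in adj_b for v in adj_b[u]}
    for perm in permutations(adj_b):
        mapping = dict(zip(nodes, perm))
        edges_a = {frozenset((mapping[u], mapping[v]))
                   for u in adj_a for v in adj_a[u]}
        if edges_a == edges_b:
            return True
    return False

# A triangle is isomorphic to any other 3-cycle, whatever the labels:
triangle = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
cycle = {"x": ["y", "z"], "y": ["x", "z"], "z": ["x", "y"]}
print(are_isomorphic(triangle, cycle))  # → True
```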

------
onetwotree
Look at this asshole, having a finite Erdos number and publishing solo ;-)

~~~
fapjacks
I think I understand the humor, but could you please explain so that I'm
certain?

~~~
ergothus
Your Erdos number is how many co-authorship links you pass through to reach
Erdos himself: his co-authors have number 1, their co-authors 2, and so on.

This author has (apparently) a nice tidy small Erdos number. To publish
without co-authors denies those co-authors the chance to get (his Erdos
number+1).

So clearly he/she is selfish and has the wrong priorities :)
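
In other words, the Erdős number is just shortest-path distance in the co-authorship graph. A toy sketch (invented names, not real co-authorship data):

```python
from collections import deque

# A made-up co-authorship graph; an edge joins two co-authors.
coauthors = {
    "Erdos": ["A", "B"],
    "A": ["Erdos", "C"],
    "B": ["Erdos"],
    "C": ["A", "D"],
    "D": ["C"],
}

def erdos_number(author, graph):
    """Breadth-first search from Erdos; the Erdős number is the
    shortest co-authorship distance (None if unreachable)."""
    dist = {"Erdos": 0}
    queue = deque(["Erdos"])
    while queue:
        person = queue.popleft()
        for co in graph.get(person, []):
            if co not in dist:
                dist[co] = dist[person] + 1
                queue.append(co)
    return dist.get(author)

print(erdos_number("D", coauthors))  # → 3
```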

~~~
s-phi-nl
FWIW, Babai (the author) has a very small Erdos number: 1:

[http://www.ams.org/mathscinet-getitem?mr=584517](http://www.ams.org/mathscinet-getitem?mr=584517)

~~~
deathanatos
> _Babai (the author) has a very small Erdos number_

Let's quit beating around the bush: he has _the smallest possible_ Erdős
number[1]. If you don't presently have an Erdős number, or wish to improve
your Erdős number for bragging rights, co-authoring a paper with him (or
anyone with an equivalent Erdős number) is the best possible outcome.
(Obtaining an Erdős number of one is sadly no longer possible.)

[1]:
[https://en.wikipedia.org/wiki/Erd%C5%91s_number](https://en.wikipedia.org/wiki/Erd%C5%91s_number)

~~~
dietrichepp
Smallest possible for living humans. One person, notably, had a lower number,
but he is no longer with us.

~~~
hyperpallium
It's weird that you're downvoted. Just to clarify:

> Paul Erdős has an Erdős number of zero.

------
nine_k
Previously posted and discussed extensively:

[https://hn.algolia.com/?query=Graph%20Isomorphism&sort=byPop...](https://hn.algolia.com/?query=Graph%20Isomorphism&sort=byPopularity&prefix&page=0&dateRange=all&type=story)

~~~
pervycreeper
This article is worth discussing in itself, as it is very clear, gives a good
sense of the situation, is easy to read and understand, and avoids the kinds
of inaccuracy and solecism usually seen in articles about math.

~~~
acallan
Indeed. And the big news is that the paper is available today.

[http://arxiv.org/abs/1512.03547v1](http://arxiv.org/abs/1512.03547v1)

------
dcohenp
> No one has ever found an efficient algorithm for an NP-complete problem, and
> most computer scientists believe no one ever will.

I find these kinds of assertions somewhat misleading. There _are_ "efficient"
algorithms for, one could say, "pragmatic variations" of NP-complete problems.
Some only work on a subset of cases (perhaps most of the useful ones), others
are non-deterministic (but run them enough times and you get the right answer
with high probability), etc.
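
As one concrete "pragmatic variation": minimum vertex cover is NP-hard, yet a simple greedy sketch (hypothetical helper name) gives a solution at most twice the optimum:

```python
def vertex_cover_2approx(edges):
    """Greedy 2-approximation for minimum vertex cover: repeatedly
    take both endpoints of any still-uncovered edge. The result is at
    most twice the size of an optimal cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Square graph 1-2, 1-3, 2-4, 3-4: optimal cover is {1, 4} (size 2);
# the greedy answer is guaranteed to be no larger than 4.
edges = [(1, 2), (1, 3), (2, 4), (3, 4)]
cover = vertex_cover_2approx(edges)
print(all(u in cover or v in cover for u, v in edges))  # → True
```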

~~~
username223
This. MJD has a great bit on blanket statements about "NP-complete means
hard":

    http://perl.plover.com/yak/12views/samples/notes.html#sl-24

Basically, you can often either settle for slightly suboptimal solutions, or
ignore a few pathological worst cases. That said, I suspect that P != NP, and
thought this was a good article.
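
To illustrate the "slightly suboptimal solutions" option: set cover is NP-hard, but the classic greedy heuristic gets within a ln(n) factor of optimal. A minimal sketch:

```python
def greedy_set_cover(universe, subsets):
    """Greedy heuristic for NP-hard set cover: repeatedly take the
    subset covering the most still-uncovered elements. The cover found
    is within a ln(n) factor of the optimal one."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & set(s)))
        if not uncovered & set(best):
            break  # remaining elements cannot be covered at all
        chosen.append(best)
        uncovered -= set(best)
    return chosen

universe = {1, 2, 3, 4, 5}
subsets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]
cover = greedy_set_cover(universe, subsets)
print(set().union(*cover) >= universe)  # → True
```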

------
raverbashing
Ok, so the question is: while this is theoretically important work, is it
really worth having a practical implementation?

(I remember something similar happening with "Primality testing is in P", but
the existing non-P algorithms were good enough that people wouldn't bother
using the new one except in specific situations.)

~~~
kachnuv_ocasek
No. As the author notes in the preprint, there are already numerous very fast,
practical algorithms. This one is only asymptotically faster, and by itself it
would probably be slower in real-world applications.

~~~
jsprogrammer
If it is asymptotically faster, then it is implied that there are problems the
faster one can solve that the other cannot, assuming finite time.

~~~
j2kun
The claims you're making in this subthread are not precise enough to be
correct. If you only allow a constant finite amount of time to solve a given
problem, then asymptotics guarantee nothing: an O(1)-time algorithm could
solve fewer instances than an O(2^2^n)-time algorithm. It is also not true
that the asymptotically faster algorithm "can solve more instances as the
allocated time increases." These statements are only true if you add the
phrase "sufficiently large" to them. So asymptotically faster algorithms can
solve more instances as the allocated time bound increases, _provided that the
time bound is sufficiently large._ But that "sufficiently large" threshold can
be arbitrarily large, so you can't say that Babai's algorithm is practical
without either precisely studying the constants involved in the algorithm
(which Babai did to some extent, but omitted in the notation for clarity), or
doing empirical analysis, which is unlikely to ever happen with Babai's
algorithm.
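
The point about constants can be made concrete with made-up step counts: below the crossover point, the asymptotically faster algorithm loses.

```python
# Hypothetical step counts: alg_fast is asymptotically better (O(n))
# but carries a huge constant; alg_slow is O(n^2) with constant 1.
def steps_fast(n):
    return 1_000_000 * n   # O(n), big constant factor

def steps_slow(n):
    return n * n           # O(n^2), small constant factor

# Below the crossover (n = 10^6) the "slower" algorithm wins:
print(steps_slow(1000) < steps_fast(1000))    # → True
# Only for sufficiently large n does the asymptotics pay off:
print(steps_fast(10**9) < steps_slow(10**9))  # → True
```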

~~~
jsprogrammer
The statement is correct.

"Sufficiently large" is implied and also not strictly necessary for the
statement to be correct.

~~~
j2kun
If it's not required then it's not implied, and unfortunately it is required.
If you want to keep having serious discussions about this stuff I suggest you
go read some more about algorithms.

~~~
jsprogrammer
You should be able to simply state the error in my statement then. Anyway,
others disagree with you.

------
LesZedCB
Is somebody able to explain in more detail what the "Greater metropolitan
area" of P space means? I have a CS undergrad degree so I get what polynomial
space is, just not sure about the quasi-p-space part.

~~~
apetresc
Quasi-polynomial time is slower than polynomial time, but (much) faster than
exponential time. Basically, if polynomial is O(n^c) and exponential is
O(c^n), then quasi-polynomial is O(2^((log n)^c)); that is, n _does_ show up
in the exponent, but only as some power of the log of n.

In theoretical terms, it's much closer to a polynomial problem than an
exponential one.
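
A quick numeric sanity check (illustrative step counts only, with c = 2 in both exponents) shows quasi-polynomial sitting strictly between the two:

```python
import math

# Rough step counts for input size n = 1024, c = 2:
n, c = 1024, 2
poly = n ** c                        # n^c         = 2^20
quasi = 2 ** int(math.log2(n) ** c)  # 2^((log n)^c) = 2^100
expo = 2 ** n                        # 2^n         = 2^1024

print(poly < quasi < expo)  # → True
```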

~~~
LesZedCB
Gotcha, that makes perfect sense, thanks for the explanation!

------
superuser2
I very nearly took Algorithms from this guy. Toughest professor (and grader)
in the CS program, by far. He's quite a character, though.

------
Devid2014
What happened with this one?

Polynomial Time Algorithm for Graph Isomorphism Testing.
[http://mt2.comtv.ru/](http://mt2.comtv.ru/)

Is this a hoax, or does it have some substance?

~~~
sold
Earlier version was wrong
[https://cstheory.stackexchange.com/questions/1064/polynomial-time-algorithm-for-graph-isomorphism-testing](https://cstheory.stackexchange.com/questions/1064/polynomial-time-algorithm-for-graph-isomorphism-testing),
I doubt the current version has any substance.

------
Zach_the_Lizard
How long until we start seeing this algorithm in Google interviews?

~~~
Ar-Curunir
Very long, most CS people don't know any group theory.

~~~
wtallis
Just a week or two ago I saw someone get downvoted for suggesting that many CS
degrees don't include real analysis. I suspect that by now undergrad CS
programs are all over the place, and I wouldn't dare make broad
generalizations about the math content of a CS degree anymore.

~~~
j2kun
I would, because I'm on the inside. There are even many CS degrees that don't
require a course in algorithms. You have to remember that most CS degrees come
from public/state universities, not Stanford and MIT. And CS students at
public/state universities are largely mathphobic.

~~~
pm90
That's a very broad generalization. There are many public/state universities
with amazing CS research and programs (e.g. UC Berkeley) that are very math-
heavy.

~~~
Ar-Curunir
As a student at UC Berkeley, students need to study at most linear algebra and
multivariable calc to major in CS/EECS.

~~~
jsprogrammer
No Boolean algebra, probability, or statistics?

------
jgn
Thanks for posting this. In the future, could you please add the algorithm to
the title? This feels only a couple of steps removed from clickbait. No offense
intended; it's just a suggestion.

~~~
tragic
> Babai has declined to speak to the press, writing in an email: “The
> integrity of science requires that new results be subjected to thorough
> review by expert colleagues before the results are publicized in the media.”

Quoth the media ... Apparently unironically.

~~~
grrowl
I'm okay with Quanta Magazine leaving out the irony; I can imagine the
Engadgets and Buzzfeeds of the world would omit his response and draw their
own wild conclusions regardless.

