
Graph Neural Networks – An Overview - sergioskar
https://theaisummer.com/Graph_Neural_Networks/
======
asterisk_
The article seems to be a bit light on details for an "overview" of GNNs.

It's an area I've recently been researching and they do seem to be gaining a
significant amount of traction. If anyone is interested in additional reading
material, I can suggest the very recent GNNs: Models and Applications (slide
deck available on the website) [0].

There is also a fairly comprehensive GitHub repo at [1], though I personally
haven't given it a detailed look yet.

[0] [http://cse.msu.edu/~mayao4/tutorials/aaai2020/](http://cse.msu.edu/~mayao4/tutorials/aaai2020/)

[1] [https://github.com/benedekrozemberczki/awesome-graph-classification](https://github.com/benedekrozemberczki/awesome-graph-classification)

~~~
sergioskar
Yeah, you're probably right. I'm always puzzled about how much math to include
and how deep to go in an introductory article. Thanks for your feedback.

------
nurettin
Has anyone else, the first time they were introduced to neural networks some
decades ago, noticed the connected layers and wondered whether a more
generalized topology such as an arbitrary directed graph could be applied
instead? And then realized they couldn't have been the first to notice,
concluded that the layered topology must therefore have some mathematical
superiority over the generalized form, but never found a concrete answer as to
why?

~~~
dnautics
> therefore the layered topology must have some mathematical superiority

Isn't it just that backpropagation on the layered topology is relatively
straightforward?

That's not to say you can't write a backpropagation on an arbitrary digraph,
but as you get to more and more complex digraphs, things will get harder.

I could be wrong on this.
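
To make that concrete, here is a minimal NumPy sketch (my own toy example, not
from the article) of backprop on a two-layer net. Because the topology is
layered, the backward pass is just one matrix product per layer, applied in
reverse order; on an arbitrary digraph you'd instead have to track gradient
flow along each edge individually.

```python
import numpy as np

# Toy two-layer network: forward pass, then backprop.
# The layered structure means gradients flow layer by layer
# as a simple chain of matrix products.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input -> hidden
W2 = rng.normal(size=(4, 2))   # hidden -> output

x = rng.normal(size=(1, 3))    # one input example
y = np.array([[1.0, 0.0]])     # its target

# Forward pass
h = np.tanh(x @ W1)
out = h @ W2
loss = 0.5 * np.sum((out - y) ** 2)

# Backward pass: one matrix product per layer, in reverse order
d_out = out - y                       # dL/d_out
dW2 = h.T @ d_out                     # gradient for layer 2
d_h = (d_out @ W2.T) * (1 - h ** 2)   # chain rule through tanh
dW1 = x.T @ d_h                       # gradient for layer 1
```

The reverse-order sweep works precisely because a layered topology gives a
fixed evaluation order; a general digraph would first need a topological sort.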

~~~
dodobirdlord
> Isn't it just that backpropagation on the layered topology is relatively
> straightforward? That's not to say you can't write a backpropagation on an
> arbitrary digraph...

Moreover, any arbitrary digraph can be expressed as a layered topology
(possibly with a lot of 0-weights). Since there's no fundamental difference,
you might as well work with whatever's easiest to compute with.
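
A toy illustration of that point (my own hypothetical 4-node DAG, not from the
thread): topologically sort the digraph, group nodes into layers, and any edge
that doesn't exist simply gets weight 0 in the dense layer-to-layer matrices.

```python
import numpy as np

# Hypothetical DAG: edges (0->1), (0->2), (1->3), (2->3).
# Topological layers: layer 0 = {0}, layer 1 = {1, 2}, layer 2 = {3}.
# Expressed as two dense layer-to-layer weight matrices, where any
# missing edge would just be a 0 entry.
W_0to1 = np.array([[0.5, -0.3]])   # node 0 -> nodes 1, 2
W_1to2 = np.array([[0.8],          # node 1 -> node 3
                   [0.2]])         # node 2 -> node 3

# A skip edge (e.g. 0 -> 3) could be handled by carrying node 0's
# value through the middle layer with an extra identity column --
# still a layered network, just with more zeros.

x = np.array([[1.0]])              # value at node 0
h = x @ W_0to1                     # values at nodes 1 and 2
out = h @ W_1to2                   # value at node 3: 0.5*0.8 + (-0.3)*0.2
```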

------
crashocaster
It should be noted that the graph-embedding tasks described are only a small
subset of the tasks that GNNs solve. Many (if not most) graph learning
techniques focus on more "local" tasks like node classification or edge
prediction.
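
As a rough sketch of what such a "local" task looks like (a toy GCN-style
propagation step in NumPy; the graph, features, and weights here are all made
up): each node aggregates its neighbors' features, and per-node predictions
for node classification are read off the resulting rows.

```python
import numpy as np

# One GCN-style propagation step on a toy 4-node undirected graph:
# each node averages its own and its neighbors' features, then
# applies a learned linear map. Node classification would stack a
# couple of these steps and put a per-node softmax on top.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # adjacency matrix

A_hat = A + np.eye(4)                       # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))    # row-normalization

X = np.eye(4)                               # one-hot node features
W = np.full((4, 2), 0.5)                    # toy weight matrix

H = D_inv @ A_hat @ X @ W                   # aggregated node embeddings
# H[i] mixes node i's features with its neighbors' -- the "local"
# representation a classifier would use for per-node predictions.
```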

------
The_rationalist
At which NLP tasks are they state of the art? The only one where they are
really competitive is dependency parsing (from my own case-by-case survey).

Also, were there any new real SOTA results on any NLP tasks since last summer?
I feel like accuracy progress has frozen.

What I would love is to get a notification/email when a task to which I
subscribed gets a new SOTA (from paperswithcode.com, obviously).

~~~
tastroder
> At which NLP tasks are they state of the art? The only one where they are
> really competitive is dependency parsing (from my own case-by-case survey).

At tasks that actually involve graphs presumably.
[https://arxiv.org/pdf/1901.00596.pdf](https://arxiv.org/pdf/1901.00596.pdf)
[https://paperswithcode.com/task/graph-classification](https://paperswithcode.com/task/graph-classification)
has a bunch of GNNs ranked #1.

> Also, were there any new real SOTA results on any NLP tasks since last
> summer? I feel like accuracy progress has frozen.

That's pretty normal for winter/spring, it's not conference season.

~~~
The_rationalist
_it's not conference season_

Weird ^^ Imagine that I'm a scientist and that I made a big discovery X during
winter, but for audience/visibility I only want to publish my results at
conference Y during summer. Somebody, right before summer, makes the same
discovery as mine. How do I establish that I was the first discoverer if I
publish after them?

~~~
p1esk
You post it on arxiv

~~~
The_rationalist
So the answer would be that the SOTA results are on arxiv but get posted to
the paperswithcode.com leaderboards only at conference time? Sounds unlikely.

------
naresh_xai
Graph neural networks are currently used a lot in the drug-discovery space.
They significantly beat equivalent RNN and CNN baselines (and more complex
variants) on the same datasets (Tox21, QM9, etc.).

~~~
RocketSyntax
Can you please share an example? I am in genomics and looking to get closer to
the drug chemistry.

~~~
naresh_xai
Happy to share decks and references to your email address. Please share your
email address with me :)

------
mehh
Watch the video, it's much clearer:
[https://www.youtube.com/watch?v=cWIeTMklzNg](https://www.youtube.com/watch?v=cWIeTMklzNg)

------
syntaxing
Is there any tutorial on using graph NNs for biomed? I've always wanted to
learn more about how graph NNs are applied in the health/medical industry.

~~~
ampdepolymerase
Look into RDFs and ontologies w.r.t. NNs. A very rich (though unproductive)
history.

