
Innovations in Graph Representation Learning - headalgorithm
https://ai.googleblog.com/2019/06/innovations-in-graph-representation.html
======
TelmoMenezes
This is very interesting work. Regarding:

"However, graphs are inherently combinatorial structures made of discrete
parts like nodes and edges, while many common ML methods, like neural
networks, favor continuous structures, in particular vector representations."

I apologize in advance for a bit of self-promotion, but I would like to point
out my own approach, which instead favors discrete ML methods to discover
symbolic generators of networks. That is to say, small programs that are
capable of generating synthetic networks with similar topological and other
characteristics to some empirically observed one. If you happen to be
interested:

[https://www.nature.com/articles/srep06284](https://www.nature.com/articles/srep06284)

[http://www.telmomenezes.net/2014/09/using-evolutionary-computation-to-explain-network-growth/](http://www.telmomenezes.net/2014/09/using-evolutionary-computation-to-explain-network-growth/)
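(To make the idea above concrete: this is not the actual method from the paper, which uses genetic programming over tree-structured programs. It is only a toy sketch of the general recipe: search a space of candidate generators, scoring each one by how closely its synthetic networks match an observed network's topology. Here the "generator" is reduced to a single preferential-attachment probability, and fitness is a degree-sequence distance; all names are made up for illustration.)

```python
import random

def generate(n, p_attach, rng):
    """Toy growth generator: each new node attaches preferentially
    (by degree) with probability p_attach, else uniformly at random.
    Returns the degree sequence of the resulting n-node tree."""
    deg = [1, 1]  # start from a single edge between nodes 0 and 1
    for _ in range(n - 2):
        if rng.random() < p_attach:
            target = rng.choices(range(len(deg)), weights=deg)[0]
        else:
            target = rng.randrange(len(deg))
        deg[target] += 1
        deg.append(1)
    return deg

def fitness(deg, target_deg):
    """Distance between sorted degree sequences (lower is better)."""
    a = sorted(deg, reverse=True)
    b = sorted(target_deg, reverse=True)
    return sum(abs(x - y) for x, y in zip(a, b))

def evolve(target_deg, generations=30, pop_size=10, seed=0):
    """Evolve the generator parameter to reproduce target_deg:
    keep the best half of the population, refill it with mutated copies."""
    rng = random.Random(seed)
    n = len(target_deg)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(generate(n, p, rng), target_deg))
        survivors = pop[: pop_size // 2]
        pop = survivors + [min(1.0, max(0.0, p + rng.gauss(0, 0.1)))
                           for p in survivors]
    return min(pop, key=lambda p: fitness(generate(n, p, rng), target_deg))

# "Observed" network: secretly produced with p_attach = 0.9
observed = generate(200, 0.9, random.Random(1))
best = evolve(observed)
```

The real method searches a far richer space (arbitrary programs over node-pair features), but the fit-a-generator-to-an-observed-network loop is the same shape.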

I am not saying that my approach is better; it depends on the goal. In fact, I am increasingly a believer in hybrid (symbolic-statistical) methods.

------
lmeyerov
Focusing on multiple labels (overlapping communities) is great! It is also
similar to reframing the problem as edge labeling instead of node labeling.

Another simple way to handle common cases of entity overlap, such as event
data, is hypergraph modeling. In terms of reusing existing graph tech, that
just means building a bipartite graph between samples/events and features.
That is one of the most common data transforms folks toggle on/off with
Graphistry visuals.
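(A minimal sketch of that transform, independent of any particular tool: each event is a hyperedge connecting several entities, represented as a bipartite graph with event nodes on one side and entity nodes on the other. The event and entity names are invented for illustration.)

```python
from collections import defaultdict

# Each event (hyperedge) links several entities at once.
events = {
    "e1": ["alice", "bob", "carol"],
    "e2": ["bob", "dave"],
    "e3": ["alice", "dave"],
}

# Bipartite adjacency: events on one side, entities on the other.
event_to_entities = {e: set(ents) for e, ents in events.items()}
entity_to_events = defaultdict(set)
for e, ents in events.items():
    for ent in ents:
        entity_to_events[ent].add(e)

def co_participants(entity):
    """Entities sharing at least one event with `entity`:
    a 2-hop walk (entity -> event -> entity) in the bipartite graph."""
    out = set()
    for e in entity_to_events[entity]:
        out |= event_to_entities[e]
    out.discard(entity)
    return out

# co_participants("bob") -> {"alice", "carol", "dave"}
```

Projecting the bipartite graph back down (as `co_participants` does) recovers the usual entity-entity graph, which is why toggling the transform on and off is cheap.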

I'm guessing still another couple of years before a notion of standard
practice emerges for graph learning. Very cool time!

------
pagutierrezn
Why this site can't be seen in Firefox Focus is a mystery to me.

