A large number of scientific problems involve dealing with graphs in one way or another. I’ve skimmed through tons of recent papers on deep graph architectures for ingesting, generating, or solving problems on graphs (see the recent graph coloring paper: https://arxiv.org/pdf/1902.10162v2.pdf), but I don’t yet have a good sense of what the useful primitive structures are for solving new problems on graphs, or which benchmarks to look out for in the arxiv papers. Does anyone know how to judge how well, say, a graphCNN versus a graphRNN generalizes to other problems? What are the most promising architectures, papers, and benchmarks for learning on and generating graphs?

Hi, neural networks on graphs are an active research field, and right now there are only competing proposals. No architecture is widely accepted as the winner: each one has its own pros and cons. So the task at hand will steer you towards the most appropriate approach among:

- convolutional networks,
- attention networks,
- autoencoders,
- generative networks.

My advice to you is simple: just start with this great PyTorch library: FAST GRAPH REPRESENTATION LEARNING WITH PYTORCH GEOMETRIC

Let me know what you think about it.

I’m trying to find my way into this field of research now. I’m not aware of a lot of papers, but I recommend starting with Semi-Supervised Classification with Graph Convolutional Networks. IMO it’s a good starting point.

Here is the abstract of this paper:

We present a scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs. We motivate the choice of our convolutional architecture via a localized first-order approximation of spectral graph convolutions. Our model scales linearly in the number of graph edges and learns hidden layer representations that encode both local graph structure and features of nodes. In a number of experiments on citation networks and on a knowledge graph dataset we demonstrate that our approach outperforms related methods by a significant margin.
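To make the paper’s “localized first-order approximation” concrete: the resulting layer is just H' = ReLU(D̂^{-1/2} Â D̂^{-1/2} H W), where Â = A + I adds self-loops. Here’s a minimal NumPy sketch of that rule — my own toy illustration, not the authors’ code:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)                       # degrees of A_hat
    D_inv_sqrt = np.diag(d ** -0.5)             # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt    # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)      # linear map + ReLU

# tiny 3-node path graph: 0 - 1 - 2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3)                    # one-hot node features
W = np.random.randn(3, 2)        # random layer weights
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2): one 2-dim hidden vector per node
```

Note how each node’s output mixes only its own features and its immediate neighbors’ — that’s the “local graph structure” the abstract refers to, and stacking layers widens the receptive field one hop at a time.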

@fabris and @ademyanchuk, thanks for the suggestions! I plan to but haven’t yet played with the code in the paper that Fabrizio suggested—I had first noticed their github page a while ago when I was looking for examples of pytorch’s pdist. Yesterday I got into this rather comprehensive review paper that gives a decent background and discusses high level ideas that make some sense to me; it helps eliminate some bad ideas, but it still doesn’t seem to get to the nitty gritty details that would help me pick the right solutions on practical problems: https://arxiv.org/abs/1806.01261

You’re right, the paper is a good one but it introduces a generalized framework at a very abstract level. If you have any problem in mind, let me know! I’ll try to help you.

I’m also interested in this area, but haven’t looked at it too closely.

Some more hands-on resources I’ve found include notes from a workshop held at the 2018 International Society for Computational Biology conference last year: http://snap.stanford.edu/deepnetbio-ismb/index.html

It covers a fair bit of historical ground right up to ongoing research from the presenters (Marinka Zitnik and Jure Leskovec) & others.

Here’s another presentation from Feb this year also from Jure Leskovec: http://i.stanford.edu/~jure/pub/talks2/graphsage_gin-ita-feb19.pdf

Personally I found the first batch of material impenetrably dense when I first looked at it last June; I might take another pass at it now with my brain more “in gear”.

Here are Jure’s slides from the ICLR 2019 talk he just gave. I haven’t even read them yet; I just saw his tweet.

Of interest - but no public access yet…

We've just had the first AI research paper submitted to a major journal by a SharpestMinds student.

— Edouard Harris (@neutronsNeurons) May 3, 2019

Groundbreaking result in graph neural nets. All from work during his mentorship.

Any recommendations for a beginner? I can’t get an intuition for the “Graph Convolution”, especially the various forms of the Laplacian operator. Any pointers would be appreciated, thank you!

For starters, read this post from the inventor of Graph Convolutional Networks. Then start digging into Kipf’s paper introducing Graph Convolutional Networks.
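One way to build that intuition before the reading: the combinatorial Laplacian L = D − A acts on a signal x over the nodes as a difference operator — (Lx)[i] = deg(i)·x[i] − Σ of x over i’s neighbors, so it measures how much each node differs from its neighborhood, and a constant signal maps to zero. A small NumPy example of my own (not from the post):

```python
import numpy as np

# 4-cycle graph: 0 - 1 - 2 - 3 - 0
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))                       # degree matrix
L = D - A                                        # combinatorial Laplacian
D_inv_sqrt = np.diag(A.sum(axis=1) ** -0.5)
L_sym = np.eye(4) - D_inv_sqrt @ A @ D_inv_sqrt  # symmetric normalized Laplacian

x = np.array([1.0, 2.0, 3.0, 4.0])   # a signal on the nodes
print(L @ x)          # per-node deviation from neighbors: [-4. 0. 0. 4.]
print(L @ np.ones(4)) # constant signal is perfectly "smooth": [0. 0. 0. 0.]
```

Spectral graph convolutions are then defined via the eigendecomposition of L (or L_sym): the eigenvectors play the role of Fourier modes, and Kipf’s GCN is a cheap first-order approximation of filters in that basis.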

Thanks for the quick reply. I am reading the post now, but I can’t intuitively grasp the math behind what they are doing.

Ah ok I forgot about this blog post:

This is a useful two-part series that really helps to build intuition for GNNs and GCNs. These two blog posts, along with Kipf’s blog post and Kipf’s paper, should be enough to get a good grasp of how GCNs work.

For anyone interested in graph-based networks, here are two PyTorch-based libraries.

Pytorch Geometric:

Deep Graph Library:

I think PyTorch Geometric is easier to use out of the box, and it has implementations of most modern graph network layers.

DGL is more complex but covers a wider range of use cases, such as capsule nets, graph transformers, and tree LSTMs, as well as features for working with massive graphs.
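Under the hood, both libraries are organized around the same message-passing primitive: gather features along edges, aggregate them per target node, then update. A library-free NumPy sketch of a sum-aggregation step over an edge list (my own simplification of what a PyG/DGL layer does internally):

```python
import numpy as np

def message_passing(edge_index, x):
    """One sum-aggregation step: each node receives the summed
    features of its in-neighbors along directed edges."""
    src, dst = edge_index             # edges as (source, target) arrays
    out = np.zeros_like(x)
    np.add.at(out, dst, x[src])       # scatter-add each message to its target
    return out

# directed edges 0->1, 1->2, 2->0 (a 3-cycle)
edge_index = (np.array([0, 1, 2]), np.array([1, 2, 0]))
x = np.array([[1.0], [10.0], [100.0]])   # one feature per node
print(message_passing(edge_index, x))    # [[100.], [1.], [10.]]
```

Real layers wrap this scatter-gather with learned transforms on the messages and on the aggregated result; the libraries mostly differ in how flexibly they let you customize those pieces and how they scale the scatter step to huge graphs.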

Can anybody share a notebook implementation of Graph Convolutional Networks in PyTorch? Thanks.