Graph Neural Networks - tutorials and resources

Hello,

@Conwyn Thanks for inspiring me to make this topic.

I would like to share some valuable resources about Graph Neural Nets. So let’s go:

  1. “A Gentle Introduction to Graph Neural Networks”

A completely free article that explains, with simple examples and interactive simulations, what graph-based networks are and what their practical applications look like.

https://distill.pub/2021/gnn-intro/
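
To make the "nodes with attributes, edges carrying messages" picture from that article concrete, here is a minimal sketch of a single message-passing step in plain PyTorch (the toy graph, the weights and the helper name are mine, purely for illustration):

```python
import torch

# Toy graph: 4 nodes with 3-dimensional attributes, edges given as (source, target) pairs.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 2, 3],   # source nodes
                           [1, 2, 3, 0]])  # target nodes

def message_passing_step(x, edge_index, w_self, w_neigh):
    """One GNN layer: average the incoming neighbour messages, then update each node."""
    src, dst = edge_index
    agg = torch.zeros_like(x).index_add_(0, dst, x[src])              # sum messages per target node
    deg = torch.zeros(x.size(0)).index_add_(0, dst, torch.ones(dst.size(0)))
    agg = agg / deg.clamp(min=1).unsqueeze(-1)                        # mean aggregation
    return torch.relu(x @ w_self + agg @ w_neigh)                     # combine self state and neighbours

w_self, w_neigh = torch.randn(3, 3), torch.randn(3, 3)
h = message_passing_step(x, edge_index, w_self, w_neigh)
print(h.shape)  # torch.Size([4, 3])
```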

  2. PyTorch Geometric: “Colab Notebooks and Video Tutorials”

Let me clarify in advance that PyTorch is not the only framework to choose from. In terms of content, though, this is a very good collection of courses from MIT and Stanford, plus lessons developed by the “PyTorch Geometric Tutorial Project” itself.
I especially recommend it for people who want to get straight to the substance without having to set up an SDK first.

https://pytorch-geometric.readthedocs.io/en/latest/get_started/colabs.html
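
As a taste of what the first notebooks cover, a minimal sketch of the classic two-layer GCN on the Cora citation dataset with PyTorch Geometric could look like this (the hyperparameters are the usual tutorial defaults, nothing official):

```python
import torch
import torch.nn.functional as F
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import GCNConv

dataset = Planetoid(root="data/Cora", name="Cora")   # small citation network
data = dataset[0]                                    # one Data object: x, edge_index, y, masks

class GCN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(dataset.num_node_features, 16)
        self.conv2 = GCNConv(16, dataset.num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x, edge_index)

model = GCN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()
```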

  3. Hardware and distributed usage.

One of the most interesting aspects of PyTorch Geometric is that there are several high-level APIs for compiling the network into a static graph for hardware dedicated to GNN-style workloads. I’m thinking of distributed training on Intelligence Processing Units (IPUs), as well as using several GPUs simultaneously. For TF fans, there are also APIs that emulate training or inference for XLA-based frameworks.

https://docs.graphcore.ai/projects/ipu-programmers-guide/en/latest/ipu_introduction.html
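
To give a feel for what “compiling the model for the IPU” means in practice, here is a rough sketch using Graphcore’s PopTorch wrapper (the placeholder model, the option values and the wrapper class name are mine; the exact patterns are in the docs above):

```python
import torch
import poptorch  # Graphcore's PyTorch wrapper for IPUs

class TrainingWrapper(torch.nn.Module):
    """PopTorch compiles forward pass + loss into one static graph, so both live in the module."""
    def __init__(self, model):
        super().__init__()
        self.model = model
        self.loss_fn = torch.nn.CrossEntropyLoss()

    def forward(self, x, y):
        out = self.model(x)
        return out, self.loss_fn(out, y)

model = torch.nn.Linear(32, 4)            # placeholder standing in for a real GNN
wrapper = TrainingWrapper(model)

opts = poptorch.Options()                 # device iterations, replication etc. are configured here
optimizer = torch.optim.Adam(wrapper.parameters(), lr=1e-3)
training_model = poptorch.trainingModel(wrapper, options=opts, optimizer=optimizer)

x = torch.randn(8, 32)
y = torch.randint(0, 4, (8,))
out, loss = training_model(x, y)          # the first call compiles the graph, then runs it
```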

How-to videos with links to code examples from the Graphcore team:

  4. Ready-to-go images with tutorials are also available at paperspace.com, showing what inference looks like with classical models compiled into a graph. An IPU-POD of size 4 is freely available, but compiling the graph takes about 15 minutes - take a break for a tea or a coffee, read something, or go for a walk instead of waiting.

paperspace.com

Regards,

Mike


Hi Mike

I was thinking (which is often dangerous), but in NLP we have attention, so there is a relation between each source-target combination and the strongest relation picks the best word. J’ai un chat - I have a cat - Ich habe eine Katze - Mae cath gyda fi. Graphs have nodes with attributes and edges with strengths. So my crazy thought was whether GNNs could be used in NLP.

Regards Conwyn

Hi Conwyn,

I feel responsible for exposing you to the stimulus that triggered the group of neurons behind these peculiar thoughts :wink: I don’t want you to stray too far, so I’ll share my intuition :wink: Karpathy can’t help but laugh at the idea that NLP models are just boring text-generating machines with no capability for backward reflection. This might be true in the case of the GPTx family. BERT models, however, stand out because they recognize context from both the left and the right side, as they are trained on the entire length of the sentence. They also apply random masking to tokens (words) across the sentence, and BERT’s task is to predict the most likely word at the masked index. This approach is what I most associate with the concept of graphs. Of course, to really talk about a GNN you would need a dataset in the form of a graph and a customized implementation of the model!
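
If you want to poke at that masked-index prediction yourself, the Hugging Face fill-mask pipeline is the quickest way (the sentence below is just my example, chosen to echo your cat):

```python
from transformers import pipeline

# BERT sees the whole sentence (left and right context) and ranks candidates for [MASK].
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

for candidate in fill_mask("I have a [MASK] and it purrs when I stroke it."):
    print(f"{candidate['token_str']:>10}  score={candidate['score']:.3f}")
```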

Ref:

  1. BERT paper: Understanding BERT - arXiv
  2. Visualization in the form of PDF slides: BERT Slides - GitHub
  3. A video explaining the differences between the GPTx series and BERT: GPTx vs. BERT - YouTube

Regards

Mike

Hi Mike

The gentlemen from Google are busy

https://blog.research.google/2024/02/graph-neural-networks-in-tensorflow.html?m=1

Regards Conwyn


Hi Conwyn,

@Conwyn Thanks for sharing this blog. Yes, I suppose the gentlemen from Google have been on duty around the clock since the Gemini introduction. Everything might be a graph - take the Mixture of Experts (MoE) approach in NLP pipelines, for example LangChain or Autogen. I also found an interesting blog post by someone from Hugging Face about retrieval-based search methods.

(Screenshot: Long-Form Question Answering with ELI5 and Wikipedia)

Here’s a link: https://yjernite.github.io/lfqa.html
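
The core trick in that post is dense retrieval: embed the question and the candidate passages into the same vector space and rank passages by similarity. A minimal sketch of the idea (using sentence-transformers as a stand-in, not the blog’s exact retriever; the texts are made up):

```python
from sentence_transformers import SentenceTransformer, util

# Any sentence-embedding model works for the sketch; this is not the model from the blog.
encoder = SentenceTransformer("all-MiniLM-L6-v2")

question = "Why is the sky blue?"
passages = [
    "Rayleigh scattering causes shorter blue wavelengths to scatter more in the atmosphere.",
    "Graph neural networks operate on nodes and edges.",
    "The Eiffel Tower is located in Paris.",
]

q_emb = encoder.encode(question, convert_to_tensor=True)
p_emb = encoder.encode(passages, convert_to_tensor=True)

scores = util.cos_sim(q_emb, p_emb)[0]      # cosine similarity of the question vs. each passage
best = scores.argmax().item()
print(passages[best], float(scores[best]))
```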

Have a nice day/ wonderful night

Mike

EDIT: A bit more about LLM generation methods - even more enjoyable reading, with code examples in Colab.

A review of the generation methods:

https://huggingface.co/blog/how-to-generate
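
If you want to play with the decoding strategies that post compares, here is a small sketch with GPT-2 as a stand-in model (prompt and parameter values are just examples):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Graph neural networks are", return_tensors="pt")

# Greedy decoding: always pick the most likely next token.
greedy = model.generate(**inputs, max_new_tokens=30)

# Beam search: keep the 5 most likely partial sequences.
beams = model.generate(**inputs, max_new_tokens=30, num_beams=5, early_stopping=True)

# Nucleus (top-p) sampling: more diverse, less repetitive output.
sampled = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.92, top_k=0)

for name, out in [("greedy", greedy), ("beam", beams), ("sampling", sampled)]:
    print(name, "->", tokenizer.decode(out[0], skip_special_tokens=True))
```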

About generating with contrastive search:

https://huggingface.co/blog/introducing-csearch#generating-human-level-text-with-contrastive-search-in-transformers-%F0%9F%A4%97
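
In transformers, contrastive search is switched on through the same generate() call by combining penalty_alpha with a small top_k. A minimal sketch (GPT-2 as a stand-in model; the values below are a commonly quoted starting point, not a recommendation):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Graph neural networks are", return_tensors="pt")

# penalty_alpha > 0 together with a small top_k activates contrastive search.
output = model.generate(**inputs, penalty_alpha=0.6, top_k=4, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```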