NeurIPS 2020 for fastai students

NeurIPS is virtual and much more accessible this year. Can you share any tips on how to get the most out of it? Are there any fastai students that plan to attend this year?

3 Likes

I plan to be there! :slightly_smiling_face:

1 Like

I am also there! :slight_smile:

1 Like

I am organizing the NeurIPS remote meetup in Perth. We have a few organizers registered and have started watching the recordings. We are planning to watch the videos together, with a focus on medical/health topics, and then have a short discussion on applications afterwards.

2 Likes

Hope to see some good discussion here about interesting topics! There’s also a bunch of videos here, I believe: https://crossminds.ai/category/neurips%202020/

2 Likes

Some highlights from me so far - please share your findings in the thread as well!

  1. Keynote by Charles Isbell about some of the biases and risks associated with ML models. I loved the format (very engaging!) and the fact that one of the recommendations, making ML more accessible to experts, is also a key contribution of fastai.
  2. SimpleTOD: A Simple Language Model For Task Oriented Dialogue (https://blog.einstein.ai/simpletod/) - it’s an example of how the language-model approach can be extended to the larger, end-to-end task of task-oriented dialogue (a toy sketch of the single-sequence idea is below).
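The core idea in SimpleTOD is that the dialogue history, belief state and response are all just text, so one causal LM can generate them in sequence. Here’s a toy sketch of that formulation using the plain gpt2 checkpoint from Hugging Face as a stand-in - without SimpleTOD’s fine-tuning and its actual delimiter tokens this won’t produce a meaningful belief state, it only shows the single-sequence setup:

```python
# Toy sketch: task-oriented dialogue cast as plain language modelling.
# "gpt2" is a stand-in; the real SimpleTOD model is a GPT-2 fine-tuned on
# serialised dialogues with special delimiter tokens. The <user>/<belief>
# markers below are made-up placeholders, not the paper's tokens.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# The whole dialogue turn is serialised into one text sequence.
context = "<user> I need a cheap restaurant in the city centre. <belief> "
inputs = tokenizer(context, return_tensors="pt")
output = model.generate(
    inputs["input_ids"],
    max_length=inputs["input_ids"].shape[1] + 30,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```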
3 Likes

This talk mentions PQ learning at the end, which looks interesting to dig into: https://nips.cc/virtual/2020/public/invited_16163.html. It also covers privacy and verification.

1 Like

Here’s a link to an excellent tutorial on ML explainability: https://explainml-tutorial.github.io/neurips20

After participating in the Kaggle OpenVaccine challenge I’ve gotten more interested in graphs. Here are some papers from NeurIPS on this topic that I found interesting:

  1. Open Graph Benchmark: Datasets for Machine Learning on Graphs
    A collection of graph datasets from various domains and a platform for loading and evaluating ML models on those datasets, including a leaderboard (a minimal loading/evaluation sketch follows this list).
    https://ogb.stanford.edu/
    https://proceedings.neurips.cc/paper/2020/file/c5c3d4fe6b2cc463c7d7ecba17cc9de7-Paper.pdf
  2. Design Space for Graph Neural Networks
    A study of different GNN tasks and architectures, with recommendations on how to find the right architecture for a given task. Includes some general architecture recommendations (batch norm, dropout etc.) and sensible defaults, and also provides a code platform for GNN research (GraphGym). A sketch of such a layer block follows this list.
    http://snap.stanford.edu/gnn-design/
  3. Learning Physical Graph Representations from Visual Scenes
    Using a graph representation of a scene as an inductive bias for learning scene representations. Shows how the PSGNet architecture can help with learning tasks like semantic segmentation.
    https://proceedings.neurips.cc/paper/2020/file/4324e8d0d37b110ee1a4f1633ac52df5-Paper.pdf
  4. Graph Contrastive Learning with Augmentations
    Applying self-supervision and augmentations to graphs (a toy augmentation sketch follows this list).
    https://proceedings.neurips.cc/paper/2020/file/3fe230348e9a12c13120749e3f9fa4cd-Paper.pdf
  5. Self-Supervised Graph Transformer on Large-Scale Molecular Data
    Another take on self-supervision for graphs
    https://proceedings.neurips.cc/paper/2020/file/94aef38441efa3380a3bed3faf1f9d5d-Paper.pdf
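To make item 1 more concrete, here is roughly how the ogb Python package is used, as far as I understand its docs - the dataset name ogbn-arxiv and the Evaluator input format are from memory, so treat this as a sketch rather than gospel:

```python
# Minimal sketch of loading an OGB dataset and using its evaluator.
from ogb.nodeproppred import NodePropPredDataset, Evaluator

dataset = NodePropPredDataset(name="ogbn-arxiv")  # downloads on first use
split_idx = dataset.get_idx_split()               # dict of train/valid/test node indices
graph, labels = dataset[0]                        # graph dict (edge_index, node_feat, ...) and labels

evaluator = Evaluator(name="ogbn-arxiv")
print(evaluator.expected_input_format)            # describes the y_true / y_pred dict it expects

# Dummy "predictions" (just the ground truth) to show the evaluation call;
# a real model's predicted labels would go in y_pred.
test_idx = split_idx["test"]
result = evaluator.eval({"y_true": labels[test_idx], "y_pred": labels[test_idx]})
print(result)                                     # accuracy for ogbn-arxiv
```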
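And for item 2, this is the kind of per-layer design the paper’s design space sweeps over: convolution, batch norm, activation, dropout, skip connection. The sketch below is written against PyTorch Geometric with my own placeholder defaults - it is not the paper’s tuned recommendations and not GraphGym’s actual API:

```python
# Hypothetical GNN layer block exposing the knobs the design space covers.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

class GNNLayer(nn.Module):
    def __init__(self, dim, use_bn=True, dropout=0.0):
        super().__init__()
        self.conv = GCNConv(dim, dim)
        self.bn = nn.BatchNorm1d(dim) if use_bn else nn.Identity()
        self.act = nn.PReLU()
        self.dropout = nn.Dropout(dropout)

    def forward(self, x, edge_index):
        h = self.conv(x, edge_index)
        h = self.dropout(self.act(self.bn(h)))
        return x + h  # residual / skip connection

# usage: layer = GNNLayer(64); out = layer(x, edge_index)
```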
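For item 4, the augmentation half of the recipe is easy to sketch: two common graph augmentations are dropping edges and masking node features, and two augmented views of the same graph are then pulled together by a contrastive loss. The probabilities below are illustrative, not the paper’s settings:

```python
import torch

def drop_edges(edge_index, p=0.2):
    # keep each edge (column of the 2 x num_edges index tensor) with prob 1 - p
    keep = torch.rand(edge_index.size(1)) >= p
    return edge_index[:, keep]

def mask_node_features(x, p=0.1):
    # zero out the feature rows of a random subset of nodes
    masked = x.clone()
    masked[torch.rand(x.size(0)) < p] = 0.0
    return masked

# Two views of the same graph for contrastive training:
# view_a = (mask_node_features(x), drop_edges(edge_index))
# view_b = (mask_node_features(x), drop_edges(edge_index))
```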
3 Likes

Final list of picks from me. All the content should be made public in a few weeks, which will give everyone a chance to access the materials; some of it is already available.

  1. Practical limitations of today’s deep learning in healthcare, by Andrew Ng - lots of practical advice, applicable beyond healthcare as well.
    https://slideslive.com/38938453/practical-limitations-of-todays-deep-learning-in-healthcare
  2. Applying Graph Neural Networks to Molecular Design
    https://slideslive.com/38938184/applying-graph-neural-networks-to-molecular-design
  3. Real-world application of ML in drug discovery
    Interesting perspective on data science within a pharma company
    https://slideslive.com/38938181/realworld-application-of-ml-in-drug-discovery
  4. Workshop: Self-Supervised Learning – Theory and Practice
    Lots of interesting content here on self-supervised learning
    https://sslneuips20.github.io/pages/schedule.html
  5. AI-assisted data labeling demo
    An example of using active learning to build up a labeled dataset (a minimal uncertainty-sampling sketch follows this list).
    https://neurips-assistance.mybluemix.net/
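Since item 5 is about active learning, here is a bare-bones uncertainty-sampling loop to show the idea; scikit-learn and the synthetic data are just stand-ins for whatever model and labeling tool the demo actually uses:

```python
# Minimal active-learning sketch: repeatedly label the points the current
# model is least confident about (least-confident sampling).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = list(range(20))                       # pretend only 20 points are labeled
unlabeled = [i for i in range(len(X)) if i not in labeled]

for round_ in range(5):
    clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    probs = clf.predict_proba(X[unlabeled])
    uncertainty = 1.0 - probs.max(axis=1)       # low max-probability = uncertain
    query = [unlabeled[i] for i in np.argsort(-uncertainty)[:10]]
    labeled += query                            # "ask the oracle" for these labels
    unlabeled = [i for i in unlabeled if i not in query]
    print(f"round {round_}: {len(labeled)} labeled, acc={clf.score(X, y):.3f}")
```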
5 Likes

I keep finding interesting content! The last list was supposed to be final, but well… :slight_smile:

  1. Uncertainty-aware Self-training for Few-shot Text Classification
    Few-shot learning is a hot topic right now, and this is an interesting approach from Microsoft Research. On the surface it appears similar to pseudo-labeling as used on Kaggle, but I need to dig deeper to understand the nuances (a plain pseudo-labeling sketch follows this list for comparison).
    https://proceedings.neurips.cc/paper/2020/file/f23d125da1e29e34c552f448610ff25f-Paper.pdf
  2. MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers
    Microsoft Research again, with their approach to language model distillation. I think I’ve heard good feedback from industry on their MiniLM (a generic distillation-loss sketch follows this list).
    https://proceedings.neurips.cc/paper/2020/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf
  3. VIME: Extending the Success of Self- and Semi-supervised Learning to Tabular Domain
    Self-supervised learning for tabular data! That is definitely an exciting application area. It looks similar to autoencoders for tabular data; again, I need to dig deeper to understand the differences (a small tabular autoencoder sketch follows this list).
    https://proceedings.neurips.cc/paper/2020/file/7d97667a3e056acab9aaf653807b4a03-Paper.pdf
  4. DISK: Learning local features with policy gradient
    In a recent Kaggle metric learning competition, local features didn’t perform well at all, but I have a project where I may need to apply them, so I’m trying to learn about the best approaches.
    https://proceedings.neurips.cc/paper/2020/file/a42a596fc71e17828440030074d15e74-Paper.pdf
  5. A Discrete Variational Recurrent Topic Model without the Reparametrization Trick
    I’m also trying to learn more about topic modeling for another project, so it’s good to see that there are new papers in this area :slight_smile:
    https://proceedings.neurips.cc/paper/2020/file/9f1d5659d5880fb427f6e04ae500fc25-Paper.pdf
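For reference, this is what I mean by plain pseudo-labeling (item 1): train on the labeled data, predict on the unlabeled data, and keep only predictions above a fixed confidence threshold as extra labels. The paper, as far as I can tell, replaces the fixed softmax threshold with uncertainty estimates, which this sketch does not do:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_round(X_lab, y_lab, X_unlab, threshold=0.9):
    # plain confidence-thresholded pseudo-labeling (NOT the paper's
    # uncertainty-aware selection); the threshold value is arbitrary
    model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
    probs = model.predict_proba(X_unlab)
    confident = probs.max(axis=1) >= threshold
    y_pseudo = model.classes_[probs.argmax(axis=1)]
    X_aug = np.vstack([X_lab, X_unlab[confident]])
    y_aug = np.concatenate([y_lab, y_pseudo[confident]])
    return model.fit(X_aug, y_aug), X_aug, y_aug
```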
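On item 2, for anyone new to distillation, the generic soft-label version is just a KL term between temperature-softened teacher and student outputs (sketch below). MiniLM itself, if I read it right, distills the last layer’s self-attention distributions rather than output logits, so this is only for orientation:

```python
import torch.nn.functional as F

def soft_label_distillation_loss(student_logits, teacher_logits, T=2.0):
    # generic knowledge-distillation loss, not MiniLM's self-attention distillation
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```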
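And for item 3, this is the “autoencoders for tabular” baseline I had in mind: a small denoising autoencoder whose encoder gets reused for the downstream task (sizes and noise level below are made up). VIME, as I understand it, instead corrupts rows by resampling feature values within columns and additionally predicts which entries were corrupted:

```python
import torch
import torch.nn as nn

class TabularDAE(nn.Module):
    """Denoising autoencoder for tabular rows (hypothetical sizes)."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, n_features)

    def forward(self, x, noise_std=0.1):
        x_noisy = x + noise_std * torch.randn_like(x)  # corrupt the input row
        return self.decoder(self.encoder(x_noisy))     # reconstruct the clean row

# Pretraining objective: minimise MSE(model(x), x) on unlabeled rows,
# then reuse model.encoder as the feature extractor for the labeled task.
```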
3 Likes