Live coding 10

This topic is for discussion of the 10th live coding session

<<< session 9 | session 11 >>>

Links from the walk-thru

Video timeline - thank you @Mattr

00:00 - Questions
06:00 - Steps for Entering a Standard Image Recognition Competition on Kaggle
08:40 - The best models for fine-tuning image recognition
12:00 - Thomas Capelle script to run experiments
14:00 - Github Gist
16:00 - Weights and Biases API
17:00 - Automating Gist generation
20:30 - Summarising and ranking models for fine-tuning
23:00 - Scatter plot of performance by model family
25:40 - Best models for images that don’t look like Imagenet
33:00 - Pretrained models - Model Zoo, Papers With Code, Huggingface
37:30 - Applying learning on Paddy notebook with small models
46:00 - Applying learning on large models
47:00 - Gradient accumulation to prevent out of memory
52:50 - Majority vote

17 Likes
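
For anyone who wants to reproduce the 16:00–23:00 part (pulling the fine-tuning benchmark runs out of Weights & Biases and ranking model families), here is a rough sketch using the wandb public API. The project path, config key, and metric names below are placeholders, not the actual ones from the video.

```python
# Pull runs from a W&B project into a DataFrame, then rank model families
# by their best (lowest) error rate.
import pandas as pd
import wandb

api = wandb.Api()
runs = api.runs("someuser/fine-tune-benchmark")   # hypothetical entity/project

rows = []
for run in runs:
    summ = run.summary._json_dict
    rows.append({
        "model": run.config.get("model_name"),     # assumed config key
        "error_rate": summ.get("error_rate"),      # assumed metric names
        "fit_time": summ.get("fit_time"),
    })
df = pd.DataFrame(rows).dropna()

# Treat everything before the first underscore as the model family.
df["family"] = df["model"].str.split("_").str[0]
print(df.groupby("family")["error_rate"].min().sort_values())
```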

This was a great talk! I can’t believe we’re getting all this for free :smiley:

13 Likes

I concur. So valuable and greatly appreciated. These walkthrus have been nothing but amazing (the last Kaggle notebook was just mindblowing and the fact that you are able to simplify things for us is epic). Thank you!!!

3 Likes

The stuff we did today worked out well too!

15 Likes

These walkthrus are absolutely amazing, thanks so much Jeremy and everyone participating in the Zoom calls :heart: !!

3 Likes

Can one use item transforms in a parallel way to resize images before starting to train the neural net? On Kaggle it seems that resizing is slowing down training.
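
One way to avoid paying that cost every epoch is to resize the files once on disk before training; fastai's resize_images does this in parallel across CPU workers. A minimal sketch, assuming a Paddy-style folder of class subdirectories (the paths and sizes are just examples):

```python
# Resize every image once, in parallel, then build the DataLoaders from the
# resized copies so the per-item Resize transform has far less work per epoch.
from fastai.vision.all import *

src = Path('train_images')      # assumed source folder of class subfolders
dest = Path('train_small')      # resized copies land here

resize_images(src, dest=dest, max_size=256, recurse=True)

dls = ImageDataLoaders.from_folder(
    dest, valid_pct=0.2, seed=42,
    item_tfms=Resize(192), batch_tfms=aug_transforms(size=128, min_scale=0.75))
```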

Update:
When I remove batch_tfms from the train function, it works well. I guess the problem is in the augmentation part.
In the last couple of walkthrus I've been having this problem; any idea about the reason? The interesting part is that when I run it again (last line) it works.
Thanks

Wow, did you ensemble with an average of all the methods?

1 Like
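
For reference, a minimal sketch of one way to ensemble: average the (TTA) probabilities from several trained learners and take the argmax. This is a generic pattern, not necessarily the exact method used in the video.

```python
# Average softmax predictions from several fine-tuned learners over the same
# test files, then pick the class with the highest averaged probability.
from fastai.vision.all import *

def ensemble_preds(learners, test_files):
    all_probs = []
    for learn in learners:
        test_dl = learn.dls.test_dl(test_files)
        probs, _ = learn.tta(dl=test_dl)     # test-time augmentation
        all_probs.append(probs)
    avg_probs = torch.stack(all_probs).mean(dim=0)
    return avg_probs.argmax(dim=1)
```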

Jeremy, I found your notebook on how you made the gistfile super useful. Can you also make that publicly available?

2 Likes
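
I don't know exactly what's in Jeremy's notebook, but a minimal sketch of creating a gist programmatically with ghapi might look like this (the notebook name and description are placeholders, and it assumes a GitHub token with the gist scope in the GITHUB_TOKEN environment variable):

```python
# Create a GitHub gist from a local notebook via ghapi, which wraps the
# GitHub REST API (POST /gists).
import os
from pathlib import Path
from ghapi.all import GhApi

api = GhApi(token=os.environ['GITHUB_TOKEN'])

nb = Path('model-comparison.ipynb')              # hypothetical notebook to share
gist = api.gists.create(
    description='timm model comparison',         # hypothetical description
    files={nb.name: {'content': nb.read_text()}},
    public=True)
print(gist.html_url)
```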

In this walkthrough you overcome memory issues by decreasing the batch size and only updating on alternate batches. I understand that this is mathematically identical to using larger batches. But why not just decrease the batch size and update every (smaller) batch?

2 Likes
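
For anyone who wants to try the 47:00 approach, a minimal sketch using fastai's GradientAccumulation callback; the architecture, image sizes, and batch size are placeholders (and the timm model string needs timm installed):

```python
# Halve the per-step batch size but only step the optimiser after roughly
# 64 samples have been accumulated, so peak GPU memory drops while the
# effective batch size for each weight update stays about the same.
from fastai.vision.all import *

accum = 2
dls = ImageDataLoaders.from_folder(
    Path('train_small'), valid_pct=0.2, seed=42,
    item_tfms=Resize(480), batch_tfms=aug_transforms(size=224),
    bs=64 // accum)

learn = vision_learner(dls, 'convnext_large_in22k', metrics=error_rate,
                       cbs=GradientAccumulation(64)).to_fp16()
learn.fine_tune(1)
```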

Jeremy, you mentioned a few times that the model would learn the variety on its own, given the amount of data, so you didn't use this feature at all. How would one add such a feature to the training process, given that we're using a vision model but we also have tabular data? Can we combine models?
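
On combining models: a common pattern (not something shown in the walkthrough) is to run each modality through its own body and concatenate the features before a shared head. A rough PyTorch sketch, where the layer widths and feature sizes are made up:

```python
# Concatenate CNN image features with an MLP over tabular features, then
# classify from the combined representation.
import torch
from torch import nn

class ImageTabularModel(nn.Module):
    def __init__(self, image_body, n_image_feats, n_tab_feats, n_classes):
        super().__init__()
        self.image_body = image_body              # e.g. a pretrained CNN minus its head
        self.tab_mlp = nn.Sequential(
            nn.Linear(n_tab_feats, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU())
        self.head = nn.Linear(n_image_feats + 32, n_classes)

    def forward(self, image, tabular):
        img_feats = self.image_body(image)        # (bs, n_image_feats)
        tab_feats = self.tab_mlp(tabular)         # (bs, 32)
        return self.head(torch.cat([img_feats, tab_feats], dim=1))
```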

Video timestamps for Walkthru 10

00:00 - Questions
06:00 - Steps for Entering a Standard Image Recognition Competition on Kaggle
08:40 - The best models for fine-tuning image recognition
12:00 - Thomas Capelle script to run experiments
14:00 - Github Gist
16:00 - Weights and Biases API
17:00 - Automating Gist generation
20:30 - Summarising and ranking models for fine-tuning
23:00 - Scatter plot of performance by model family
25:40 - Best models for images that don’t look like Imagenet
33:00 - Pretrained models - Model Zoo, Papers With Code, Huggingface
37:30 - Applying learning on Paddy notebook with small models
46:00 - Applying learning on large models
47:00 - Gradient accumulation to prevent out of memory
52:50 - Majority vote

8 Likes

In the Kaggle notebook the Concat pooling type was recommended, but I’m not sure if or how to implement it in our Paddy notebook. What does it mean to pool layers?
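
Pooling here means collapsing each feature map's spatial grid down to one number per channel before the classifier head. Concat pooling keeps both the max and the average of each map and concatenates them, so the head sees twice as many features; a tiny sketch with fastai's AdaptiveConcatPool2d (the shapes are just for illustration):

```python
# AdaptiveConcatPool2d concatenates an adaptive max pool and an adaptive
# average pool along the channel dimension: (bs, c, h, w) -> (bs, 2*c, 1, 1).
from fastai.vision.all import *

x = torch.randn(8, 512, 7, 7)      # pretend final conv activations
pool = AdaptiveConcatPool2d(1)
print(pool(x).shape)               # torch.Size([8, 1024, 1, 1])
```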

I’ll just post a screenshot from the video (ts: 19:00) for future reference:

3 Likes

It’s actually the default in fastai :slight_smile:

3 Likes

These timestamps you’re doing are really helpful @Mattr ! Thanks so much :slight_smile:

4 Likes

Jeremy made it public.

1 Like

I forgot to ask, but I wonder what Jeremy does for nested tmux sessions?

I now layer my sessions: on my laptop in WSL I use a different key binding (Ctrl-F), on my server I use the standard one (Ctrl-B), and if I run Docker on my server and want to have tmux there, I use screen.

I wonder if there is a better approach though :slight_smile:

1 Like

I gave up trying to nest tmux sessions. I just confused myself too much…

4 Likes

This is just a very minor but quite annoying thing…

I wonder if there would be a way to filter out these messages somehow?

Whenever I have notebooks still open and I shut down the server, I get these messages, and they make moving around in the terminal quite hard.

I've seen these messages pop up in other situations as well (issuing a shutdown command, for instance) where text gets forced onto the terminal; I wonder what this functionality is :thinking:

1 Like

Someone recommended switching off the prefix in the nested tmux session for this (line 72).

1 Like