00:00 - Questions
06:00 - Steps for Entering a Standard Image Recognition Competition on Kaggle
08:40 - The best models for fine-tuning image recognition
12:00 - Thomas Capelle script to run experiments
14:00 - Github Gist
16:00 - Weights and Biases API
17:00 - Automating Gist generation
20:30 - Summarising and ranking models for fine-tuning
23:00 - Scatter plot of performance by model family
25:40 - Best models for images that don’t look like Imagenet
33:00 - Pretrained models - Model Zoo, Papers With Code, Huggingface
37:30 - Applying learning on Paddy notebook with small models
46:00 - Applying learning on large models
47:00 - Gradient accumulation to prevent out of memory
52:50 - Majority vote
I concur. So valuable and greatly appreciated. These walkthroughs have been nothing but amazing (the last Kaggle notebook was just mind-blowing, and the fact that you are able to simplify things for us is epic). Thank you!!!
Can one use item transforms in a parallel way to resize images before starting to train the neural net? On Kaggle it seems that resizing is slowing down training.
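One option is to pre-resize the whole dataset on disk once, so the expensive decode-and-resize happens a single time rather than on every epoch. I believe fastai ships a `resize_images` helper that does this in parallel; below is a stdlib-plus-Pillow sketch of the same idea (the names `resize_one`, `resize_all`, `max_size`, and `dest` are my own, not fastai's API):

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

from PIL import Image

def resize_one(src, dest, max_size=480):
    """Shrink one image so its longest side is at most max_size, saving to dest."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    im = Image.open(src)
    im.thumbnail((max_size, max_size))  # resizes in place, keeping aspect ratio
    im.save(dest / Path(src).name)

def resize_all(paths, dest, max_size=480, workers=8):
    """Resize many images concurrently; file I/O and decoding overlap across threads."""
    with ThreadPoolExecutor(max_workers=workers) as ex:
        list(ex.map(lambda p: resize_one(p, dest, max_size), paths))
```

After running this once over the training set, you can point the DataLoaders at the resized copies and drop the per-item resize from the slow path.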
Update:
When I remove batch_tfms from the train function, it works well. I guess the problem is with the augmentation part.
In the last couple of walkthroughs I’ve been having this problem; any idea about the reason? The interesting part is that when I run it again (the last line), it works.
Thanks
In this walkthrough you overcome memory issues by decreasing the batch size and updating only on alternate batches. I understand that this is mathematically identical to using larger batches. But why not just decrease the batch size and update after every (smaller) batch?
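For anyone who wants to see the equivalence concretely, here is a toy sketch (hypothetical scalar model, made-up data): summing the size-weighted micro-batch gradients and normalising once before a single weight update reproduces the full-batch gradient exactly, whereas stepping after every small batch gives more, noisier updates at the same learning rate, which changes the optimisation dynamics:

```python
# Toy scalar model y = w * x with MSE loss, on made-up data: the gradient of
# the mean loss over the full batch equals the size-weighted average of the
# micro-batch gradients, which is exactly what gradient accumulation computes.
def grad(w, xs, ys):
    # d/dw mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

xs, ys = [1.0, 2.0, 3.0, 4.0], [2.0, 4.1, 5.9, 8.2]
w = 0.5

full_batch = grad(w, xs, ys)  # one update computed from a batch of 4

# Gradient accumulation: two micro-batches of 2; sum the size-weighted
# gradients, normalise once, then take a single optimiser step.
accumulated = (2 * grad(w, xs[:2], ys[:2]) + 2 * grad(w, xs[2:], ys[2:])) / 4
```

`full_batch` and `accumulated` agree to floating-point precision, which is why accumulation lets you keep the effective batch size (and the learning-rate schedule tuned for it) while only ever holding a small batch in GPU memory.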
Jeremy, you mentioned a few times that the model would learn the variety on its own, given the amount of data, so you didn’t use this feature at all. How would one add such a feature to the training process, given that we’re using a vision model but also have tabular data? Can we combine models?
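One common pattern for this (not necessarily what Jeremy would recommend, and all names here are hypothetical) is late fusion: run the image through the CNN body, take its pooled feature vector, concatenate the tabular features onto it, and learn a shared head on top of the combined vector. A minimal sketch with plain lists standing in for tensors:

```python
def fuse(img_feats, tab_feats):
    """Late fusion: concatenate image and tabular features into one vector."""
    return list(img_feats) + list(tab_feats)

def linear_head(feats, weights, bias=0.0):
    """A single linear layer standing in for the shared classifier head."""
    return sum(f * w for f, w in zip(feats, weights)) + bias

# e.g. 2 CNN features plus 1 tabular column gives the head a 3-wide input
score = linear_head(fuse([0.4, 1.2], [0.9]), weights=[1.0, 0.5, 2.0])
```

In a real model the head would be trained jointly with (or after) the CNN, and the tabular columns would usually go through their own small embedding/MLP before the concatenation.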
In the Kaggle notebook the Concat pooling type was recommended, but I’m not sure how to implement it in our Paddy notebook. What does it mean to pool layers?
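On the pooling question: a pooling layer collapses each channel’s spatial grid of activations down to a single number, typically its average or its maximum. “Concat pooling” (fastai’s `AdaptiveConcatPool2d`, if I recall correctly) does both and concatenates the results, so the classifier head sees twice as many features. A toy sketch with channels flattened to lists:

```python
def concat_pool(feature_maps):
    """Concat pooling sketch: per channel, take both the average and the max
    of the activations, then concatenate, doubling the feature count."""
    avg = [sum(ch) / len(ch) for ch in feature_maps]   # average pooling
    mx = [max(ch) for ch in feature_maps]              # max pooling
    return avg + mx

# two channels of activations -> four pooled features for the head
pooled = concat_pool([[1.0, 3.0], [2.0, 2.0]])
```

The intuition is that average pooling summarises the whole feature map while max pooling keeps the strongest activation, and letting the head see both often works better than either alone.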
I forgot to ask, but I wonder what Jeremy does for nested tmux sessions?
I now layer my sessions: on my laptop in WSL I use a different key binding (Ctrl-f), on my server I use the standard one (Ctrl-b), and if I run Docker on the server and want a tmux in there, I use screen.
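For reference, the laptop-side rebinding can be done in `~/.tmux.conf` like this, so the outer tmux answers to Ctrl-f while the inner, server-side tmux keeps the default Ctrl-b:

```
# ~/.tmux.conf on the laptop: move the prefix from C-b to C-f so the
# inner (server) tmux still receives the default C-b prefix untouched
unbind C-b
set -g prefix C-f
bind C-f send-prefix
```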
Whenever I still have notebooks open and I shut down the server, I get these messages, and they make moving around in the terminal quite hard.
I’ve seen these messages pop up in other situations as well (issuing a shutdown command, for instance) where text gets forced onto the terminal; I wonder what this functionality is.