Live coding 8

Don’t quote me on it, as I don’t know exactly what Kaggle does, but that’s the only way (exactly the same split of public/private test data) I can think of to achieve comparable rankings for a leaderboard.

Yes, but it’s not necessarily random. They might pick out some particularly difficult subset, for instance. It’s the same for everyone.

Got it, it’s a varying degree of random…

I’m thinking those four photos with rotated dimensions might be likely candidates for a more difficult subset.

(480, 640)    10403
(640, 480)        4
dtype: int64
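
The counts above come from tallying image sizes across the dataset. Here’s a small self-contained sketch of the same idea using Pillow and a `Counter` (the `size_counts` helper and the temp-directory demo are illustrative, not the original notebook code, which presumably used pandas `value_counts`):

```python
import tempfile
from collections import Counter
from pathlib import Path

from PIL import Image

def size_counts(path):
    # Tally (width, height) pairs across all JPEGs under `path`
    return Counter(Image.open(f).size for f in Path(path).rglob('*.jpg'))

# Demo on a throwaway directory; for the real dataset you'd point this at
# the paddy competition's train_images folder, giving counts like the above
with tempfile.TemporaryDirectory() as d:
    for i, sz in enumerate([(480, 640)] * 3 + [(640, 480)]):
        Image.new('RGB', sz).save(Path(d) / f'{i}.jpg')
    counts = size_counts(d)
    print(counts)
```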

The fit_flat_cos learning rate scheduler is my default now, since in my experience it works much better than fit_one_cycle. However, fine_tune still uses fit_one_cycle, so I’m wondering if that will change in the future, or if there are some scenarios where fit_one_cycle is better? Thanks

1cycle is often better.

Thanks Jeremy. Is 1cycle better for fine-tuning, or in all cases in general? I ask because the Imagenette leaderboard uses fit_flat_cos. Or maybe that’s because of the Ranger optimizer rather than the scheduler?

Ranger already has a warmup, so you should never use 1cycle with it.
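
For anyone curious about the shape difference being discussed: fit_one_cycle warms up and then anneals, while fit_flat_cos stays flat and then anneals, which is why the latter pairs well with an optimizer like Ranger that does its own warmup. A rough sketch of the two shapes in plain Python (the defaults here approximate fastai’s, but this is not its actual internals):

```python
import math

def one_cycle(pct, lr_max=1e-3, pct_start=0.25, div=25, div_final=1e5):
    # Cosine warmup from lr_max/div up to lr_max, then cosine anneal
    # down to lr_max/div_final; `pct` is the fraction of training done
    if pct < pct_start:
        p = pct / pct_start
        lo, hi = lr_max / div, lr_max
    else:
        p = (pct - pct_start) / (1 - pct_start)
        lo, hi = lr_max, lr_max / div_final
    return lo + (hi - lo) * (1 - math.cos(math.pi * p)) / 2

def flat_cos(pct, lr=1e-3, pct_start=0.75):
    # Flat at lr for most of training, then cosine anneal to zero;
    # no warmup phase, since Ranger handles that itself
    if pct < pct_start:
        return lr
    p = (pct - pct_start) / (1 - pct_start)
    return lr * (1 + math.cos(math.pi * p)) / 2
```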

Man, I just discovered that Docker has a Python API:

@suvash
I had the same issue when I tried to run the notebook from Colab. I assumed that timm was already installed.

I ran the notebook until the vision_learner cell errored out. After that I manually installed timm, but it didn’t work until I restarted the kernel.

I checked, and the install parameter of setup_comp is not working properly in Colab.

I’ve been working through the Live Coding exercises recently. These are great!

Having trouble running learn = vision_learner(dls, 'convnext_small_in22k') at the 30 minute mark in the Live Coding 8 video. I noticed that timm.list_models('convnext*') now shows a different list of models than was shown in the video, but I’m having the same issue. I first got an error about needing to pip install huggingface_hub, so I did that and imported it as well as timm. I restarted the kernel but keep getting a Hugging Face related error that prevents the model from downloading. A few attempts ago I also got an error indicating the weights couldn’t be downloaded. Something seems to have changed with timm in the past year. Any tips?

I was able to use convnext_small_in22k even though the timm.list_models('convnext*') output did not explicitly show it. I fixed the timm version as follows, which may resolve your pip install error:

# install fastkaggle if not available
try: import fastkaggle
except ModuleNotFoundError:
    !pip install -Uq fastkaggle

from fastkaggle import *

comp = 'paddy-disease-classification'
path = setup_comp(comp, install='fastai "timm==0.6.2.dev0"')


Thank you. I found out the problem only occurs on Paperspace. I have no problem running the code and downloading the timm models on my notebook PC or on Kaggle. At this point I’m not sure what the cause of the problem on Paperspace is; the error says it has to do with the Hugging Face Hub.