Notebook 05_EfficientNet_and_Custom_Weights.ipynb fails on google colab on line
learn.summary()
with error
RuntimeError: running_mean should contain 3072 elements not 6144
I think if you have the latest versions of timm, fastai, and walkwithfastai this shouldn't be a problem.
Actually, I think this is an active bug. This is different from the timm notebook (it's on my list to tackle next week).
Hi @muellerzr ,
I think fastai has updated the create_head code. We don't have to explicitly multiply nf by 2 when concat_pool is True; create_head already does that internally.
So in the create_timm_model function we don't need this line: nf = num_features_model(nn.Sequential(*body.children())) * (2 if concat_pool else 1)
Because of this, nf ends up multiplied by 4 instead of 2.
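To make the arithmetic concrete, here is a minimal, library-free sketch of the double-multiplication. The helper name and the 1536-channel backbone figure are illustrative assumptions (1536 matches an EfficientNet-B3-sized feature map, which would reproduce the 3072-vs-6144 error above); this is not fastai's actual code.

```python
# Hypothetical illustration of the double-multiplication bug (not fastai code).
def head_input_features(nf, concat_pool=True):
    # fastai's create_head doubles nf internally when concat_pool=True,
    # because AdaptiveConcatPool2d concatenates avg-pool and max-pool outputs.
    return nf * 2 if concat_pool else nf

backbone_features = 1536  # assumed final feature-channel count of the backbone

# Buggy caller: pre-multiplies nf before passing it to create_head
nf_buggy = backbone_features * 2                 # 3072
head_in_buggy = head_input_features(nf_buggy)    # 6144 -> size mismatch

# Fixed caller: pass the raw feature count; create_head handles the doubling
head_in_fixed = head_input_features(backbone_features)  # 3072

print(head_in_buggy, head_in_fixed)
```

The BatchNorm in the head is built for 6144 channels while the pooled activations have 3072, which is exactly the `running_mean should contain 3072 elements not 6144` error.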
You are right! The timm tutorial has it the right way, but the efficientnet one does not! Will update today
Timm tutorial: Utilizing the `timm` Library Inside of `fastai` (Intermediate) | walkwithfastai
Thanks for the effort! Unfortunately, it still encounters problems. I've opened an issue on GitHub. Basically, Google Colab defaults to newer library versions; in particular, fastai no longer exposes the internal _update_first_layer function. If I add the old definition, the notebook runs.
Thanks @ne1s0n! I've added in the import. Not sure if this was a miss on my part or what, but it seems I forgot an import along the way!
For those wanting the direct fix, it's: from fastai.vision.learner import _update_first_layer
Has anyone faced mismatched keys with the state_dict? A raw timm model's state_dict and a fastai-trained timm model's state_dict have slightly different keys. I've tested by retraining using current versions and am not sure if there is a simpler workaround than renaming the keys to match. All other model information appears to match perfectly. Thanks for any help
Dan
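If the only difference really is the key names, renaming is usually a one-liner. A common cause is fastai wrapping the timm backbone in an nn.Sequential, which prefixes every key with the child index. The "0." prefix and the key names below are assumptions for illustration, not the exact keys from your model; compare the two state_dicts to find the real prefix.

```python
# Hypothetical sketch: strip a wrapper prefix (assumed "0.") so a
# fastai-saved state_dict lines up with timm's expected key names.
fastai_state = {
    "0.conv_stem.weight": "tensor_a",   # placeholder values; in practice
    "0.bn1.running_mean": "tensor_b",   # these would be torch tensors
}

prefix = "0."  # assumed wrapper prefix; verify against your own keys
remapped = {
    (k[len(prefix):] if k.startswith(prefix) else k): v
    for k, v in fastai_state.items()
}

print(sorted(remapped))  # keys now match the bare timm model's names
```

After remapping you would call `model.load_state_dict(remapped)` (optionally with `strict=False` to surface any remaining mismatches rather than fail outright).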