I expected that this would freeze the body and only update the weights of the head.
In reality, when I manually inspected the weights of the body after training, they were different! Why did they change? Does learn.fit_one_cycle automatically unfreeze the model after one epoch or something?
As far as I know, that shouldn’t be the case: you need to call learn.unfreeze() explicitly. The newer fine_tune method does the unfreezing for you, but fit_one_cycle does not.
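If it helps, my mental model of fine_tune is roughly this (a simplified sketch, not the actual implementation, which also adjusts the learning rates between the two phases):

```python
# Rough sketch of what fine_tune does (simplified; the real fastai method
# also discriminates learning rates between the frozen and unfrozen phases)
def fine_tune_sketch(learn, epochs, freeze_epochs=1):
    learn.freeze()                      # only the last parameter group (the head) stays trainable
    learn.fit_one_cycle(freeze_epochs)  # train just the head
    learn.unfreeze()                    # make every parameter group trainable again
    learn.fit_one_cycle(epochs)         # train the whole model
```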
How are you inspecting the weights of the body? Are all of them changing, or only some?
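Here is one way to check, just as a sketch (it assumes your model is a Sequential with model[0] as the body):

```python
import torch

# Snapshot the body's parameters before training...
before = [p.detach().clone() for p in learn.model[0].parameters()]
learn.fit_one_cycle(1)

# ...then count how many tensors actually changed afterwards.
changed = sum(
    not torch.equal(b, a.detach())
    for b, a in zip(before, learn.model[0].parameters())
)
print(f"{changed} of {len(before)} body parameter tensors changed")
```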
Along with what @Pablo said, you need to pass in a splitter explicitly. Pass splitter=default_split to your Learner and call learn.freeze() immediately after creating it.
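The reason, as I understand the internals (so take this with a grain of salt): freeze() only leaves the last parameter group trainable, and with the default splitter all parameters land in a single group, so there is effectively nothing to freeze. You can see the difference in the optimizer's parameter groups:

```python
# Quick check of how many parameter groups the optimizer sees
# (assumes learn.create_opt() and learn.opt.param_lists behave as in current fastai)
learn.create_opt()
print(len(learn.opt.param_lists))  # 1 with the default splitter, 2 with splitter=default_split
```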
I have created a very short colab script that reproduces the error: Google Colaboratory
Let me know if you can find where the bug is! It really does seem to me that fit_one_cycle is modifying parameters in the body, even though we called freeze and thus expect only the parameters in the head to be modified.
I think @muellerzr has already provided the solution. Just do:
learn = Learner(dls, model, loss_func=CrossEntropyLossFlat(), metrics=accuracy, splitter=default_split)
and then learn.freeze().
I have tried it in your provided notebook and it works as expected.
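One quick way to confirm, as a sketch (note that fastai's train_bn=True default can keep BatchNorm parameters in the body trainable even when the body is frozen):

```python
# After creating the Learner with splitter=default_split and calling learn.freeze()
body = list(learn.model[0].parameters())
head = list(learn.model[1].parameters())
print("trainable body params:", sum(p.requires_grad for p in body), "/", len(body))
print("trainable head params:", sum(p.requires_grad for p in head), "/", len(head))
```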