Does learn.fit_one_cycle automatically unfreeze the model?

I have code like this:

model = nn.Sequential(
    create_body(resnet101, cut=-2),
    create_head(2048, 50, lin_ftrs=[512]),
)

learn = Learner(dls, model, ...)
learn.fit_one_cycle(10, 1e-4)

I expected that this would freeze the body, and only update the weights for the head.

In reality, when I manually inspected the weights of the body after training, they were different! Why did they change? Does learn.fit_one_cycle automatically unfreeze the model after one epoch or something?

As far as I know, that shouldn’t be the case: fit_one_cycle never unfreezes anything on its own. You need to call learn.unfreeze() explicitly. The newer fine_tune method does handle unfreezing for you, but fit_one_cycle does not.
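To make the difference concrete, here is a minimal sketch of the sequence fine_tune roughly performs, using a stand-in learner that just records the calls. FakeLearner and fine_tune_sketch are illustrative names I made up, not fastai's API, and the real fine_tune also manages learning rates (base_lr, lr_mult, discriminative LRs):

```python
class FakeLearner:
    """Stand-in learner that records calls so we can see the sequence."""
    def __init__(self):
        self.log = []
    def freeze(self):
        self.log.append("freeze")
    def unfreeze(self):
        self.log.append("unfreeze")
    def fit_one_cycle(self, n_epoch, lr=None):
        self.log.append(f"fit_one_cycle({n_epoch})")

def fine_tune_sketch(learn, epochs, freeze_epochs=1):
    # fine_tune freezes the body, trains the head, then unfreezes and
    # trains the whole model. fit_one_cycle alone never does any of this.
    learn.freeze()
    learn.fit_one_cycle(freeze_epochs)
    learn.unfreeze()
    learn.fit_one_cycle(epochs)

learn = FakeLearner()
fine_tune_sketch(learn, epochs=3)
print(learn.log)
# -> ['freeze', 'fit_one_cycle(1)', 'unfreeze', 'fit_one_cycle(3)']
```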

How are you inspecting the weights of the body? Are all or only some changing?


Along with what @Pablo said, since you built the model manually you need to explicitly pass in a splitter yourself. Add splitter=default_split to your Learner, and call learn.freeze() immediately after creating it.
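Here is a hedged sketch of why the splitter matters. fastai's freeze() is roughly "freeze every parameter group except the last"; the splitter decides how parameters are divided into groups, so without a body/head split everything lands in one group and there is nothing before the last group to freeze. (Plain dicts stand in for real torch parameters here; this is not fastai's actual code.)

```python
def freeze(param_groups):
    # Freeze all groups except the last one (the "head" group).
    for group in param_groups[:-1]:
        for p in group:
            p["requires_grad"] = False

# No splitter: one big group -> freeze() is a no-op.
whole_model = [[{"requires_grad": True}, {"requires_grad": True}]]
freeze(whole_model)
assert all(p["requires_grad"] for p in whole_model[0])  # body still trainable!

# With a body/head splitter: the body group really does get frozen.
body, head = [{"requires_grad": True}], [{"requires_grad": True}]
freeze([body, head])
assert body[0]["requires_grad"] is False
assert head[0]["requires_grad"] is True
```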


Thanks for the replies everyone!

I have created a very short colab script that reproduces the error: Google Colaboratory

Let me know if you can find where the bug is! It really does seem to me that fit_one_cycle is modifying parameters in the body even though we called freeze (and thus expect only the parameters in the head to be modified).


I think @muellerzr has already provided the solution; just do:

learn = Learner(dls, model, loss_func=CrossEntropyLossFlat(), metrics=accuracy, splitter=default_split)

I tried it in your notebook and it works as expected.


Thanks everyone! That helps a lot. I was so confused before.