Freeze, unfreeze, freeze_to after adding custom head

A quick question (I hope).

I make a unet_learner and append several custom head layers to learner.model. Does this mean that the various freeze/unfreeze methods will no longer work properly because learner.layer_groups is not updated?
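
Roughly what I'm doing (an illustrative sketch; layer names and channel sizes are made up, and `data` is an existing segmentation DataBunch):

```python
import torch.nn as nn
from fastai.vision import unet_learner, models

learn = unet_learner(data, models.resnet34)  # `data`: an existing DataBunch

# Append extra layers after the U-Net output; channel sizes are illustrative
# (the U-Net's output channels equal the number of classes in `data`).
custom_head = nn.Sequential(
    nn.Conv2d(2, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 2, kernel_size=1),
)
learn.model = nn.Sequential(learn.model, custom_head)

# learn.layer_groups still describes the original model, so the new head
# sits outside every layer group.
```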

If so, what is a reasonable way to deal with freezing and unfreezing layers?

Thanks!

You will have to dig into the code and define your own custom splits. Look at the functions in text.learner to get some ideas.
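
For example, with the setup above, something like this should work as a starting point (an untested sketch, fastai v1; `learn.split` also accepts a function that returns the layer groups directly as lists of modules, which is the style `lm_split` in text.learner uses):

```python
def custom_split(model):
    # model is nn.Sequential(original_unet, custom_head) from above:
    # group 0 = the pretrained U-Net, group 1 = the new head.
    return [[model[0]], [model[1]]]

learn.split(custom_split)

learn.freeze()       # train only the last group (the custom head)
learn.freeze_to(1)   # same effect here: freeze every group before index 1
learn.unfreeze()     # train all groups
```

Once the model is split this way, slice learning rates such as `learn.fit_one_cycle(1, slice(1e-5, 1e-3))` will also be spread across the two groups.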

Does freezing and unfreezing help in tabular or collab models too?
If yes, why isn't it used in those models?
@sgugger

Tabular models are all one layer group, and there is no pretrained model for them, so out of the gate there is no real use for differential learning rates or freezing, as each model is radically different from the last.
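
You can check this yourself (fastai v1; `data` here is an existing TabularDataBunch):

```python
from fastai.tabular import tabular_learner

learn = tabular_learner(data, layers=[200, 100])
print(len(learn.layer_groups))  # -> 1, so there is nothing to freeze separately
```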

Ohhh, yeah, I get it.
Thanks!
@muellerzr