I make a unet_learner and append several custom head layers to learner.model. Does this mean that the various freeze/unfreeze methods will no longer work properly because learner.layer_groups is not updated?
If so, what is a reasonable way to deal with freezing and unfreezing layers?
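For context, here is one way I imagine handling it (a minimal sketch, assuming fastai v1, an existing segmentation DataBunch called `data`, and a purely hypothetical custom head): append the new layers to the last layer group as well as to the model, so that freeze/freeze_to/unfreeze and the per-group learning rates still see them.

```python
import torch.nn as nn
from fastai.vision import unet_learner, models

# `data` is assumed to be an existing segmentation DataBunch
learn = unet_learner(data, models.resnet34)

# Hypothetical custom head layers appended to the model
custom_head = nn.Sequential(nn.Conv2d(2, 2, 1), nn.ReLU())
learn.model = nn.Sequential(learn.model, custom_head)

# Also put the new layers into the last layer group, so that
# freeze()/freeze_to()/unfreeze() and per-group learning rates include them
learn.layer_groups[-1] = nn.Sequential(*learn.layer_groups[-1], custom_head)

# freeze() freezes every group except the last, which now holds the
# original head plus the appended layers
learn.freeze()
```

Alternatively, I suppose one could call learn.split() with a custom split function to rebuild layer_groups from the modified model, but I am not sure which approach is preferred.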
Tabular models are all a single layer group and there is no pretrained model for them, so out of the gate there is no real use for differential learning rates or freezing, since each model is radically different from the last.
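You can check that directly (again a sketch assuming fastai v1 and an existing TabularDataBunch called `data`): with no custom split, the whole tabular model sits in one layer group, so freeze/unfreeze and per-group learning rates have nothing to act on.

```python
from fastai.tabular import tabular_learner

# `data` is assumed to be an existing TabularDataBunch
learn = tabular_learner(data, layers=[200, 100])

# Without a custom split, the entire model is one layer group,
# so freezing and differential learning rates have nothing to differentiate
print(len(learn.layer_groups))  # -> 1
```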