Understanding cut and lr_cut in model_meta dict

One thing I still struggle with after working through the examples is the exact meaning of the parameters in
cut,lr_cut = model_meta[arch]
I understand so far that cut is the number of modules in the pretrained model corresponding to the feature extractor (as opposed to the classifier part). This is where you cut the model - OK.
However, lr_cut is less clear to me. I understand it is used for applying differential learning rates and for freezing and unfreezing weights, but the details escape me.
How exactly does it map to, say, three differential learning rates?
For example, for arch = resnet18 you get cut, lr_cut = 8, 6. So how does 6 map to three differential learning rates?

Thanks for any hint.

Take a look at the get_layer_groups function in conv_learner.py - it returns 3 groups of layers for differential learning rates, and the lr_cut parameter controls where to make the additional split. Basically, the cut parameter tells where to cut off the head of the model (which will be replaced with our custom head), and lr_cut tells where to split the rest of the model into 'middle' and 'tail' groups.
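
To make it concrete, here is a rough sketch in plain PyTorch (not the actual fastai code; custom_head and the 10 output classes are just placeholders) of what those two numbers do for resnet18:

import torch.nn as nn
from torchvision.models import resnet18

cut, lr_cut = 8, 6                    # what model_meta gives for resnet18

base = resnet18(pretrained=True)      # pretrained backbone
layers = list(base.children())[:cut]  # keep the first `cut` modules: the feature extractor

custom_head = nn.Sequential(          # stand-in for the head fastai generates
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(512, 10)
)

# Three groups for differential learning rates / gradual unfreezing:
layer_groups = [
    nn.Sequential(*layers[:lr_cut]),  # earliest layers (conv1 ... layer2)
    nn.Sequential(*layers[lr_cut:]),  # later layers (layer3, layer4)
    custom_head,                      # the new head
]

A list like lrs = [1e-4, 1e-3, 1e-2] then assigns one learning rate per group, smallest for the earliest layers and largest for the head.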

Thanks, that makes sense.