Learner.splitter and freeze_to in v2

I am trying to split my model into layer groups so I can freeze some of them.

I want to pass in a pre-trained model called questions as a sub-module of my main model. I want to freeze questions first and fine-tune it later. Here is how I do it:

class ParentChildModel(Module):
    def __init__(self, questions):
        self.questions = questions
        self.results = TabularModel(results_emb_szs, len(results_cont_names), 10, [200, 50], ps=[0.01, .1], embed_p=0.04, bn_final=False)
        self.head = nn.Sequential(*[LinBnDrop(60, 1, p=0., bn=False), SigmoidRange(*[-.1, 1.1])])

    ...

def split_layers(m):
    return L(m.questions, m.results, m.head).map(params)

learnQuestions = load_learner('models/NextQuestionPrediction_export.pkl')
learner = Learner(dls, ParentChildModel(learnQuestions.model), loss_func=combined_loss, splitter=split_layers)
learner.freeze_to(1)

Freezing to 1 here should only freeze m.questions, right?

Thanks,


That is correct.
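To make the semantics concrete, here is a minimal pure-Python sketch of how freeze_to(n) treats the parameter groups returned by a splitter: everything in groups[:n] gets frozen, the rest stays trainable. This is only an illustration of the indexing behavior, not fastai's actual implementation (which, among other things, keeps BatchNorm layers trainable by default):

```python
class Param:
    """Stand-in for a torch Parameter with a requires_grad flag."""
    def __init__(self, name):
        self.name = name
        self.requires_grad = True

# Three groups, as returned by a splitter like split_layers above
# (names are illustrative).
groups = [
    [Param("questions.w")],  # group 0
    [Param("results.w")],    # group 1
    [Param("head.w")],       # group 2
]

def freeze_to(n):
    """Freeze all groups before index n; unfreeze the rest."""
    cutoff = n if n >= 0 else len(groups) + n
    for i, group in enumerate(groups):
        for p in group:
            p.requires_grad = not (i < cutoff)

freeze_to(1)
print([p.requires_grad for g in groups for p in g])  # [False, True, True]
```

So with your three groups, freeze_to(1) freezes only group 0, i.e. m.questions.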


What got me confused is that learner.freeze_to allows me to do freeze_to(-5), even though I only have 3 groups. So I thought I did not understand something…

Thanks!

There are no checks that the groups actually exist, yeah. In this case, it just unfreezes everything.
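Python's slice semantics explain why nothing ends up frozen: freezing "everything before index n" behaves like taking groups[:n], and an out-of-range negative slice is simply empty. A tiny sketch (slice behavior only, not fastai internals):

```python
# Three groups, as in the splitter above (names are illustrative).
groups = ["questions", "results", "head"]

# freeze_to(-5) would freeze "everything before index -5" -- but with
# only 3 groups that slice is empty, so no group gets frozen.
print(groups[:-5])  # []

# freeze_to(1) would freeze the first group only.
print(groups[:1])   # ['questions']
```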


I have a question related to this.

What does .map(params) do? I understand that this constructs a new list from the elements of m. But what exactly is params, and where is it defined?

OK, I just found out. It is simply a predefined function that returns a module's parameters:

def params(m):
    "Return all parameters of `m`"
    return [p for p in m.parameters()]
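So L(m.questions, m.results, m.head).map(params) produces one list of parameters per sub-module. Here is a self-contained sketch using plain lists instead of fastcore's L, with a toy Module stand-in (all names here are illustrative, not real fastai objects):

```python
class Module:
    """Toy stand-in exposing .parameters() like an nn.Module."""
    def __init__(self, *ps):
        self._ps = list(ps)
    def parameters(self):
        return iter(self._ps)

def params(m):
    "Return all parameters of `m`"
    return [p for p in m.parameters()]

questions, results, head = Module("q1", "q2"), Module("r1"), Module("h1")

# Equivalent in spirit to L(questions, results, head).map(params):
groups = [params(m) for m in (questions, results, head)]
print(groups)  # [['q1', 'q2'], ['r1'], ['h1']]
```

Each inner list is one parameter group, which is what the splitter hands to the Learner.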

Splitting of models for training is also explained in this video.
