How to precompute activations on a new model?

I first trained the last FC layer of a pre-trained model (say, resnext101) with precompute=True. Next, I unfroze all the layers and trained with differential learning rates. Now I want to freeze all the earlier layers again and train only the final FC layer. For that, I want to recompute the activations up to the last layer (i.e. the same functionality as precompute=True), but with the fine-tuned weights rather than the pretrained ones, and I can’t find a way to do so. Any help would be appreciated.
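
Roughly, here are the steps I’ve taken so far (a sketch against fastai 0.7’s API; PATH, sz, and resnext101_64 are placeholders for my actual setup):

    from fastai.conv_learner import *

    PATH, sz = 'data/', 224                    # placeholder data path / image size
    arch = resnext101_64                       # placeholder architecture
    data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))

    # Step 1: train only the new FC head against precomputed activations
    learn = ConvLearner.pretrained(arch, data, precompute=True)
    learn.fit(1e-2, 3)

    # Step 2: unfreeze and fine-tune everything with differential learning rates
    learn.precompute = False
    learn.unfreeze()
    lrs = np.array([1e-4, 1e-3, 1e-2])         # lower rates for earlier layer groups
    learn.fit(lrs, 3, cycle_len=1)

    # Step 3 (the part I'm stuck on): freeze the body again and retrain just
    # the FC head, with activations precomputed from the fine-tuned weights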

Isn’t freeze() / freeze_to(-1) what you’re looking for?

Not exactly, though correct me if I’m wrong. learn.freeze() or learn.freeze_to(-1) would only set the earlier layers’ trainable flag to False. What I want is to precompute the activations up to that layer with my new model. Once I have the precomputed activations, I can train the last layer directly without touching the other layers at all (no forward pass or backprop through them), which I’m guessing would be a much faster approach.
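
Hand-rolled in plain PyTorch, the idea looks something like this (purely illustrative, not fastai’s actual implementation): run the frozen body once over the dataset, cache its outputs, and then train the head on the cached features alone.

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    def precompute_activations(body, loader, device='cpu'):
        # Run the frozen body once over the whole dataset and cache its outputs.
        body.eval()
        feats, labels = [], []
        with torch.no_grad():                  # no gradients needed for the body
            for x, y in loader:
                feats.append(body(x.to(device)).cpu())
                labels.append(y)
        return torch.cat(feats), torch.cat(labels)

    # The head then trains only on the cached tensors; the body is never
    # touched again, forward or backward:
    #   acts, ys = precompute_activations(body, train_loader)
    #   head_loader = DataLoader(TensorDataset(acts, ys), batch_size=64)
    #   ...ordinary training loop over head_loader, optimizing head.parameters()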

You can use
learn.set_data(data, precompute=True)
which then only trains the last few layers and keeps the weights. Is that what you meant, or do you specifically want to train to a custom point? In that case it’s a tiny bit more involved.
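
As a sketch (assuming learn is your fine-tuned ConvLearner and data is your ImageClassifierData):

    learn.set_data(data, precompute=True)  # recomputes and caches activations with the current weights
    learn.fit(1e-3, 2)                     # then trains just the FC head on the cache

set_data refreezes the earlier layers itself, so nothing else is needed before calling fit.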

I think learn.set_data, which internally calls learn.save_fc1, is what I was looking for. Just to confirm: learn.save_fc1 will use the current model (the one I have been training) to precompute the activations, right?

Thanks for your help!

Yes, that’s essentially the method you want to call, and it keeps the weights intact. Let me know if that works.

@sjdlloyd learn.save_fc1() worked. Cheers.