Hi, I would like to better understand what is “ConvLearner.pretrained” doing. As I understand it is loading the pre-computed weights for a model. Is it right? When I run it, I get something like this.

What are these numbers 141, 36, and 179?

Hey Krishna,

the numbers you are seeing are the progress counters for the precompute pass. This happens only the first time you do this (unless you delete the precomputed values).

So what this is doing is creating a learner, which contains the model architecture (including its weights) as well as a dataset instance. Then, if precompute is True, the whole dataset (usually train, validation, and test) is run through the pretrained network and the intermediate activations are calculated and saved. This saves you a lot of computation if you’re only going to train the last layer first. Of course, this is only useful if you’re doing some kind of transfer learning where you already have pretrained model weights.
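To make the idea concrete, here is a minimal sketch in plain Python. The `pretrained_body` and `head` functions are hypothetical stand-ins for the frozen pretrained layers and the new final layer; the real library caches actual network activations (and writes them to disk), but the caching logic is the same shape:

```python
def pretrained_body(x):
    """Stand-in for the frozen pretrained layers (the expensive part).
    While these layers stay frozen, their output for a given input
    never changes, so it can be computed once and cached."""
    return x * 2

def head(features, weight):
    """Stand-in for the new last layer, the only part being trained."""
    return features * weight

dataset = [1.0, 2.0, 3.0]

# The "precompute" pass: run every example through the frozen body
# once and keep the resulting activations.
precomputed = [pretrained_body(x) for x in dataset]

# Later training steps on the head read from the cache instead of
# re-running the expensive body on every epoch.
outputs = [head(f, weight=0.5) for f in precomputed]
```

The payoff is that each subsequent epoch only costs a pass through the tiny head, not the full network.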


Thanks a lot, mimi. It helped.

Let me articulate what I understand.

- “ConvLearner.pretrained” creates a model architecture (prepares the neural network layers and assigns pretrained weights)
- If precompute=True, it calculates all the *activations* for all the layers using all the data we have. I believe training the last layer is not happening now; rather, it happens when I do *learn.fit*.

Is this right?

Do we know what those numbers are: 141, 36, and 179? I am OK if this is explained in future weeks.

That sounds about right. So training is not happening; you are just running the entire dataset through your network (excluding the last layer) and saving these intermediate values. That way, when you later train only the last layer with .fit, it can reuse the precalculated values.

The numbers are probably the train, validation, and test set sizes divided by your batch size, i.e. the number of batches processed in each pass.
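The arithmetic is just a ceiling division. The dataset sizes below are hypothetical, chosen only to show how counters like 141, 36, and 179 could arise at a batch size of 64; your actual sizes will differ:

```python
import math

batch_size = 64

# Hypothetical train/validation/test set sizes (assumptions, not the
# actual lesson data) that happen to reproduce the counters above.
sizes = {"train": 9000, "valid": 2300, "test": 11450}

# Number of batches per pass = ceil(set size / batch size); the last
# (possibly partial) batch still counts as one step.
batches = {name: math.ceil(n / batch_size) for name, n in sizes.items()}
```

With these numbers, 9000/64 rounds up to 141 batches, 2300/64 to 36, and 11450/64 to 179.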