No. `pretrained=True` will load weights from an already trained model; `learn.freeze()` tells the learner to only change the FC weights during the training epochs; and `precompute=True` tells the learner to take each image of the training set, compute the output of the frozen layers for that image once, and store that result for re-use during training.

Maybe I can give you a simpler example using math notation. Let’s say you have a model that chains together two functions: `y = f(g(x))`. This means we first apply `g` to `x`, and then `f` to the output of `g`.
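As a toy illustration of that chaining (the functions here are made-up stand-ins, not real network layers):

```python
# g plays the role of the pretrained layers, f the trainable head.
def g(x):
    return 2 * x          # "pretrained" feature extractor

def f(h):
    return h + 1          # "trainable" head

def model(x):
    return f(g(x))        # y = f(g(x)): apply g first, then f to its output

print(model(3))           # g(3) = 6, then f(6) = 7, so this prints 7
```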

Now imagine that, in general, we have to learn both `f` and `g`. But someone already figured out a good way to do `g`, so we only need to train `f`. That means during learning we change around what `f` actually does, but because someone already trained `g`, we never change what `g` does.

Now our training set has LOTS of values for `x`. Instead of plugging each of them into `g` every time to compute `g(x)`, we realize that we only need to do that once. So we have values `x_1, x_2, x_3, ...` and we use those to compute activations `g(x_1), g(x_2), g(x_3), ...`. Then during training, whenever our data loader says "Okay, time to see what `f(g(x_2))` is", we don’t have to compute `g(x_2)`; we just plug in the precomputed value.

This only works because during learning only `f` gets changed, not `g`.
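A minimal sketch of that bookkeeping (the call counter, cache, and functions are invented for illustration; this is not fastai's implementation):

```python
g_calls = 0

def g(x):
    """Stands in for the frozen pretrained layers."""
    global g_calls
    g_calls += 1
    return 2 * x

xs = [1, 2, 3]                         # the training inputs x_1, x_2, x_3
cache = {x: g(x) for x in xs}          # precompute each g(x_i) once: 3 calls total

def f(h, w):
    """Stands in for the trainable head; w is its (changing) parameter."""
    return w * h

# Many "epochs" of training: f keeps changing, but g is never called again,
# because we plug in the cached activations instead of recomputing g(x).
for epoch in range(5):
    for x in xs:
        _ = f(cache[x], w=epoch)

print(g_calls)                         # prints 3, not 3 * 5
```

The cache stays valid precisely because `g` is frozen; if `g`'s weights changed, every cached activation would be stale.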

These things are interrelated: if you don’t use `pretrained` AND `freeze`, it doesn’t make sense to use `precompute`, because in the example above, if `g` changes because of training, then we can’t use the precomputed values anymore.

However, what *can* make sense is to first start with `pretrained=True` etc., and then once you’ve trained the model that way, unfreeze the layers and allow the learner to tweak those pretrained layers a little bit as well, to get a bit of extra performance. Because maybe those pretrained weights were good, but not perfect, for your particular image task.
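That two-stage recipe can be sketched as a toy "update only the trainable names" loop (all names, learning rates, and update rules here are invented for illustration):

```python
# Stage 1: body frozen, only the head is trainable (like learn.freeze()).
params = {"body": 5.0, "head": 0.0}    # "body" = pretrained weight, "head" = new weight
trainable = {"head"}

def step(lr):
    for name in trainable:
        params[name] += lr             # stand-in for a real gradient update

for _ in range(10):                    # stage 1: only the head moves
    step(lr=0.1)

# Stage 2: unfreeze everything and fine-tune with a smaller step
# (like learn.unfreeze() followed by more training).
trainable = {"body", "head"}
for _ in range(10):
    step(lr=0.01)

# The pretrained body moved only slightly (10 * 0.01), while the head
# absorbed most of the learning (10 * 0.1 plus the fine-tuning steps).
```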