Why two tables with epoch 0?

Hello all,

Very excited to get started on fast.ai DL course :smiley:. (Thank you Jeremy, Sylvain, Rachel and all fast.ai contributors.)

After training the pet model in chapter 1, there are two tables reporting losses, error rates, and timings. Both are for epoch 0. The second table reports better metrics too. Why are there two tables instead of just one?
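For reference, this is the chapter 1 training code I mean. A minimal sketch (the exact notebook code may differ slightly, e.g. older fastai versions use `cnn_learner` instead of `vision_learner`):

```python
from fastai.vision.all import *

# Download the Oxford-IIIT Pet dataset and build the dataloaders
path = untar_data(URLs.PETS)/'images'

def is_cat(x): return x[0].isupper()  # cat filenames start with an uppercase letter

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))

learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)  # this call prints the two tables, both showing epoch 0
```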

Hey @arunr

When you run `learn.fine_tune(1)`, you are actually training the model twice.

In transfer learning, you replace the last few layers of the model with a new set of layers suited to your classification task (in this example, cats vs. dogs), and then you train the model.

In the first round of training with `fine_tune`, you train only these newly added layers: you update the parameters of the new layers while keeping the pretrained parameters of the other layers intact. In the second round of training, you train all the layers.
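Under the hood, `fine_tune` does roughly the following. This is a simplified sketch of the two phases, not the exact fastai source (the real method also adjusts learning rates between the phases):

```python
def rough_fine_tune(learn, epochs):
    # A simplified sketch of what learn.fine_tune(epochs) does
    learn.freeze()               # phase 1: only the new head is trainable
    learn.fit_one_cycle(1)       # -> first table (epoch 0)
    learn.unfreeze()             # phase 2: every layer is trainable
    learn.fit_one_cycle(epochs)  # -> second table (epoch numbering restarts at 0)
```

Each phase is a separate call to `fit_one_cycle`, and each call prints its own progress table with its own epoch counter starting at 0, which is why you see two tables both labelled epoch 0.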

You can try running `doc(learn.fine_tune)` or `??learn.fine_tune` to read more.

Thank you :smiley:

Just to add more details after reading the book:
When we create a model from a pretrained neural network and plan to fine-tune it (`learn.fine_tune`), fastai automatically freezes the pretrained layers and then:

  1. First, trains the randomly initialized new layers for 1 epoch (1 complete pass over all items/samples), with all the pretrained layers frozen
  2. Second, unfreezes all the layers and trains them for the number of epochs we specified

Note that if we use `fit_one_cycle` instead, it trains the model in a single phase without the fine-tune behaviour, so it only gives us one table of results.
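For example, with the same `learn` object as above (a quick illustrative call, not from the book):

```python
learn.fit_one_cycle(2)  # a single training phase: one table with rows for epoch 0 and epoch 1
```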
