Fine-tuning results per epoch depend on the total number of epochs

I ran the following two pieces of code:

from fastai.vision.all import *  # set_seed, ImageDataLoaders, cnn_learner, ...

set_seed(42, True)
dls = ImageDataLoaders.from_name_func(path, get_image_files(path), valid_pct=0.2,
        label_func=is_tb, item_tfms=Resize(224))
set_seed(42, True)
learn = cnn_learner(dls, resnet34, metrics=[accuracy])
learn.fine_tune(1, freeze_epochs=1)

and

set_seed(42, True)
dls = ImageDataLoaders.from_name_func(path, get_image_files(path), valid_pct=0.2,
        label_func=is_tb, item_tfms=Resize(224))
set_seed(42, True)
learn = cnn_learner(dls, resnet34, metrics=[accuracy])
learn.fine_tune(1, freeze_epochs=2)

and the results for epoch 0 differ between the two runs. Specifically, I got

[screenshot: training output with freeze_epochs=1]

and

[screenshot: training output with freeze_epochs=2]
I do not see why the results for the same epoch should be different. Note that I am setting the seed before loading the data as well as before creating the learner, so the results should be reproducible, and indeed they are for a given value of freeze_epochs. However, I would have expected the results of epoch 0 to be independent of the number of frozen epochs.

Hi @borundev
fine_tune() uses the fit_one_cycle method, and therefore one-cycle scheduling, under the hood. One-cycle scheduling adapts the learning rate over the course of training, as shown in this picture:

[plots: one-cycle schedules for the learning rate and momentum, from the fastai docs]
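At the time of this thread, Learner.fine_tune was roughly equivalent to the sketch below (simplified; the real method also passes pct_start and other keyword arguments through to fit_one_cycle). It shows why freeze_epochs matters for epoch 0: it sets the length over which the frozen phase's one-cycle schedule is stretched, so the learning rates seen during epoch 0 change with it.

```python
def fine_tune_sketch(learn, epochs, base_lr=2e-3, freeze_epochs=1, lr_mult=100):
    # Phase 1: body frozen. The one-cycle schedule spans `freeze_epochs`
    # epochs, so the lr applied during epoch 0 depends on freeze_epochs.
    learn.freeze()
    learn.fit_one_cycle(freeze_epochs, slice(base_lr))
    # Phase 2: everything unfrozen, discriminative lrs, halved base lr.
    base_lr /= 2
    learn.unfreeze()
    learn.fit_one_cycle(epochs, slice(base_lr / lr_mult, base_lr))
```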
If you don’t like this behavior you could write your own fine_tune method using learner.fit() and learner.freeze() / unfreeze().
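One minimal way to do that, assuming a fastai Learner (the function name flat_fine_tune and the lr choices are just illustrative, not a fastai API):

```python
def flat_fine_tune(learn, epochs, base_lr=1e-3, freeze_epochs=1):
    # Train the head at a constant lr while the body is frozen, then
    # unfreeze and keep training at a lower constant lr. With no
    # schedule, each epoch's lr no longer depends on how many epochs
    # the phase lasts.
    learn.freeze()
    learn.fit(freeze_epochs, base_lr)
    learn.unfreeze()
    learn.fit(epochs, base_lr / 10)
```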

Where are these plots from? The x-axis is not epochs, so I do not understand what the learning-rate adjustment policy is. Can you point me to the relevant piece of code? I am new to fastai, and with all the decorators it's hard to find the relevant info.

The plots are from the fastai docs. The source code is available from there as well (also for Learner.fine_tune() and fit_one_cycle()).
Sorry, I should have explained the graphs a bit better; I can see how the x-axis can be confusing. The x-axis is the scheduler's iterations: the scheduler steps once per batch, so one fit call sweeps the whole schedule across all batches of all its epochs. The second graph shows another optimizer hyperparameter (the momentum) that the scheduler adapts in a similar fashion.
I can totally relate that it is quite hard to see through all the magic fastai does under the hood at the beginning; I still have some trouble with this myself.
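The schedule shape can be sketched in plain Python, using cosine interpolation and defaults matching fastai's fit_one_cycle at the time (lr_max warm-up over pct_start=0.25 of the iterations from lr_max/div, then annealing to lr_max/div_final); the helper names here are illustrative, not fastai's:

```python
import math

def sched_cos(start, end, pct):
    # Cosine interpolation from `start` to `end` as pct goes 0 -> 1,
    # the same shape fastai uses for each one-cycle phase.
    return start + (1 + math.cos(math.pi * (1 - pct))) * (end - start) / 2

def one_cycle_lr(it, total_iters, lr_max=1e-3, pct_start=0.25, div=25.0, div_final=1e5):
    # Warm up from lr_max/div to lr_max, then anneal down to lr_max/div_final.
    pct = it / total_iters
    if pct < pct_start:
        return sched_cos(lr_max / div, lr_max, pct / pct_start)
    return sched_cos(lr_max, lr_max / div_final, (pct - pct_start) / (1 - pct_start))

# The lr at the very first iteration is the same, but the lrs seen later in
# epoch 0 depend on the *total* length of the schedule:
iters_per_epoch = 100
end_of_epoch0_1ep = one_cycle_lr(99, 1 * iters_per_epoch)  # freeze_epochs=1
end_of_epoch0_2ep = one_cycle_lr(99, 2 * iters_per_epoch)  # freeze_epochs=2
print(end_of_epoch0_1ep != end_of_epoch0_2ep)  # prints True
```

This is exactly why epoch 0 differs between freeze_epochs=1 and freeze_epochs=2: the same epoch sits at a different fraction of the schedule.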

Thanks, this is very helpful.