Save and load model

When I save a model using learn.save('first', with_opt=True) and then load it with learn.load('first', with_opt=True) to train for a few more epochs, the validation loss generally increases for the first few epochs. The validation loss after the first epoch following the load is usually higher than the loss at the time the model was saved.
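
Roughly, the steps look like this (a simplified sketch using fastai v2-style imports; the DataLoaders and architecture here are just placeholders for my actual setup):

    from fastai.vision.all import *

    # Placeholder Learner; my actual data and architecture differ
    learn = cnn_learner(dls, resnet34, metrics=error_rate)

    learn.fit_one_cycle(4)
    learn.save('first', with_opt=True)          # save weights + optimizer state

    # Later, after rebuilding the same Learner:
    learn = learn.load('first', with_opt=True)  # restore weights + optimizer state
    learn.fit_one_cycle(4)                      # validation loss rises for the first few epochs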

What am I doing wrong here?

Hey,
When using fit_one_cycle it is common for the test/validation error to oscillate a little. I do not believe the issue you are facing is caused by saving the model. The original paper also mentions this phenomenon, where the validation loss sometimes increases before decreasing again.
For example, with the ResNet-50 pets classifier from Lecture 1, the validation loss also oscillated a bit, at least in my case:

epoch  train_loss  valid_loss  error_rate  time
1      0.548006    0.268912    0.076455    (00:57)
2      0.365533    0.193667    0.064953    (00:51)
3      0.336032    0.211020    0.073072    (00:51)
4      0.263173    0.212025    0.060893    (00:51)
5      0.217016    0.183195    0.063599    (00:51)
6      0.161002    0.167274    0.048038    (00:51)
7      0.086668    0.143490    0.044655    (00:51)
8      0.082288    0.154927    0.046008    (00:51)

As you can see, the validation loss increased between epochs 2 and 3, and this is Jeremy's code.

This image is from the original paper (arXiv:1803.09820): even there the test accuracy oscillates quite a bit before it converges.
In short, I do not believe this issue is due to saving and loading the model at all, but is a property of fit_one_cycle itself.
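
If you want to double-check that saving and loading are not losing anything, you can compare the validation loss right before saving and right after loading, before any further training. This is a rough sketch, assuming a Learner with a single metric such as error_rate:

    # Validation loss/metric just before saving
    loss_before, err_before = learn.validate()
    learn.save('first', with_opt=True)

    # Fresh load, then validate again without any training
    learn = learn.load('first', with_opt=True)
    loss_after, err_after = learn.validate()

    print(loss_before, loss_after)   # these should match

If those two numbers agree, any later increase comes from fit_one_cycle itself: the one-cycle schedule warms the learning rate back up at the start of each call, which can temporarily push the validation loss higher before it comes back down.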

Regards,
Jayam
