Save model with lowest validation loss

I’m learning fastai.

When using fit_one_cycle, does fastai automatically keep the model with the lowest validation loss?
If not, how can I retrieve the model with the lowest validation loss?

data = ImageDataLoaders.from_folder(data_directory, item_tfms=Resize(16), batch_tfms=batch_tfms)
learn = vision_learner(data, arch, metrics=error_rate)

[Screenshot of training output, 2022-08-02, showing per-epoch losses and error_rate]


Hopefully someone can add a more authoritative response. I actually asked the same question on the forums during the live course, but sorry, I can’t find the post. The general answer was “don’t do that”, because it is akin to overfitting: some epochs “randomly” fit the corners of your validation data better without the model having learnt the general rules required to make good predictions on new data outside its training and validation sets.

Check out this post by @meanpenguin:

Considering your bouncing error_rate, it might be:

  • overfitting from training too much on limited data, in which case the data augmentation techniques described in the lesson videos may help
  • or too high a learning rate, such that you are bouncing out of the minimum and up the sides of the loss valley, in which case the Learning Rate Finder (learn.lr_find()) may help

[Edit:] A naive side question for my own learning: why are you using fit_one_cycle()?
Throughout the course I only saw Jeremy use fine_tune().
My understanding is that with fit_one_cycle() you get less benefit from transfer learning.
Here is a related post.
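To illustrate the difference: per the fastai docs, fine_tune() is roughly a freeze/fit/unfreeze/fit sequence built on top of fit_one_cycle(), so calling fit_one_cycle() directly skips the frozen warm-up of the new head. Below is a toy stand-in (not a real Learner; the method names mirror fastai's API, and the real fine_tune also adjusts learning rates between the phases) that just records the call sequence:

```python
# Toy stand-in for a fastai Learner, used only to show the call
# sequence that fine_tune() makes. Names mirror fastai's API, but
# this is an illustrative sketch, not fastai code.
class ToyLearner:
    def __init__(self):
        self.calls = []

    def freeze(self):
        self.calls.append("freeze")

    def unfreeze(self):
        self.calls.append("unfreeze")

    def fit_one_cycle(self, epochs):
        self.calls.append(f"fit_one_cycle({epochs})")

    def fine_tune(self, epochs, freeze_epochs=1):
        # Phase 1: train only the new head while the pretrained body is frozen.
        self.freeze()
        self.fit_one_cycle(freeze_epochs)
        # Phase 2: unfreeze the pretrained body and train the whole network.
        self.unfreeze()
        self.fit_one_cycle(epochs)


learn = ToyLearner()
learn.fine_tune(3)
print(learn.calls)
# → ['freeze', 'fit_one_cycle(1)', 'unfreeze', 'fit_one_cycle(3)']
```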


In another thread I saw something that might relate to your question…



If, despite bencoman’s explanation, you would like to retrieve the model with the lowest validation loss, you could use fastai’s SaveModelCallback. It monitors a particular statistic during training (e.g., validation loss) and saves the best iteration of the network. Please refer to the documentation for more information.
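For reference, the callback is passed via the cbs argument, e.g. learn.fit_one_cycle(10, cbs=SaveModelCallback(monitor='valid_loss')), and (if I recall the docs correctly) it reloads the best weights at the end of training. Under the hood the idea is simply to track the best value of the monitored metric and snapshot the weights whenever it improves. Here is a minimal framework-free sketch of that logic, with fake per-epoch losses and a plain dict standing in for real weights (all names here are illustrative, not fastai's):

```python
import copy

def train_keep_best(model_state, epoch_losses):
    """Sketch of SaveModelCallback's core idea: after each epoch,
    snapshot the weights whenever the monitored value (here a
    validation loss) improves on the best seen so far."""
    best_loss = float("inf")
    best_state = None
    for epoch, loss in enumerate(epoch_losses):
        # Stand-in for a real training step mutating the weights.
        model_state["epoch"] = epoch
        if loss < best_loss:
            best_loss = loss
            # In practice this would be torch.save / learn.save.
            best_state = copy.deepcopy(model_state)
    return best_state, best_loss


# Losses that bounce: epoch 2 is best even though training continued.
best, loss = train_keep_best({"epoch": -1}, [0.9, 0.6, 0.4, 0.7, 0.5])
print(best, loss)  # → {'epoch': 2} 0.4
```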

Please let me know if you have any other questions.