Train_loss, valid_loss, error_rate on a saved/loaded model

(Apoorv Parle) #1

From the first lesson: when I do learn.fit_one_cycle(...), the metrics train_loss, valid_loss, and error_rate are printed out.
But if I’m loading a previously saved model, how do I calculate just the metrics again, or print them directly if they are stored with the model? I don’t want to retrain the model; I just want to print the metrics for the loaded model. And if I understand correctly, calling fit or fit_one_cycle will retrain the model for an epoch.
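For what it’s worth, a minimal sketch of one way to do this (assuming a fastai Learner and a saved model named 'stage-1' — both placeholders): `learn.validate()` runs a single pass over the validation set and returns the loss plus any metrics, with no training. The fastai calls are shown as comments; the `error_rate` metric itself is just 1 − accuracy, which the plain-Python function below illustrates:

```python
# Sketch, assuming fastai is installed and 'stage-1' is your saved model name:
#
#   learn.load('stage-1')
#   valid_loss, err = learn.validate()   # one pass over the validation set, no training
#
# fastai's error_rate is simply 1 - accuracy; a plain-Python equivalent:
def error_rate(preds, targets):
    """Fraction of predictions that differ from their targets."""
    wrong = sum(p != t for p, t in zip(preds, targets))
    return wrong / len(targets)

print(error_rate([0, 1, 1, 2], [0, 1, 2, 2]))  # 0.25 (one of four is wrong)
```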

1 Like

(Vishesh Dembla) #2

Check out the page from PyTorch tutorials

https://pytorch.org/tutorials/beginner/saving_loading_models.html#saving-loading-a-general-checkpoint-for-inference-and-or-resuming-training

It shows that you can save the loss alongside the weights in a checkpoint and load it back later. I think it should work with fastai models as well, but I’m not sure — you might have to try it out.
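The pattern in that tutorial is to save a checkpoint dict that bundles the state dict with the metrics, e.g. `torch.save({'model_state_dict': ..., 'loss': ...}, PATH)`. A stdlib-only sketch of the same idea (pickle standing in for torch.save/torch.load; the field names and values are made up for illustration):

```python
import os
import pickle
import tempfile

# Bundle the weights with the metrics in one checkpoint dict, as the
# PyTorch tutorial does with torch.save(...).
checkpoint = {
    "epoch": 4,
    "model_state_dict": {"w": [0.1, 0.2]},  # placeholder for real weights
    "valid_loss": 0.2317,                    # made-up value
}

path = os.path.join(tempfile.mkdtemp(), "checkpoint.pkl")
with open(path, "wb") as f:
    pickle.dump(checkpoint, f)

# Later: load it back and read the stored loss, with no retraining.
with open(path, "rb") as f:
    restored = pickle.load(f)
print(restored["valid_loss"])  # 0.2317
```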

0 Likes

(Apoorv Parle) #3

Isn’t there a way to just re-calculate them on the dataset? Maybe as part of the ClassificationInterpretation object?

0 Likes

(Andrei Ungureanu) #4

I had the same problem and used the confusion matrix from the ClassificationInterpretation object to calculate the error_rate

0 Likes

(Apoorv Parle) #5

Can you share your code here?

0 Likes

(Andrei Ungureanu) #6

Yeah, sure. Just use this (computing the confusion matrix once):

cm = interp.confusion_matrix()
round(1 - cm.diagonal().sum() / cm.sum(), 6)
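To spell out why this works: the diagonal of a confusion matrix counts the correct predictions, so error rate is 1 minus correct over total. A self-contained version with a made-up 3-class matrix (numpy, for illustration):

```python
import numpy as np

def error_rate_from_cm(cm):
    """Error rate = 1 - (correct / total).
    The diagonal of a confusion matrix holds the correct predictions."""
    return round(1 - cm.diagonal().sum() / cm.sum(), 6)

# Example confusion matrix for a 3-class problem (rows: actual, cols: predicted)
cm = np.array([[50,  2,  3],
               [ 4, 40,  1],
               [ 2,  3, 45]])
print(error_rate_from_cm(cm))  # 0.1 (135 correct out of 150)
```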

1 Like

(Aisha Khatun) #7

Has anyone figured out how to do it in fastai, or whether it’s possible?

0 Likes