Getting Current Train/Valid Loss Of A Learner

I have a learner I’m training. Is there any way to get its current training and validation loss? Something like this would be ideal:

TrainLoss, ValidLoss = learn.loss()

Does this exist within FastAI? I’m sure I’m just missing something. Thank you all in advance.


you can use:

result, ep_vals = learn.fit(…, get_ep_vals=True)

Then you can get training and validation loss for each epoch.

You can see an example just before the Test section heading where I plot these to get a feel for how things are progressing.

Credit: @adrian

Thank you for the reply. Unfortunately, this isn’t working for me. It seems fit doesn’t accept get_ep_vals as an argument. The fit function appears to be:

def fit(self, n_epoch, lr=None, wd=None, cbs=None, reset_opt=False):
        with self.added_cbs(cbs):
            if reset_opt or not self.opt: self.create_opt()
            if wd is None: wd = self.wd
            if wd is not None: self.opt.set_hypers(wd=wd)
            self.opt.set_hypers(lr=self.lr if lr is None else lr)
            self.n_epoch = n_epoch
            self._with_events(self._do_fit, 'fit', CancelFitException, self._end_cleanup)
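
Note that this newer fit signature takes a cbs (callbacks) argument rather than get_ep_vals; in fastai v2, per-epoch losses are collected by callbacks. Here is a framework-free sketch of that pattern — Recorder, Learner, and after_epoch below are toy stand-ins written for illustration, not fastai's actual classes:

```python
# Toy sketch of the callback pattern fastai v2 uses: a callback object
# hooks into the training loop and records losses after every epoch.

class Recorder:
    """Collects one (train_loss, valid_loss) pair per epoch."""
    def __init__(self):
        self.values = []

    def after_epoch(self, train_loss, valid_loss):
        self.values.append((train_loss, valid_loss))

class Learner:
    """Minimal stand-in for a learner that notifies callbacks each epoch."""
    def __init__(self, cbs):
        self.cbs = cbs

    def fit(self, n_epoch):
        for ep in range(n_epoch):
            # Stand-in for real training/validation; losses just decay here.
            train_loss = 1.0 / (ep + 1)
            valid_loss = 1.2 / (ep + 1)
            for cb in self.cbs:
                cb.after_epoch(train_loss, valid_loss)

rec = Recorder()
Learner([rec]).fit(3)
print(len(rec.values))   # prints 3: one entry per epoch
```

The point is that the losses live on the callback object after training, which is exactly how you would then read them back out.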

Upon checking your linked GitHub repository, the notebook does not load; it shows an error instead.

Download the whole repository, navigate to that specific file, and check the code — I can’t upload the .ipynb file directly here, but you will find what you need there.

Or maybe this helps (a code snippet from the repository, line 111):

vals_s2s, ep_vals_s2s = learn.fit(…, n_cycle=1, cycle_len=12, use_clr=(20,10), get_ep_vals=True)

import matplotlib.pyplot as plt

def plot_ep_vals(ep_vals):
    """Plot per-epoch training and validation losses."""
    epochs = ep_vals.keys()
    trn_losses = [item[0] for item in list(ep_vals.values())]  # first entry: train loss
    val_losses = [item[1] for item in list(ep_vals.values())]  # second entry: valid loss
    plt.plot(epochs, trn_losses, c='b', label='train')
    plt.plot(epochs, val_losses, c='r', label='validation')
    plt.legend(loc='upper left')

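For reference, plot_ep_vals assumes ep_vals maps each epoch number to a (train_loss, valid_loss) pair. A quick sketch with made-up numbers (the dict contents here are purely illustrative) shows how the two loss lists are pulled apart:

```python
# Hypothetical ep_vals, in the shape the plotting function above expects:
# {epoch: (train_loss, valid_loss)}
ep_vals = {1: (0.95, 1.10), 2: (0.62, 0.80), 3: (0.41, 0.66)}

trn_losses = [v[0] for v in ep_vals.values()]  # training losses per epoch
val_losses = [v[1] for v in ep_vals.values()]  # validation losses per epoch
print(trn_losses)   # [0.95, 0.62, 0.41]
print(val_losses)   # [1.1, 0.8, 0.66]
```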
I hope this helps

learn.recorder.loss has it IIRC. It’s either loss or losses.
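
If I recall correctly, fastai v2’s Recorder stores one row of [train_loss, valid_loss, *metrics] per epoch in its values attribute. Assuming that shape, a small helper can pull out the latest pair — DummyRecorder below is a stand-in for learn.recorder, included only so the snippet runs on its own:

```python
# Assumes recorder.values holds one [train_loss, valid_loss, ...] row per
# epoch, as fastai v2's Recorder does (IIRC).

def latest_losses(recorder):
    """Return (train_loss, valid_loss) from the most recent epoch."""
    train_loss, valid_loss = recorder.values[-1][:2]
    return train_loss, valid_loss

class DummyRecorder:                 # stand-in for learn.recorder
    values = [[0.9, 1.1], [0.5, 0.7]]

print(latest_losses(DummyRecorder()))   # (0.5, 0.7)
```

With a real learner you would call latest_losses(learn.recorder) after fitting.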
