Display only last train_loss valid_loss

(Sachin) #1

I am hoping to only see the last train_loss and valid_loss when I do learn.fit(epochs=50). Is there a way to do this in fastai? Right now the per-epoch table it prints takes up too much room, and I’d rather just print out the last state instead.

Optional: it would be even better to have something like print(f'Epoch {i}, train loss {train_loss}, val loss {val_loss}', end='\r') so that I can see the numbers changing in place, but this would be a bonus.
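For what it’s worth, the single-updating-line part doesn’t need fastai at all; a carriage return in plain Python does it. A minimal sketch (the report function and its values are made up for illustration):

```python
def report(epoch, n_epochs, train_loss, val_loss):
    # end='\r' returns the cursor to the start of the line, so the
    # next call overwrites this report instead of printing a new row.
    msg = f'Epoch {epoch}/{n_epochs}, train loss {train_loss:.4f}, val loss {val_loss:.4f}'
    print(msg, end='\r', flush=True)
    return msg

for i in range(1, 4):
    report(i, 3, 1.0 / i, 1.2 / i)
print()  # move to a fresh line once the loop is done
```

flush=True matters here: without a newline, some terminals buffer the output and the line appears to freeze.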


(Hudson) #2

Loss will not work in this case.


(Sachin) #3

Can you elaborate on that? Not sure what you mean, or what you are proposing I do.


(Sachin) #4

Ok, so I figured it out after some digging. Before I explain, can I just say it was a pleasure digging through the fastai code? Most things were super easy to find.

The first part was that I had to do learner = Learner(..., silent=True).

For the second, optional part I needed a callback:

import torch

def get_loss(model, dl, loss_func):
    "Calculate `loss_func` of `model` on `dl` in evaluation mode."
    model.eval()
    with torch.no_grad():
        loss, N = 0., 0
        for xb, yb in dl:
            # Models with multiple inputs receive a list of tensors.
            if isinstance(xb, list):
                batch_loss = loss_func(model(*xb), yb)
            else:
                batch_loss = loss_func(model(xb), yb)
            # Weight each batch's mean loss by its size so the final
            # average is exact even if the last batch is smaller.
            loss += batch_loss.item() * len(yb)
            N += len(yb)
        return loss / N

class PrintLoss(LearnerCallback):
    def on_epoch_end(self, **kwargs):
        train_loss = get_loss(self.learn.model, self.learn.data.train_dl, loss_func=self.learn.loss_func)
        val_loss = get_loss(self.learn.model, self.learn.data.valid_dl, loss_func=self.learn.loss_func)

        epoch, n_epochs = kwargs['epoch'], kwargs['n_epochs']

        # '\r' keeps everything on one line that updates in place.
        print(f'Epoch {epoch+1}/{n_epochs} Training Loss: {train_loss:.4f}, Validation Loss: {val_loss:.4f}', end='\r')
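One detail worth calling out: get_loss weights each batch's mean loss by its batch size before dividing by the total sample count, which gives the exact per-sample mean even when the last batch is smaller than the rest. Stripped of the model and dataloader, that bookkeeping is just:

```python
def weighted_mean(batch_losses, batch_sizes):
    # batch_losses are per-batch *mean* losses; multiplying each by
    # its batch size and dividing by the total sample count recovers
    # the exact mean over all samples.
    total = sum(l * n for l, n in zip(batch_losses, batch_sizes))
    return total / sum(batch_sizes)

# With batches of sizes 3 and 1 and mean losses 1.0 and 2.0, the
# naive average of the two numbers would be 1.5, while the true
# per-sample mean is (3*1.0 + 1*2.0) / 4 = 1.25.
```

Assuming fastai v1's callback_fns argument, the callback would then be attached with something like Learner(..., silent=True, callback_fns=[PrintLoss]).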