Skipping validation loss calculation

Hello,

I would sometimes like to skip the validation loss calculation at the end of each epoch to save time.

Is there a better way to do so than:

learn.data.valid_dl = None

(this seems to work, but it isn’t ideal)

If I pass None as valid_ds when creating my DataBunch, a validation loss is still calculated (I imagine on the training data).

Ideally, the validation loss would be calculated only every n epochs. That seems like a whole other story, but if it’s easy to do, please let me know.

Many thanks!


For now there is no better way than what you describe.
We’ll have a new (and more flexible) version of the Callback system after the second course ends at the beginning of May, which will make this easier.

Note that in a Callback you can store the valid_dl, set it to None, and put it back every n epochs so that validation only runs then. See the sketch below.
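Here is a minimal sketch of such a callback for fastai v1. The class name ValidateEveryN is made up (it is not a built-in), and it assumes the standard v1 LearnerCallback API, where on_epoch_begin receives the current (0-based) epoch and validation runs at the end of each epoch:

from fastai.basic_train import LearnerCallback

class ValidateEveryN(LearnerCallback):
    "Hypothetical callback: run validation only on every `n`-th epoch."
    def __init__(self, learn, n:int=5):
        super().__init__(learn)
        self.n = n

    def on_train_begin(self, **kwargs):
        # Stash the real valid_dl so it can be restored later
        self.stored_valid_dl = self.learn.data.valid_dl

    def on_epoch_begin(self, epoch:int, **kwargs):
        # Validation happens at the end of the epoch, so toggle the dl here:
        # restore it on every n-th epoch, hide it otherwise
        if (epoch + 1) % self.n == 0:
            self.learn.data.valid_dl = self.stored_valid_dl
        else:
            self.learn.data.valid_dl = None

    def on_train_end(self, **kwargs):
        # Put the valid_dl back so the Learner is left in a usable state
        self.learn.data.valid_dl = self.stored_valid_dl

You would then pass it to a fit call, e.g. learn.fit_one_cycle(10, callbacks=[ValidateEveryN(learn, n=2)]).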
