Global epoch counter in Learner

I wish there were a global epoch counter in Learner (or anywhere else). Quite often you run multiple fits for a given problem. Callbacks (and therefore all the things that print, store, or modify stuff) use the epoch parameter, which is local to the current fit (e.g. the CSVLogger writes the current epoch and all the metrics).
A global epoch counter, i.e. the cumulative number of epochs over multiple fits, should be available so that callbacks can access this information.
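For concreteness, here is a minimal sketch of what such a callback could look like, written against the fastai v1 callback API (`LearnerCallback` from `fastai.basic_train`). The `GlobalEpochCounter` name and the `global_epoch` attribute are made up for illustration, not part of the library:

```python
from fastai.basic_train import Learner, LearnerCallback

class GlobalEpochCounter(LearnerCallback):
    "Hypothetical callback that keeps a cumulative epoch count across fits."
    def __init__(self, learn: Learner):
        super().__init__(learn)
        # Initialize only once, so the count survives repeated fit() calls.
        if not hasattr(learn, 'global_epoch'):
            learn.global_epoch = 0

    def on_epoch_end(self, **kwargs) -> None:
        # The `epoch` passed in kwargs is local to the current fit,
        # so we accumulate our own running total on the Learner instead.
        self.learn.global_epoch += 1
```

After `learn.callbacks.append(GlobalEpochCounter(learn))`, other callbacks (a CSVLogger, for instance) could read `learn.global_epoch` instead of the per-fit `epoch` value.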

What do you think? Is this a useful addition?

I just saw the recent TrackEpochCallback that Jeremy pushed, which is in a similar vein but does not make the total epoch count available to other callbacks.
See: https://github.com/fastai/fastai/blob/2ae81359b22c0cf6e0f21ecda0d0116756c5a6c2/fastai/callbacks/tracker.py#L131

I have a rework of the callback system planned very soon. I can add a global epoch flag.


That sounds great! Any idea when you’re gonna do that? Also what else are you going to change about the callbacks?

An epoch is defined as one full training cycle of the machine learning model over all the training data. Observing an enormous metric discrepancy between epoch 99 and epoch 100 reveals that the model is already overfitting. As a general rule, the optimal number of epochs is between 1 and 10 and is reached when the accuracy stops improving. The right number of epochs depends on the inherent complexity of your dataset. A good rule of thumb is to start with a value that is 3 times the number of columns in your data. If you find that the model is still improving after all epochs complete, try again with a higher value.
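Incidentally, the "stop when accuracy stops improving" advice above is what fastai v1's `EarlyStoppingCallback` (in `fastai.callbacks.tracker`, the same module as `TrackEpochCallback`) automates. A rough usage sketch; the dataset, model, and parameter values are just placeholders:

```python
from fastai.vision import untar_data, URLs, ImageDataBunch, cnn_learner, models, accuracy
from fastai.callbacks.tracker import EarlyStoppingCallback

path = untar_data(URLs.MNIST_SAMPLE)
data = ImageDataBunch.from_folder(path)
learn = cnn_learner(data, models.resnet18, metrics=[accuracy])

# Stop once accuracy has not improved by at least `min_delta`
# for `patience` consecutive epochs; 50 is just an upper bound.
learn.fit_one_cycle(50, callbacks=[
    EarlyStoppingCallback(learn, monitor='accuracy', min_delta=0.01, patience=3),
])
```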