Global epoch counter in Learner

I wish there were a global epoch counter in Learner (or anywhere else). Quite often you run multiple fits for a given problem, and Callbacks (and therefore everything that prints, stores, or modifies stuff) use the epoch parameter, which is local to the current fit (e.g. the CSVLogger writes the current epoch and all the metrics).
A global epoch counter, i.e. the cumulative number of epochs over multiple fits, should be available so that Callbacks can access this information.

What do you think? Is this a useful addition?

I just saw the recent TrackEpochCallback that Jeremy pushed, which is in a similar vein, but it does not make the total epoch count available to other callbacks.
See: https://github.com/fastai/fastai/blob/2ae81359b22c0cf6e0f21ecda0d0116756c5a6c2/fastai/callbacks/tracker.py#L131
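For illustration, here is a minimal sketch of what such a callback could look like with the current v1 callback API. The `GlobalEpochCounter` name and the `learn.global_epoch` attribute are just made up for this example; only `LearnerCallback` and `on_epoch_end` are existing fastai pieces:

```python
from fastai.basic_train import LearnerCallback

class GlobalEpochCounter(LearnerCallback):
    "Keep a cumulative epoch count across multiple fits (hypothetical example)."
    def __init__(self, learn):
        super().__init__(learn)
        self.learn.global_epoch = 0    # exposed on the Learner so other callbacks can read it

    def on_epoch_end(self, **kwargs):
        self.learn.global_epoch += 1   # incremented once per epoch, never reset between fits
```

The important bit is attaching a single persistent instance via `learn.callbacks` (rather than `callback_fns`, which would re-instantiate the callback and reset the counter on every fit):

```python
learn.callbacks.append(GlobalEpochCounter(learn))  # persistent instance survives multiple fits
learn.fit_one_cycle(3)
learn.fit(2)
print(learn.global_epoch)                          # -> 5
```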

I have a rework of the callback system planned very soon. I can add a global epoch flag.


That sounds great! Any idea when you're going to do that? Also, what else are you going to change about the callbacks?