It'd be better to add a `start_epoch` param (or something similar) to each `fit_` function; whenever its value is anything other than 0, it would add the `SkipToEpoch` callback in the call to the inner `fit`.
This would only enhance the current API, and you'd use it like so:
learn = Learner(...)
learn.load("my_checkpoint")  # Learner.load appends ".pth" itself
learn.fit_one_cycle(3, start_epoch=1)
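A minimal, self-contained sketch of how that wiring could look (this is an illustration, not the actual fastai source; the `SkipToEpoch` class and `fit_one_cycle` body here are stand-ins, and a real callback would restore the optimizer/dataloader state too):

```python
class SkipToEpoch:
    "Hypothetical callback: skip training work for epochs before `epoch`."
    def __init__(self, epoch):
        self.epoch = epoch

    def before_epoch(self, current_epoch):
        # Ask the loop to skip this epoch entirely if we haven't
        # reached the resume point yet.
        return current_epoch < self.epoch


def fit_one_cycle(n_epoch, start_epoch=0, cbs=None):
    "Sketch: when start_epoch != 0, inject the skip callback before fitting."
    cbs = list(cbs or [])
    if start_epoch != 0:
        cbs.append(SkipToEpoch(start_epoch))
    ran = []
    for epoch in range(n_epoch):
        skip = any(
            getattr(cb, "before_epoch", lambda e: False)(epoch) for cb in cbs
        )
        if not skip:
            ran.append(epoch)  # the real training step would happen here
    return ran


# "Fit" 3 epochs but resume from epoch 1: only epochs 1 and 2 actually run.
print(fit_one_cycle(3, start_epoch=1))
```

The point is that the caller's API stays unchanged (`start_epoch` simply defaults to 0), and the callback injection happens transparently inside the `fit_` wrapper.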
This is also how it was done in fastai v1.
Also, if you want to hop on the Discord to discuss this more interactively, feel free to join here: fast.ai