Continue training within the same cycle

Hi, consider the following:
I called fit_one_cycle, let's say for 5 epochs, and found that the metric was still improving when the cycle finished. I would like to train for maybe 3 more epochs, but running fit_one_cycle(3, ...) would start a new cycle. How can I continue training with the same learning rate that the cycle finished at, preferably also within the same cycle (so the LR would keep decreasing)? Or is this not possible?

In my opinion (though I may be wrong), there is no reason you wouldn't achieve the same effect by simply training with a larger number of epochs in the first place.

Or, another way: perform a few more epochs with an additional call like so:
learn.fit_one_cycle(3, slice(1e-5)), or whatever minimal learning rate the previous cycle finished at.
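
To make that concrete, here is a minimal sketch. It assumes fastai v2; the Learner, the MNIST_SAMPLE dataset, and the learning-rate values are only illustrative (the exact keyword names differ slightly between fastai v1 and v2, but positional arguments work in both):

```python
from fastai.vision.all import *

# Illustrative setup: any fastai Learner works here; MNIST_SAMPLE is just a
# small built-in dataset so the sketch runs end to end.
path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18, metrics=accuracy)

# First cycle: 5 epochs at the original max learning rate.
learn.fit_one_cycle(5, 1e-3)

# Metric still improving? Run a second, shorter cycle whose max learning
# rate is close to where the first cycle ended (1e-5 here is illustrative).
learn.fit_one_cycle(3, slice(1e-5))
```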


Yes, unfortunately you cannot continue training within the same cycle: once the cycle has finished, there is no way to extend it.
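
If the goal is just to keep training at roughly the learning rate the cycle ended at, without starting a new schedule, a plain fit call with a small constant learning rate is one option. A sketch, assuming learn is the Learner from the previous cycle and 1e-5 is only an illustrative value:

```python
# Continue for 3 more epochs at a constant, low learning rate instead of
# starting a new one-cycle schedule. Pick a value close to where the
# previous cycle ended.
learn.fit(3, lr=1e-5)
```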


OK, thanks. By minimal learning rate, do you mean the learning rate that was used at the end of the cycle, i.e., not the learning rate passed to learn.fit_one_cycle(epochs, lr)?

Well, it seems that simply continuing training with another fit_one_cycle works just fine. There may be an initial drop in accuracy as the learning rate jumps up, but it stabilizes back to the value reached in the previous cycle fairly quickly and improves from there (at least for this dataset :)). Thanks for your help, guys.
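
For reference, the learning rate the most recent cycle actually finished at can be read back from the recorder. A sketch, assuming fastai v2 and its default Recorder callback (attached out of the box):

```python
# The Recorder callback logs the learning rate used for every batch of the
# most recent fit, so the last entry is the rate the cycle finished at.
final_lr = learn.recorder.lrs[-1]
print(f"learning rate at the end of the last cycle: {final_lr:.2e}")
```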