New training schedule API in fastai

At Jeremy’s request, I’ve written a new class for the library that adds more flexibility to the way we design training schedules. Instead of having to choose between SGD with restarts, cyclical LRs, or 1cycle, you can design pretty much any schedule you want.

I’ve put together a notebook that works as a tutorial for the new functions, but the basics are very simple. Training is split into phases, and each phase is represented by an object called TrainingPhase. In it you specify the length of the phase, the optimizer function, the hyper-parameters, whether you want the LR to change and, if so, how; the same goes for the momentum.
I tried to make it very easy to read, so a phase can look like this:

TrainingPhase(epochs=2, opt_fn=optim.Adam, lr=(1e-2,1e-3), lr_decay=DecayType.LINEAR)

Once you have written all your training phases (which lets you switch the optimizer, the learning rate, or the way the LR changes as often as you want), you just run

learn.fit_opt_sched(phases)

where phases is the list of your TrainingPhase objects.
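To make the phase idea concrete, here is a toy sketch of how a list of phases could be turned into a per-iteration learning-rate schedule. This is not the fastai implementation; `Phase`, `lr_schedule`, and the `DecayType` enum below are hypothetical stand-ins written just to illustrate the concept of phase-based scheduling with linear decay:

```python
from dataclasses import dataclass
from enum import Enum

class DecayType(Enum):
    # Illustrative decay types; fastai's real DecayType has more options.
    NO = 0
    LINEAR = 1

@dataclass
class Phase:
    # Toy stand-in for TrainingPhase: a length in iterations, a
    # (start, end) LR pair, and how to interpolate between them.
    n_iter: int
    lr: tuple  # (start_lr, end_lr)
    lr_decay: DecayType = DecayType.NO

def lr_schedule(phases):
    """Yield the learning rate used at every iteration, phase by phase."""
    for phase in phases:
        start, end = phase.lr
        for i in range(phase.n_iter):
            if phase.lr_decay is DecayType.LINEAR and phase.n_iter > 1:
                pct = i / (phase.n_iter - 1)  # 0 at phase start, 1 at its end
                yield start + pct * (end - start)
            else:
                yield start  # constant LR within the phase

# A constant warm-up phase followed by a linear decay phase.
phases = [
    Phase(n_iter=3, lr=(1e-3, 1e-3)),
    Phase(n_iter=5, lr=(1e-2, 1e-3), lr_decay=DecayType.LINEAR),
]
lrs = list(lr_schedule(phases))
```

The point is that each phase is self-contained, so mixing, say, a constant phase, a linear ramp, and a decay is just a matter of appending more objects to the list.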

Hope it can help!


Wow! Thanks dude!

This is awesome! I had to do some very hacky coding to replicate the schedule for the SWA paper, and this would make it really straightforward now.

This is very useful thank you :grin:

How do I use this with the fastai v1 API?