How to swap out optimizers during training?

I wanted to test switching optimizers mid-training, but I can't get the current optimizer to unload so that the new one is used.

I did:
learn.opt_func = new_optimizer

but the old one continues to run. (This is fastai v1, for reference.)

We can swap datasets via a similar assignment ( = new_data), but the optimizer is clearly handled differently.

Is there another step? I'd like to avoid having to stop training, save the model without the optimizer state, and reload :slight_smile:


@LessW2020 try working with learn.opt instead. The source in Learner has this hint:

def create_opt(self, lr:Floats, wd:Floats=0.)->None:
        "Create optimizer with `lr` learning rate and `wd` weight decay."
        self.opt = OptimWrapper.create(self.opt_func, lr, self.layer_groups, wd=wd, true_wd=self.true_wd, bn_wd=self.bn_wd)

If you also look at fit, you'll see it uses self.opt.
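To make the mechanism concrete, here is a toy sketch of why reassigning opt_func alone does nothing: fit works off the already-built self.opt wrapper, so you have to rebuild it. All class names below are made-up stand-ins that only mirror the shape of the v1 API (Learner, OptimWrapper.create, create_opt), not fastai's real classes:

```python
class FakeOptimWrapper:
    "Stand-in for OptimWrapper: holds the opt_func plus hyper-parameter state."
    def __init__(self, opt_func, lr):
        self.opt_func, self.lr = opt_func, lr
        self.stats = {}

    @classmethod
    def create(cls, opt_func, lr):
        return cls(opt_func, lr)

    def set_stat(self, k, v):
        self.stats[k] = v


class FakeLearner:
    "Stand-in for Learner: fit() would use self.opt, built from self.opt_func."
    def __init__(self, opt_func, lr=1e-3):
        self.opt_func, self.lr = opt_func, lr
        self.create_opt(lr)

    def create_opt(self, lr):
        # The key step: self.opt is rebuilt from the current opt_func.
        self.opt = FakeOptimWrapper.create(self.opt_func, lr)


def sgd(): pass     # placeholder optimizer factories
def adamw(): pass

learn = FakeLearner(opt_func=sgd)
learn.opt_func = adamw              # swapping opt_func alone is not enough...
assert learn.opt.opt_func is sgd    # ...the old wrapper is still live
learn.create_opt(learn.lr)          # rebuild the wrapper
assert learn.opt.opt_func is adamw  # now training would use the new one
```

In real fastai v1 the equivalent move would be setting learn.opt_func and then calling learn.create_opt(lr) so learn.opt gets rebuilt.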


Awesome, thanks a ton @muellerzr!


Unfortunately, after swapping it complains about a missing set_stat:

~/anaconda3/lib/python3.7/site-packages/fastai/callbacks/ in on_train_begin(self, epoch, **kwargs)
     30         for k,v in self.scheds[0].items():
     31             v.restart()
---> 32             self.opt.set_stat(k, v.start)
     33         self.idx_s = 0
     34         return res

AttributeError: type object 'AdamW' has no attribute 'set_stat'


Saving with with_opt=False and then reloading doesn't work either... you end up with a model that will not run: no matter what optimizer you point it to, you get the same missing set_stat attribute error...

So it looks like you can't change optimizers??

You can change optimizers. It looks like you are trying to use a PyTorch one, though, which needs to be wrapped inside a class to work with fastai2 (Adam in v2 is AdamW, by the way). The best way to do this would be:

self.learn.opt_func = new_fastai_opt_func
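To see why the raw PyTorch class fails, here is a toy illustration of the failure mode from the traceback above: the scheduler callback calls opt.set_stat(...), which exists on fastai's optimizer wrapper, not on a bare optimizer class like torch.optim.AdamW. The class names below are hypothetical stand-ins, not fastai or PyTorch code:

```python
class RawAdamW:
    "Stand-in for a bare torch.optim.AdamW class assigned directly to learn.opt."
    pass


class Wrapper:
    "Stand-in for a fastai optimizer wrapper, which does expose set_stat."
    def __init__(self):
        self.stats = {}

    def set_stat(self, k, v):
        self.stats[k] = v


def on_train_begin(opt):
    # Roughly what the scheduler callback does at the start of training.
    opt.set_stat('lr', 1e-3)


try:
    on_train_begin(RawAdamW)    # the bare class has no set_stat...
    err = ''
except AttributeError as e:
    err = str(e)                # ...so this mirrors the thread's error

on_train_begin(Wrapper())       # a wrapped optimizer handles it fine
```

So the fix is exactly the reply above: assign a fastai-style opt_func and let the library build the wrapper, rather than pointing learn.opt at a raw PyTorch class.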

Thanks very much @sgugger!