New AdamW optimizer now available

I didn’t have any problem with

opt_fn = partial(optim.Adam, amsgrad=True)

(or whatever optimizer and keyword arguments you want to use).
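For reference, here is a self-contained version of that snippet; the dummy parameter list at the end is only an illustration of how the Learner will eventually call the factory, not code from fastai:

from functools import partial

import torch
import torch.optim as optim

# opt_fn is a factory: amsgrad is pre-bound now, while the model
# parameters and learning rate are supplied later by the Learner.
opt_fn = partial(optim.Adam, amsgrad=True)

# Calling the factory the way the Learner would (dummy parameters):
params = [torch.nn.Parameter(torch.zeros(3))]
optimizer = opt_fn(params, lr=0.01)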


The reason is that the optimizer instance is created inside the Learner by calling opt_fn and passing it all the needed parameters, so opt_fn must be a callable that returns an optimizer (not an optimizer instance, as you passed).
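To make that concrete, here is a rough sketch of the pattern; this is not fastai's actual Learner code, just an illustration of why a factory is needed:

import torch
import torch.optim as optim
from functools import partial

class Learner:
    # Illustrative stand-in for fastai's Learner, not its real code.
    def __init__(self, model, opt_fn):
        self.model = model
        self.opt_fn = opt_fn  # stored as a callable, invoked later

    def fit(self, lr):
        # The optimizer is built here, inside the Learner, with the
        # model parameters and learning rate it manages; this is why
        # a pre-built optimizer instance cannot be passed in.
        opt = self.opt_fn(self.model.parameters(), lr=lr)
        # ... the training loop using opt would follow here ...

model = torch.nn.Linear(4, 2)
learn = Learner(model, partial(optim.Adam, amsgrad=True))
learn.fit(lr=0.01)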


Hi,

I need to use Adam, but this line fails with NameError: name 'ConvLearner' is not defined:
learn = ConvLearner.pretrained(arch, data, precompute=False, opt_fn=optim.Adam)
lr = 0.01
wd = 0.025
learn.fit(lrs=[lr/100, lr/10, lr], n_cycle=3, wds=[wd/100, wd/10, wd], use_wd_sched=True, cycle_len=1, cycle_mult=2)

How can I just replace SGD with AdamW?
Thanks