Changing the Learner's optimizer function from SGD to Adam

I am trying to figure out how to change the optimizer of the model. In Keras we have model.compile(optimizer='Adam').
How do we do this in fastai?

Please help.

You can pass in your optimizer when you build the Learner using opt_func=optimizer.

Example:
learn = create_cnn(data, models.resnet50, metrics=[accuracy], opt_func=optim.SGD)

Edit: I also believe the default optimizer in fastai is already Adam.
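
If you also want to pass hyperparameters (momentum, weight decay, etc.) to the optimizer, the usual Python trick is functools.partial, which pre-binds keyword arguments so opt_func still receives a plain callable. Here's a minimal sketch of the pattern; fake_sgd is a stand-in I made up so the example runs without PyTorch installed, but the same partial(...) would wrap optim.SGD:

```python
from functools import partial

# Stand-in for optim.SGD: just records the hyperparameters it was
# constructed with, so we can see what opt_func would receive.
def fake_sgd(params, lr=0.1, momentum=0.0, weight_decay=0.0):
    return {"lr": lr, "momentum": momentum, "weight_decay": weight_decay}

# partial pre-binds momentum and weight_decay; fastai would later call
# opt_func(params, lr=...) itself when building the Learner.
opt_func = partial(fake_sgd, momentum=0.9, weight_decay=1e-4)
opt = opt_func([], lr=0.01)
print(opt)  # momentum and weight decay are baked in
```

With real PyTorch this would be opt_func=partial(optim.SGD, momentum=0.9), passed to create_cnn exactly like the plain optim.SGD above.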


Thank you


Hi, would you know if there's also an easy way to set the parameters and scheduler of the optimizer? Say I want to set the momentum and weight decay of SGD, as well as reduce the learning rate every N epochs?
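
For reference, here is what those three knobs actually compute, sketched in plain Python rather than the fastai API (function names here are illustrative, not library calls). Weight decay adds an L2 penalty to the gradient, momentum accumulates past gradients into a velocity term, and a step schedule scales the learning rate down every N epochs (like PyTorch's StepLR):

```python
# Plain-Python sketch of one SGD update with momentum and weight decay.
def sgd_step(w, grad, velocity, lr, momentum=0.9, weight_decay=1e-4):
    grad = grad + weight_decay * w         # L2 penalty folded into the gradient
    velocity = momentum * velocity + grad  # momentum accumulates past gradients
    return w - lr * velocity, velocity

# Step schedule: multiply the base LR by gamma every `step_size` epochs.
def step_lr(base_lr, epoch, step_size=2, gamma=0.1):
    return base_lr * gamma ** (epoch // step_size)

w, v = 1.0, 0.0
for epoch in range(4):
    lr = step_lr(0.1, epoch)
    w, v = sgd_step(w, grad=0.5, velocity=v, lr=lr)
```

In fastai itself, my understanding is that weight decay can be passed to the fit methods (wd=...) and momentum bound into opt_func via functools.partial, but check the current docs for the exact signatures.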