What optimizers should I use when training GANs?

Is there some standard approach on choosing optimizers when training GANs?

For example, I’ve read that you shouldn’t use Adam when training a generator, but rather use RMSProp or SGD.

I’m not sure about the critic.

However, I found that Adam works fine. It also seems that the fastai example code uses Adam for both the generator and the critic, in both the pretraining and the actual training.
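Concretely, this is roughly what I have been running (a minimal PyTorch sketch; the models and learning rate are placeholders, and the betas are the values from the DCGAN paper, not fastai's defaults):

```python
import torch

# Placeholder models -- stand-ins for the actual generator/critic.
generator = torch.nn.Linear(100, 784)
critic = torch.nn.Linear(784, 1)

# Adam for both, with the DCGAN-style betas (beta1=0.5, beta2=0.999).
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_c = torch.optim.Adam(critic.parameters(), lr=2e-4, betas=(0.5, 0.999))
```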

What should I use for the generator and the critic then?

This is one of those experiment-and-find-out situations. If Adam works for you, I see no reason not to use it.

If they are suggesting RMSprop rather than Adam, I would guess what they are really suggesting is that momentum should be kept low. Try Adam with more momentum and see whether you get worse results.
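To make that concrete (a sketch in plain PyTorch; the model and learning rates are placeholders): Adam's `beta1` is its momentum-like term, so turning it down makes Adam behave much like RMSprop, and turning it up is the "more momentum" experiment.

```python
import torch

model = torch.nn.Linear(100, 784)  # placeholder generator

# RMSprop: running average of squared gradients, no momentum by default.
opt_rmsprop = torch.optim.RMSprop(model.parameters(), lr=1e-4)

# Adam with beta1=0: essentially RMSprop plus bias correction.
opt_low_mom = torch.optim.Adam(model.parameters(), lr=1e-4, betas=(0.0, 0.999))

# "More momentum": the usual default beta1=0.9.
opt_high_mom = torch.optim.Adam(model.parameters(), lr=1e-4, betas=(0.9, 0.999))
```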

I see you are in the Part 1 section; optimizers will be covered in more detail in Part 2.

Okay, that sounds good, thanks!

I was still wondering, though, whether there are some standard values that just work well for most cases.

This is exactly what fastai is meant to provide: a lot of other deep learning frameworks set these values to 0 by default, whereas fastai ships with values that were experimentally determined to work well for most cases. There is still room to change them to get better results, though.
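To illustrate the "set to 0" point (plain PyTorch, just as a comparison; the 0.9 is a commonly used value, not anything GAN-specific):

```python
import torch

model = torch.nn.Linear(10, 2)  # placeholder model

# PyTorch's SGD leaves momentum at 0 unless you set it yourself...
opt_plain = torch.optim.SGD(model.parameters(), lr=0.1)
# ...whereas a tuned default would be something like momentum=0.9.
opt_mom = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
```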

Your particular case of GANs is specific, and at that point you want to start chasing down people who are familiar with the topic.

There may be others who can help with this, though not many people have had the time and experience to really answer it; people in Part 2 or experienced fastai users might. I am currently working on GANs as well, but I am still experimenting and reading papers. For example, I am curious whether we can get some of the benefit of learning rate schedulers in GANs, along the lines of the sketch below.
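Roughly what I mean (a sketch only, not a recommendation; the models, learning rates, and epoch count are placeholders, and the actual alternating GAN updates are elided):

```python
import torch

generator = torch.nn.Linear(100, 784)  # placeholder models
critic = torch.nn.Linear(784, 1)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_c = torch.optim.Adam(critic.parameters(), lr=2e-4, betas=(0.5, 0.999))

# One cosine schedule per optimizer, stepped once per epoch.
n_epochs = 50
sched_g = torch.optim.lr_scheduler.CosineAnnealingLR(opt_g, T_max=n_epochs)
sched_c = torch.optim.lr_scheduler.CosineAnnealingLR(opt_c, T_max=n_epochs)

for epoch in range(n_epochs):
    # ... the usual alternating generator/critic updates go here ...
    sched_g.step()
    sched_c.step()
```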