GAN Learning Rates with Adam

I’ve been playing around with GANs and I noticed something weird in the optimizer parameters. I wanted to ask if this behavior is normal.

I create a GAN learner and train it for one epoch with one-cycle and a learning rate of 2e-4. After that epoch, the optimizer parameters show the following:

[optimizer state after 1 epoch]

So the learning rate has gone down to 2.3e-5. I assumed this was just the typical one-cycle learning rate decay.
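To make it concrete, here's roughly the behaviour I mean, sketched with plain PyTorch's Adam and OneCycleLR rather than fastai's internals (the model, the batch count, and the schedule defaults below are placeholders, not my actual setup):

```python
import torch

# Plain-PyTorch stand-in for the run above (not fastai's fit_one_cycle, and the
# schedule defaults here are PyTorch's, which may differ from fastai's):
# one "epoch" of Adam with a one-cycle schedule at max_lr=2e-4.
model = torch.nn.Linear(10, 1)
opt = torch.optim.Adam(model.parameters(), lr=2e-4)
steps_per_epoch = 100  # hypothetical number of batches per epoch
sched = torch.optim.lr_scheduler.OneCycleLR(
    opt, max_lr=2e-4, epochs=1, steps_per_epoch=steps_per_epoch)

for _ in range(steps_per_epoch):
    opt.step()    # dummy step, no real training happening here
    sched.step()

# The lr left sitting in the optimizer after the cycle is far below 2e-4.
print(opt.param_groups[0]["lr"])
```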

Then I train for 20 epochs (again with lr=slice(2e-4) and one-cycle). Looking at the optimizer again:

[optimizer state after 20 epochs]

The learning rate has gone down to 7e-10. That’s extremely low. Is this expected behavior?
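For reference, here's the same sketch stretched to 20 epochs, again using PyTorch's OneCycleLR as a stand-in for whatever fastai does under the hood; the peak and final values it reports come from PyTorch's default div_factor=25 and final_div_factor=1e4, which may not match fastai's defaults:

```python
import torch

# Sketch of the 20-epoch case with plain PyTorch's OneCycleLR as a stand-in
# for fastai's schedule (div_factor / final_div_factor defaults are PyTorch's
# assumptions, not necessarily fastai's).
model = torch.nn.Linear(10, 1)
opt = torch.optim.Adam(model.parameters(), lr=2e-4)
epochs, steps_per_epoch = 20, 100   # hypothetical batch count
sched = torch.optim.lr_scheduler.OneCycleLR(
    opt, max_lr=2e-4, epochs=epochs, steps_per_epoch=steps_per_epoch)

lrs = []
for _ in range(epochs * steps_per_epoch):
    opt.step()                               # dummy step, no real training
    lrs.append(opt.param_groups[0]["lr"])    # lr actually used at this step
    sched.step()

print(f"peak lr:  {max(lrs):.1e}")   # ~2e-4, the requested maximum
print(f"final lr: {lrs[-1]:.1e}")    # max_lr / div_factor / final_div_factor
                                     # = 2e-4 / 25 / 1e4 = 8e-10 with defaults
```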