Pytorch: Passing frozen parameters to optimizer?

I noticed that the fastai library automatically excludes all frozen parameters from the optimizer. In core.py:

def trainable_params_(m):
    return [p for p in m.parameters() if p.requires_grad]

I’m trying to learn more about how PyTorch works. Can someone explain why you would ever want to pass frozen parameters to the optimizer? If you did pass frozen parameters to the optimizer, what would happen? Is this just a tedious quirk in the design of the PyTorch API, or is there a scenario in which you would want to set `requires_grad = False` but still include the frozen parameters in the optimizer?

Thanks for your help.

It spits out an exception if you pass frozen params. Not too handy!
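To see this for yourself, here is a minimal sketch (the model and learning rate are just placeholders, and the exact behavior depends on your PyTorch version; the versions this thread refers to raise the ValueError quoted further down):

import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 2)

# freeze every parameter in the model
for p in model.parameters():
    p.requires_grad = False

# handing the frozen parameters straight to the optimizer
# raises ValueError in the PyTorch versions discussed in this thread
try:
    optim.SGD(model.parameters(), lr=0.01)
except ValueError as e:
    print(e)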


This is really helpful. Thanks!

For others who might find this in a search, the error looks like this:

ValueError: optimizing a parameter that doesn't require gradients
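If you run into this error, the fix is the same filtering fastai does in trainable_params_: only hand the optimizer the parameters that still require gradients. A minimal sketch (the model and hyperparameters are placeholders):

import torch.nn as nn
import torch.optim as optim

model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 2))

# freeze the first layer, e.g. when fine-tuning only the later layers
for p in model[0].parameters():
    p.requires_grad = False

# pass only the trainable parameters, like fastai's trainable_params_
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = optim.SGD(trainable, lr=0.01)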
