I noticed that fastai automatically excludes all frozen parameters from the optimizer, in

```python
return [p for p in m.parameters() if p.requires_grad]
```
I'm trying to learn more about how PyTorch works. Can someone explain why someone would want to pass frozen parameters into the optimizer? If you did pass frozen parameters into the optimizer, what would happen? Is this some kind of tedious oversight in the design of the PyTorch API, or is there a scenario in which you would want to set `requires_grad = False` but still include the frozen parameters in the optimizer?
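For context, here is a small sketch of the "what would happen" case (the two-layer model here is just a made-up example, and I'm assuming plain SGD): since `backward()` never populates `.grad` for frozen parameters, the optimizer's `step()` simply skips any parameter whose `.grad` is `None`, so including them appears to be harmless, just redundant.

```python
import torch

# Toy model: freeze the first layer, then pass ALL parameters
# (frozen ones included) to the optimizer anyway.
model = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.Linear(4, 1))
for p in model[0].parameters():
    p.requires_grad = False  # freeze first layer

opt = torch.optim.SGD(model.parameters(), lr=0.1)  # frozen params included

x = torch.randn(8, 4)
loss = model(x).pow(2).mean()
loss.backward()  # frozen params get no .grad (stays None)

frozen_before = model[0].weight.clone()
opt.step()

# SGD skips parameters with .grad is None, so the frozen layer is untouched.
print(torch.equal(frozen_before, model[0].weight))  # True
```

One scenario I can think of where you'd keep frozen parameters in the optimizer: gradual unfreezing, where you flip `requires_grad` back to `True` mid-training without having to rebuild the optimizer (at the cost of carrying stale or empty optimizer state for those parameters in the meantime).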
Thanks for your help.