bn_unfreeze(True)

It looks like bn stands for batch normalization (a concept I believe we’re going to cover in the next lecture), and Jeremy talks about bn_freeze in this post: [Adv] Significant changes to fastai just pushed

From the post, Jeremy says:

I discovered that inceptionresnet-v2 and inception-v4 were not training well on dogs v cats after unfreezing. I think I’ve tracked it down to an interesting issue with batchnorm. Basically, updating the batchnorm moving statistics causes these models to fall apart pretty badly. So I’ve added a new learn.bn_freeze(True) method to freeze all bn statistics. This should only be called with precompute=False, and after training the fully connected layers for at least one epoch. I’d be interested to hear if people find this new option helps any models they’ve been fine-tuning.

Finally, I’ve changed the meaning of the parameter to freeze_to() so it now refers to the index of a layer group, not of a layer. I think this is more convenient and less to learn for students, since we use layer groups when we set learning rates, so I think this method should be consistent with that.
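If I’m reading that right, the fine-tuning workflow would look roughly like the sketch below. This is just my own reconstruction from the quote, assuming the current fastai (0.7) ConvLearner API; the path, image size, learning rates, and choice of resnet34 are placeholders (Jeremy was specifically talking about inceptionresnet-v2 and inception-v4):

```python
from fastai.conv_learner import *

PATH = 'data/dogscats/'   # placeholder data path
sz = 224
arch = resnet34           # placeholder; Jeremy's issue was with the inception models

data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz))
learn = ConvLearner.pretrained(arch, data, precompute=True)

# 1. Train the new fully connected layers first (on precomputed activations).
learn.fit(1e-2, 1)

# 2. bn_freeze should only be used with precompute=False, after training the
#    fully connected layers for at least one epoch.
learn.precompute = False
learn.fit(1e-2, 1, cycle_len=1)

# 3. Freeze the batchnorm moving statistics, then unfreeze and fine-tune the
#    earlier layers with differential learning rates (one per layer group).
learn.bn_freeze(True)
learn.unfreeze()
lrs = np.array([1e-4, 1e-3, 1e-2])
learn.fit(lrs, 1, cycle_len=1)

# Per the second paragraph of the quote, freeze_to now takes a *layer group*
# index, so freeze_to(1) would keep the first layer group frozen and train the
# remaining groups, instead of unfreezing everything:
# learn.freeze_to(1)
```

At least, that’s how I read the ordering constraints from the quote.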

It’s also discussed here: Freezing batch norm, but that discussion seems more advanced and is hard for me to follow.

I don’t see how we’re able to tell which layers are the batch norm layers, but perhaps bn_freeze(False) only unfreezes the batch norm layers, and that’s the difference between it and unfreeze().
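On the question of how the library can tell which layers are batch norm layers: in plain PyTorch you can walk the module tree and check each layer’s type, so I’d guess bn_freeze does something along these lines internally. This is only my own sketch (the helper name freeze_bn_stats is mine, not fastai’s):

```python
import torch.nn as nn

def freeze_bn_stats(model):
    """Stop the batchnorm running mean/variance from updating.

    Putting a BatchNorm module into eval() mode makes it use its stored
    statistics instead of updating them from each batch; its weight and
    bias can still be trained if their requires_grad is True.
    """
    for module in model.modules():
        if isinstance(module, (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)):
            module.eval()

# e.g. freeze_bn_stats(learn.model) after unfreezing
```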
