Freezing batchnorm in v1

In fastai 0.7, this is how you would unfreeze a pretrained network and then freeze batchnorm:

learner.unfreeze()        # make all layer groups trainable
learner.bn_freeze(True)   # but keep the batchnorm layers frozen

How do we freeze the batchnorm weights in fastai 1.0? The following seems to work, but I’m still fuzzy on the whole concept of batchnorm, so I’m hoping someone can confirm:

learner.train_bn = False  # don't keep batchnorm layers trainable when layer groups are frozen
learner.unfreeze()        # make all layer groups trainable
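
One way to sanity-check what actually got frozen (a quick sketch, assuming learner.model is the underlying PyTorch module, as it is in fastai v1) is to inspect requires_grad on the batchnorm parameters:

import torch.nn as nn

# print the trainability of every batchnorm layer's parameters
bn_types = (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)
for module in learner.model.modules():
    if isinstance(module, bn_types):
        for p in module.parameters():
            print(type(module).__name__, p.requires_grad)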

Thanks!

See the BnFreeze callback: https://docs.fast.ai/train.html#BnFreeze
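
For reference, a minimal sketch of wiring that callback in (assuming a reasonably recent fastai v1 where cnn_learner is available and an existing DataBunch named data; resnet34 is just an illustrative choice). My understanding is that train_bn=False stops the batchnorm weights from training in frozen groups, while BnFreeze additionally keeps the non-trainable batchnorm layers in eval mode so their running statistics stop updating:

from fastai.vision import *   # brings in cnn_learner, models, and BnFreeze

learn = cnn_learner(data, models.resnet34,
                    train_bn=False,           # batchnorm params stay frozen with frozen groups
                    callback_fns=[BnFreeze])  # and their running stats stop updating too
learn.fit_one_cycle(1)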
