Finetuning ResNet BatchNorm

Hi Jeremy and everyone, thank you for the great course.

From the fastai source code:

```python
def freeze_to(self, n:int)->None:
    "Freeze layers up to layer group `n`."
    for g in self.layer_groups[:n]:
        for l in g:
            if not self.train_bn or not isinstance(l, bn_types): requires_grad(l, False)
    for g in self.layer_groups[n:]: requires_grad(g, True)
    self.create_opt(defaults.lr)
```

All batchnorm layers stay unfrozen (trainable) during fine-tuning. Could you explain why?

And what exactly is `self.train_bn`?
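To make sure I am reading the code correctly, here is a small standalone sketch of what I think happens to a frozen layer group, written in plain PyTorch with a torchvision ResNet (not fastai itself; `freeze_except_bn` is just my own helper name):

```python
import torch.nn as nn
from torchvision import models

# The batchnorm layer types I assume fastai's bn_types refers to.
bn_types = (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d)

def freeze_except_bn(module: nn.Module, train_bn: bool = True):
    "Freeze all parameters in `module`, but keep batchnorm layers trainable when train_bn is True."
    for layer in module.modules():
        keep = train_bn and isinstance(layer, bn_types)
        for p in layer.parameters(recurse=False):
            p.requires_grad = keep

body = models.resnet18()          # an example backbone, not the fastai model
freeze_except_bn(body, train_bn=True)

# Only the BatchNorm weights and biases should still require gradients.
trainable = [name for name, p in body.named_parameters() if p.requires_grad]
print(len(trainable), trainable[:4])
```

With `train_bn=True` only the BatchNorm weights and biases stay trainable here, which is what I think `freeze_to` does for the frozen layer groups.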

Is it also a good idea to do this with other convolutional bases? I plan to fine-tune EfficientNetB3 in Keras in a similar way (rough sketch below). Thank you!
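In case it helps clarify the question, this is roughly what I have in mind on the Keras side. EfficientNetB3 comes from `tf.keras.applications`; `NUM_CLASSES` and the head layers are just placeholders for my own setup:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10  # placeholder for my dataset

base = tf.keras.applications.EfficientNetB3(
    include_top=False, weights="imagenet",
    input_shape=(300, 300, 3), pooling="avg")

# Freeze every layer in the base EXCEPT BatchNormalization layers,
# mimicking (as I understand it) fastai's train_bn=True behaviour.
for layer in base.layers:
    layer.trainable = isinstance(layer, layers.BatchNormalization)

model = models.Sequential([
    base,
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Is keeping the BatchNormalization layers trainable like this a sensible translation of the fastai behaviour, or would you handle it differently in Keras?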