UNet batchnorm_2d init

The batch normalization layers used by fastai's DynamicUnet are initialized with the following function:

import torch
import torch.nn as nn
from fastai.layers import NormType  # enum fastai uses to select the init scheme

def batchnorm_2d(nf:int, norm_type:NormType=NormType.Batch):
    "A batchnorm2d layer with `nf` features initialized depending on `norm_type`."
    bn = nn.BatchNorm2d(nf)
    with torch.no_grad():
        bn.bias.fill_(1e-3)  # bias starts at a small non-zero value rather than 0
        bn.weight.fill_(0. if norm_type==NormType.BatchZero else 1.)  # weight zeroed only for BatchZero
    return bn
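
Just to make the effect concrete, here is a quick sketch of what the function produces (assuming NormType can be imported from fastai.layers as above):

bn = batchnorm_2d(8)
print(bn.bias)    # every entry is 0.0010, i.e. the 1e-3 in question
print(bn.weight)  # every entry is 1.0 for the default NormType.Batch

bn_zero = batchnorm_2d(8, norm_type=NormType.BatchZero)
print(bn_zero.weight)  # every entry is 0.0 for NormType.BatchZero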

Where does the 1e-3 for the bias come from?

Thanks!
