ducha-aiki
(Dmytro Mishkin)
1
I feel super stupid and I must be missing something super simple.
However, here is my code (all of it):
from fastai.vision import *

mnist = untar_data(URLs.MNIST_TINY)
tfms = get_transforms(do_flip=False)
data = (ImageList.from_folder(mnist)
        .split_by_folder()
        .label_from_folder()
        .transform(tfms, size=32)
        .databunch()
        .normalize(imagenet_stats))
learn = cnn_learner(data, models.resnet18, metrics=accuracy,
                    callback_fns=BnFreeze)
print(len(learn.layer_groups))
So we have 3 layer groups, and I want to freeze all of them:
learn.freeze_to(4)
out = learn.validate()
print (out)
[1.2018166, tensor(0.0916)]
However, something is still changing:
learn.fit(1, lr=0)
out = learn.validate()
print (out)
[0.70418966, tensor(0.5408)]
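(What is changing here are the BatchNorm running statistics: in training mode they are updated from each batch's statistics, completely independently of the optimizer's learning rate. Below is a minimal pure-Python sketch of that mechanism; it is a simplified illustration, not fastai or PyTorch code.)

```python
# Simplified sketch of a single-feature batch-norm layer: in training
# mode the running statistics are folded in from every batch, so the
# model's validation-time behaviour changes even when lr=0.

class ToyBatchNorm:
    """Minimal 1-feature batch norm tracking running statistics."""

    def __init__(self, momentum=0.1):
        self.momentum = momentum
        self.running_mean = 0.0
        self.running_var = 1.0
        self.training = True

    def forward(self, batch):
        if self.training:
            # Batch statistics update the running estimates regardless
            # of any learning rate -- no optimizer is involved at all.
            mean = sum(batch) / len(batch)
            var = sum((x - mean) ** 2 for x in batch) / len(batch)
            m = self.momentum
            self.running_mean = (1 - m) * self.running_mean + m * mean
            self.running_var = (1 - m) * self.running_var + m * var
        else:
            mean, var = self.running_mean, self.running_var
        return [(x - mean) / (var + 1e-5) ** 0.5 for x in batch]

bn = ToyBatchNorm()
before = bn.running_mean
bn.forward([5.0, 7.0, 9.0])   # one "training" batch; lr plays no role
after = bn.running_mean
print(before, after)          # the running mean has moved away from 0.0
```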
ducha-aiki
(Dmytro Mishkin)
2
Ok, found the solution. BnFreeze only freezes the running statistics (moving averages) of the BatchNorm layers.
To freeze them completely, one also needs to do
learn.train_bn = False
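(In plain PyTorch terms, there are two separate things to freeze in a BatchNorm layer, and the two settings above address one each. A hedged sketch, using torch directly rather than fastai internals:)

```python
import torch
from torch import nn

# A BatchNorm layer has two independently trainable pieces:
#   1. running statistics (running_mean / running_var) -- frozen by
#      keeping the layer in eval() mode, which is what BnFreeze does;
#   2. learnable affine parameters (weight, bias) -- frozen by turning
#      off their gradients, which is what learn.train_bn = False achieves.

bn = nn.BatchNorm1d(4)
bn.eval()                      # (1) running stats no longer update
for p in bn.parameters():      # (2) weight and bias get no gradients
    p.requires_grad_(False)

x = torch.randn(16, 4)
before = bn.running_mean.clone()
_ = bn(x)
assert torch.equal(bn.running_mean, before)  # stats untouched in eval mode
```

Only with both pieces frozen does learn.validate() return the same numbers before and after a zero-lr fit.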