I have just encountered a peculiar bug that propagates from `dataloader.transform()` to `learner.export()`:
- It seems that when calling `dataloader.transform()`, setting `size` to certain values (e.g. `size=129`) causes `learner.export()` to fail with the error: `AttributeError: Can't pickle local object 'DynamicUnet.__init__.<locals>.<lambda>'`
- However, if I set `size=128`, the error goes away and the export succeeds. So why does this `size` parameter have anything to do with the `lambda` in the `DynamicUnet` model? This doesn't make sense to me.
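For context on the error message itself (this part is generic Python, not fastai-specific): `pickle` serializes a function by its qualified name, so any `lambda` defined inside another function is a "local object" with no importable name and cannot be pickled. A minimal sketch (the `make_layer` name is just for illustration):

```python
import pickle

def make_layer():
    # pickle stores functions by qualified name; a lambda defined inside
    # another function has no importable name, so pickling it fails
    return lambda x: x * 2

try:
    pickle.dumps(make_layer())
except (AttributeError, pickle.PicklingError) as e:
    print(e)  # e.g. Can't pickle local object 'make_layer.<locals>.<lambda>'
```

This is exactly the shape of the traceback above: the offending object is a lambda created inside `DynamicUnet.__init__`.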
Below is a minimal example that reproduces the bug:
from fastai.vision import *
camvid = untar_data(URLs.CAMVID_TINY)
path_lbl = camvid/'labels'
path_img = camvid/'images'
get_y_fn = lambda x: path_lbl/f'{x.stem}_P{x.suffix}'
codes = np.loadtxt(camvid/'codes.txt', dtype=str)
data = (SegmentationItemList.from_folder(path_img)
        .split_by_rand_pct()
        .label_from_func(get_y_fn, classes=codes)
        .transform(get_transforms(), tfm_y=True, size=129)  # <= odd number will cause error
        .databunch()
        .normalize(imagenet_stats))
learn = unet_learner(data, models.resnet18)
learn.export()
Corresponding Google Colab notebook:
https://colab.research.google.com/drive/1qGx4joQxSzDvv5Jq0ut05DrnmGLG_H0R
I suspect it is related to PyTorch's pickle behavior and (maybe) the resampling that happens during `dataloader.transform`, but I'm curious whether someone can point out the exact reason this happens.
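One hunch about the `size` sensitivity (the 5-stage count below is my assumption for a resnet18 encoder, i.e. a total downsampling factor of 32): each stride-2 stage floor-divides the spatial size, and the decoder then doubles it back up. If the input size isn't a multiple of 32, the round trip lands on a different size, and the model presumably has to append a resize step built from a `lambda` to restore the original resolution, which is what later refuses to pickle. A rough sketch of the arithmetic:

```python
def roundtrip(size, n_stages=5):
    """Floor-divide by 2 per downsampling stage, then double back up."""
    s = size
    for _ in range(n_stages):
        s //= 2  # stride-2 conv/pool: 129 -> 64 -> 32 -> 16 -> 8 -> 4
    for _ in range(n_stages):
        s *= 2   # decoder upsampling doubles each stage: 4 -> ... -> 128
    return s

print(roundtrip(128))  # 128: matches the input, no extra resize needed
print(roundtrip(129))  # 128: mismatch, so a resize (lambda?) layer is added
```

That would explain why `size=128` exports cleanly while `size=129` does not, but I'd appreciate confirmation from someone who knows the `DynamicUnet` internals.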