In DynamicUnet, Learner.export() fails if dataloader.transform() is given an odd size (e.g. size=129)

I have just encountered a peculiar bug that propagates from dataloader.transform() to learner.export():

  • It seems that when calling dataloader.transform(), setting size to certain values (e.g. size=129) causes learner.export() to fail with the error message:
    AttributeError: Can't pickle local object 'DynamicUnet.__init__.<locals>.<lambda>'

  • However, if I set size=128 instead, the error goes away and the export succeeds. Why does this size parameter have anything to do with the lambda in the DynamicUnet model? This doesn't make sense to me.
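For what it's worth, the error message itself is standard Python pickle behavior, independent of fastai: any function or lambda defined inside another function (a "local object") cannot be pickled. A minimal sketch (the `make_model`/`resize` names are made up, standing in for whatever closure DynamicUnet creates internally):

```python
import pickle

def make_model():
    # A lambda defined inside a function is a "local object";
    # pickle cannot serialize it by qualified name.
    resize = lambda x: x  # stand-in for an internal interpolation hook
    return resize

fn = make_model()
try:
    pickle.dumps(fn)
except AttributeError as e:
    # e.g. "Can't pickle local object 'make_model.<locals>.<lambda>'"
    print(type(e).__name__)
```

So the real question is why a particular size makes DynamicUnet attach such a lambda to the model in the first place.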

Below is a simple snippet to reproduce the bug:

from fastai.vision import *
camvid = untar_data(URLs.CAMVID_TINY)
path_lbl = camvid/'labels'
path_img = camvid/'images'

get_y_fn = lambda x: path_lbl/f'{x.stem}_P{x.suffix}'
codes = np.loadtxt(camvid/'codes.txt', dtype=str)

data = (SegmentationItemList.from_folder(path_img)
        .split_by_rand_pct()
        .label_from_func(get_y_fn, classes=codes)
        .transform(get_transforms(), tfm_y=True, size=129) #<= odd number will cause error
        .databunch()
        .normalize(imagenet_stats))
learn = unet_learner(data, models.resnet18)
learn.export()

Corresponding Google Colab notebook:
https://colab.research.google.com/drive/1qGx4joQxSzDvv5Jq0ut05DrnmGLG_H0R

I suspect it is related to PyTorch's pickle behavior and (maybe) the resampling step inside dataloader.transform, but I'm curious whether someone can point out the exact reason why this happens.


I have pinpointed the source of this bug! Apparently, the size passed to dataloader.transform() has to be an even number (e.g. 28, 128, etc.).

Otherwise, if you set dataloader.transform(size=129), or 29, or any other odd number, the error `AttributeError: Can't pickle local object 'DynamicUnet.__init__.<locals>.<lambda>'` will appear. (It took me quite a while to pinpoint the source of this error.)
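In the meantime, a simple defensive workaround is to snap any requested size to a safe value before passing it to transform(). The `snap_size` helper below is hypothetical (not part of fastai), and it rounds up to a multiple of 32 rather than just an even number, since DynamicUnet's encoder downsamples repeatedly:

```python
def snap_size(size, multiple=32):
    # Hypothetical helper: round a requested image size up to the
    # nearest multiple (default 32), so the U-Net's repeated halving
    # and doubling reproduces the input size exactly.
    return ((size + multiple - 1) // multiple) * multiple

print(snap_size(129))  # 160
print(snap_size(128))  # 128
```

Then use `.transform(get_transforms(), tfm_y=True, size=snap_size(129))` in the data block.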



DynamicUnet should have input sizes that are a multiple of 32.
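One plausible way to see why (a sketch, not fastai's actual code): a resnet18 encoder halves the spatial dimensions five times, and 2**5 = 32. A multiple of 32 halves cleanly at every stage, so the decoder's doublings land back exactly on the input size; any other size gets floor-divided somewhere, the reconstructed size no longer matches, and DynamicUnet patches the mismatch with a final resize, implemented as a local lambda that then breaks pickling in export().

```python
def stage_sizes(size, n_stages=5):
    # Trace the spatial size through n_stages stride-2 stages,
    # mimicking the floor division of strided conv/pooling layers.
    sizes = [size]
    for _ in range(n_stages):
        size = size // 2
        sizes.append(size)
    return sizes

print(stage_sizes(128))  # [128, 64, 32, 16, 8, 4]; 4 * 2**5 == 128, no resize needed
print(stage_sizes(129))  # [129, 64, 32, 16, 8, 4]; 4 * 2**5 == 128 != 129, resize needed
```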


Thank you @digitalspecialists for the clarification! Right, yes, that makes perfect sense.
