# Imports this snippet needs (`_default_meta` and `_add_norm` are private fastai helpers)
from fastai.vision.all import *
from fastai.vision.learner import _default_meta, _add_norm

def custom_unet(dls, arch, loss_func=None, pretrained=True, cut=None, splitter=None, config=None, n_in=3, n_out=None,
                normalize=False, **kwargs):
    "Build a unet learner from `dls` and `arch`"
    if config is None: config = unet_config()
    meta = model_meta.get(arch, _default_meta)
    body = create_body(arch, n_in, pretrained, ifnone(cut, meta['cut']))
    size = dls.one_batch()[0].shape[-2:]
    if n_out is None: n_out = get_c(dls)
    assert n_out, "`n_out` is not defined, and could not be inferred from data, set `dls.c` or pass `n_out`"
    if normalize: _add_norm(dls, meta, pretrained)
    model = CustomUnet(body, n_out, size, **config)  # HERE
    learn = Learner(dls, model, loss_func=loss_func, splitter=ifnone(splitter, meta['split']), **kwargs)
    if pretrained: learn.freeze()
    return learn
Somewhere in your custom UNet (probably in the U-Net block), a BatchNorm layer is not receiving input in the shape it expects: it was initialized as `nn.BatchNorm2d(198)` but is now receiving a tensor of shape `B x 396 x H x W`. Since 396 is exactly double 198, it could be the BatchNorm that normalizes the hooked skip connection in the UNet block being handed the concatenated features instead, but that is just a wild guess.
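If you want to pin down exactly which layer it is, here is a quick sketch (not from the thread, just plain PyTorch) that registers a forward pre-hook on every `BatchNorm2d` and prints any whose incoming channel count doesn't match its `num_features`:

```python
import torch.nn as nn

def report_batchnorm_mismatches(model):
    "Print every BatchNorm2d whose incoming channels differ from its `num_features`."
    handles = []
    def make_hook(name, bn):
        def hook(module, inputs):
            x = inputs[0]
            if x.ndim == 4 and x.shape[1] != bn.num_features:
                print(f"{name}: built for {bn.num_features} channels, "
                      f"got {x.shape[1]} (input shape {tuple(x.shape)})")
        return hook
    for name, m in model.named_modules():
        if isinstance(m, nn.BatchNorm2d):
            handles.append(m.register_forward_pre_hook(make_hook(name, m)))
    return handles

# Usage sketch: attach the hooks, run one batch, then clean up.
# handles = report_batchnorm_mismatches(learn.model)
# xb, yb = dls.one_batch()
# try: learn.model(xb)
# except RuntimeError: pass   # the mismatch will already have been printed
# finally:
#     for h in handles: h.remove()
```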
So basically `unet_config` got removed from fastai, and I'll need to think about how to adjust that code (as I presume this is from Walk with fastai?).
In the meantime we can do:
from fastcore.meta import delegates   # `delegates` lives in fastcore
from fastai.vision import models      # gives access to models.unet.DynamicUnet

@delegates(models.unet.DynamicUnet.__init__)
def unet_config(**kwargs):
    "Convenience function to easily create a config for `DynamicUnet`"
    return kwargs
So you get a unet_config again.
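The `@delegates` decorator just mirrors `DynamicUnet.__init__`'s keyword arguments in `unet_config`'s signature (handy for tab completion and docs); at runtime it simply returns whatever you pass as a dict. For example:

```python
config = unet_config(blur=True, self_attention=True)
# config == {'blur': True, 'self_attention': True}, ready to be passed as
# `config=...` to `custom_unet`, which unpacks it into CustomUnet(...)
```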
The head should be fixed so long as that's the case.
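For completeness, here is a rough end-to-end sketch (not from the original post): it uses CAMVID_TINY for the `DataLoaders`, stands in fastai's `DynamicUnet` for `CustomUnet` purely so the example runs, and assumes a fastai version from around when `unet_config` was removed, so the `create_body` call above still accepts an architecture function:

```python
from fastai.vision.all import *

CustomUnet = DynamicUnet   # stand-in; swap in your own CustomUnet class here

path = untar_data(URLs.CAMVID_TINY)
dls = SegmentationDataLoaders.from_label_func(
    path, bs=4, fnames=get_image_files(path/'images'),
    label_func=lambda o: path/'labels'/f'{o.stem}_P{o.suffix}',
    codes=np.loadtxt(path/'codes.txt', dtype=str))

learn = custom_unet(dls, resnet34, config=unet_config(blur=True, self_attention=True))
learn.summary()   # quick sanity check of the assembled architecture
```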