Unet Learner for keypoint regression

# assuming fastai v2; the private helpers below live in fastai.vision.learner
from fastai.vision.all import *
from fastai.vision.learner import _default_meta, _add_norm

def custom_unet(dls, arch, loss_func=None, pretrained=True, cut=None, splitter=None, config=None, n_in=3, n_out=None,
                normalize=False, **kwargs):
    "Build a unet learner from `dls` and `arch`"
    if config is None: config = unet_config()
    meta = model_meta.get(arch, _default_meta)
    body = create_body(arch, n_in, pretrained, ifnone(cut, meta['cut']))
    size = dls.one_batch()[0].shape[-2:]
    if n_out is None: n_out = get_c(dls)
    assert n_out, "`n_out` is not defined, and could not be inferred from data, set `dls.c` or pass `n_out`"
    if normalize: _add_norm(dls, meta, pretrained)
    model = CustomUnet(body, n_out, size, **config)  # HERE: CustomUnet is my custom model, defined elsewhere
    learn = Learner(dls, model, loss_func=loss_func, splitter=ifnone(splitter, meta['split']), **kwargs)
    if pretrained: learn.freeze()
    return learn

learn = custom_unet(dls, resnet34, loss_func=MSELossFlat())
learn.summary()

First, unet_config() was not defined, so I defined it manually; now I am getting this error:

RuntimeError: running_mean should contain 198 elements not 396

Somewhere in your custom U-Net (probably in the U-Net block), a BatchNorm layer is receiving input of a shape it does not expect: it was initialized with nn.BatchNorm2d(198) but is now receiving a tensor of shape B x 396 x H x W. It could be the BatchNorm that normalizes the hooked activations in the UnetBlock, but that is just a wild guess.
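
As a minimal sketch of what that mismatch looks like in isolation (the 198/396 channel counts are taken from the error above; everything else is a hypothetical reproduction, not your actual model):

import torch
import torch.nn as nn

bn = nn.BatchNorm2d(198)        # layer initialized for 198 channels
x = torch.randn(2, 396, 8, 8)   # but it receives a B x 396 x H x W tensor
try:
    bn(x)
except RuntimeError as e:
    print(e)  # running_mean should contain 198 elements not 396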

So basically unet_config got deleted; I’ll need to think about how to adjust that code (I presume this is from Walk with fastai?)

In the meantime we can do:

# delegates comes from fastcore.meta, models from fastai.vision (assuming fastai v2)
@delegates(models.unet.DynamicUnet.__init__)
def unet_config(**kwargs):
    "Convenience function to easily create a config for `DynamicUnet`"
    return kwargs

So you get a unet_config again.
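
For example, you could pass DynamicUnet options through it like this (blur and self_attention are real DynamicUnet keyword arguments; the call itself is just an illustrative sketch):

config = unet_config(blur=True, self_attention=True)
learn = custom_unet(dls, resnet34, loss_func=MSELossFlat(), config=config)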

The head should be fixed so long as that’s the case

Thanks, Mueller, I fixed this issue the same way.

But I am still struggling with this error in the Hybridizing models notebook:

RuntimeError: running_mean should contain 198 elements not 396

Thanks…

Pushed a commit, this should be fixed now :slight_smile:

Basically, in the custom unet there was this little line:

nf = num_features_model(nn.Sequential(layers[-1])) * 2

We need to remove that * 2, as fastai now does that magic for us.
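
So the fixed line simply becomes:

nf = num_features_model(nn.Sequential(layers[-1]))  # no * 2; fastai accounts for the doubled channels itself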

Thanks, Mueller, will check it.