Use a custom pre-trained classifier as the backbone for U-Net training

  1. I want to pre-train a classification model on a large dataset.
  2. Use that architecture as the encoder of a U-Net for semantic segmentation.
    The hope is that pre-training on a large dataset from a similar domain will give good results on the small dataset we have for semantic segmentation.

Are there any resources (blogs) that explain how to do the above? Any form of help is appreciated.

Code:

classif_learner = cnn_learner(dls, resnet34, metrics=error_rate)

# Use the above learner in the unet_learner
segment_learner = unet_learner(dls, classif_learner.model)

I get the error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-34-bbbc33e9f96d> in <module>
----> 1 segment_learner = unet_learner(dls, classif_learner.model)

~/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/vision/learner.py in unet_learner(dls, arch, normalize, n_out, pretrained, config, loss_func, opt_func, lr, splitter, cbs, metrics, path, model_dir, wd, wd_bn_bias, train_bn, moms, **kwargs)
217     img_size = dls.one_batch()[0].shape[-2:]
218     assert img_size, "image size could not be inferred from data"
--> 219     model = create_unet_model(arch, n_out, img_size, pretrained=pretrained, **kwargs)
220 
221     splitter=ifnone(splitter, meta['split'])

~/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/vision/learner.py in create_unet_model(arch, n_out, img_size, pretrained, cut, n_in, **kwargs)
192     "Create custom unet architecture"
193     meta = model_meta.get(arch, _default_meta)
--> 194     body = create_body(arch, n_in, pretrained, ifnone(cut, meta['cut']))
195     model = models.unet.DynamicUnet(body, n_out, img_size, **kwargs)
196     return model

~/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/vision/learner.py in create_body(arch, n_in, pretrained, cut)
 63 def create_body(arch, n_in=3, pretrained=True, cut=None):
 64     "Cut off the body of a typically pretrained `arch` as determined by `cut`"
---> 65     model = arch(pretrained=pretrained)
 66     _update_first_layer(model, n_in, pretrained)
 67     #cut = ifnone(cut, cnn_config(arch)['cut'])

~/anaconda3/envs/fastai/lib/python3.6/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
725             result = self._slow_forward(*input, **kwargs)
726         else:
--> 727             result = self.forward(*input, **kwargs)
728         for hook in itertools.chain(
729                 _global_forward_hooks.values(),

TypeError: forward() got an unexpected keyword argument 'pretrained'

Thank you

Hi and welcome!

The problem here is that the learner expects a function that returns a model, not the model itself. See here; someone had the same problem.

Unfortunately, I don’t know how you can then load your pretrained weights into this model, maybe somebody else can help :slight_smile:
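One common approach for that part (a sketch, not verified against this exact setup): copy every weight from the pretrained model whose name and shape also exist in the target model's `state_dict`, and leave the rest (e.g. the U-Net decoder) at their fresh initialization. The helper below only depends on PyTorch; `source` and `target` stand in for your pretrained classifier and the new model.

```python
import torch
import torch.nn as nn

def copy_matching_weights(source, target):
    """Copy every tensor from source.state_dict() whose name and shape
    also appear in target.state_dict(); return the copied names."""
    src = source.state_dict()
    tgt = target.state_dict()
    copied = []
    for name, tensor in src.items():
        if name in tgt and tgt[name].shape == tensor.shape:
            tgt[name] = tensor.clone()
            copied.append(name)
    target.load_state_dict(tgt)
    return copied

# demo with two tiny stand-in models
source = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 2))
target = nn.Sequential(nn.Linear(4, 3), nn.ReLU(), nn.Linear(3, 2))
copied = copy_matching_weights(source, target)
print(copied)  # ['0.weight', '0.bias', '2.weight', '2.bias']
```

Note that this only works as-is if the encoder layers keep the same parameter names inside the new model; if fastai wraps or renames them, you would have to remap the keys first.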

Hi,

using unet_learner(dls, resnet34) will already give you a U-Net with a pretrained ResNet34 as backbone, but if you need a model pretrained on a different task, this Kaggle kernel gives you a walkthrough on how to build a custom U-Net with a pretrained backbone.

Otherwise, wrapping your pretrained model into a function will also load the pretrained weights.

Minimal example:

import torch
from torchvision.models import resnet50

def return_model(*args, **kwargs):
    "returns a pretrained resnet50, ignores all other parameters passed by unet_learner"
    return resnet50(pretrained=True)

learn = unet_learner(dls, return_model)

# check if the weights in the first layer are the same, returns tensor(1.) if they are
m = resnet50(pretrained=True)
torch.mean((list(learn.model.parameters())[0] == list(m.parameters())[0]).float())

This worked.

from functools import partial

def get_model(model, pretrained=True):
    "returns the given model; accepts `pretrained` because unet_learner passes it"
    return model

segment_learner = unet_learner(dls, partial(get_model, model=classif_learner.model), pretrained=True)

partial(get_model, model=classif_learner.model) --> Returns the model.
Hope this is correct :slight_smile:
How do I check if the model has taken the pretrained weights, rather than random ones?
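One way to check (a sketch, along the lines of the weight comparison shown earlier in the thread): compare a parameter tensor of the original classifier against the corresponding tensor in the new model. If the backbone was carried over, the encoder's first conv weights should be element-wise identical; a fresh random init would differ. The helper below only needs PyTorch and is demonstrated on tiny stand-in models:

```python
import torch
import torch.nn as nn

def first_layer_matches(model_a, model_b):
    """True if the first parameter tensor of model_a is element-wise
    identical to the first parameter tensor of model_b."""
    wa = next(model_a.parameters())
    wb = next(model_b.parameters())
    return wa.shape == wb.shape and bool(torch.equal(wa, wb))

# tiny stand-ins for the real learners
a = nn.Linear(4, 2)
b = nn.Linear(4, 2)                      # fresh random init -> should differ
b_copy = nn.Linear(4, 2)
b_copy.load_state_dict(a.state_dict())   # copied weights -> should match

print(first_layer_matches(a, b))         # False (random inits differ)
print(first_layer_matches(a, b_copy))    # True
```

With the learners from this thread, `first_layer_matches(classif_learner.model, segment_learner.model)` should return True if the U-Net took over the classifier's weights, assuming the encoder body comes first in both models' parameter order (which is the usual layout for cnn_learner and DynamicUnet, but worth confirming for your setup).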