How to train a model on one dataset, then fine-tune / transfer-learn it on another dataset?

Hello,

To explain the datasets I’m dealing with, I’ll use this terminology:

  1. A: the larger dataset. It contains several classes c1, c2, c3, …, c7, plus one other class called c~. This dataset can be viewed as a binary dataset, with c1, c2, …, c7 grouped into one class and c~ as the other.

  2. B: a smaller dataset containing data of class c3, which is very similar in appearance to the data in A. I want to build a model that acts as a binary classifier between c3 and c~.

ResNet has given the best results so far. Therefore I’m looking to train the entire ResNet on dataset A, since A is much bigger than B, and then use transfer learning to fine-tune the last layers on dataset B.

Most of the content I can find in forums is about taking a model pre-trained on ImageNet and using transfer learning to adapt it to your own dataset.

What I’m trying to do is train the entire ResNet50 on A, then fine-tune it on B. If you can guide me through this, I would be really thankful.


To train from scratch, you will use:

m = w125_resnet50()  # other model options are in fastai/models/resnet.py
bm = BasicModel(m.cuda(), name='resnet50')
learn = ConvLearner(data, bm)

You can then use the learn object just as you would a pretrained one, training with learn.fit(). You can freeze all convolutional layers with learn.freeze().


Thank you for the response!

This makes half the problem easy and that’s great.

For the next part, i.e., fine-tuning this very model on another dataset: what do you recommend should be done?

The final fc layer differs between the two tasks. How do I modify it in fastai?

To train the model on the new data, you can use:

learn.set_data(new_data)

If you want to train only the fc layer, use learn.freeze() before calling learn.fit() on the new data. If you want to train all layers (i.e., full fine-tuning), just call learn.fit().

As for Charm’s question, I believe learn.set_data() takes care of changing the final fc layer behind the scenes.
