Duplicate fully connected layers and train the model with the new duplicated layers only

I am trying to create two branches in the network, as shown in the picture. L1 are the layers in the original model, and L2 are fully connected layers cloned from L1. Can I add the layers with nn.Sequential? Also, how can I train only the L2 layers while keeping the L1 layers untrained? In other words, how can I train the network with either the L1 branch or the L2 branch separately?
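Something like this is what I have in mind, in plain PyTorch (a rough sketch; the variable names are just mine):

    import copy
    import torch
    import torch.nn as nn

    # Original fully connected branch (L1)
    l1 = nn.Sequential(
        nn.Linear(1280, 512), nn.Tanh(),
        nn.Linear(512, 256), nn.Tanh(),
        nn.Linear(256, 128), nn.Tanh(),
        nn.Linear(128, 200),
    )

    # Clone it to get the second branch (L2); deepcopy copies the weights too
    l2 = copy.deepcopy(l1)
    l2[6] = nn.Linear(128, 29)  # replace the head to get 29 outputs

    # Both branches consume the same 1280-dim features, so they must be
    # called separately in forward() rather than chained in one Sequential
    features = torch.randn(4, 1280)
    out1, out2 = l1(features), l2(features)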

I have tried putting two separate classifiers in the model like this:

(classifier): Sequential(
    (0): Sequential(
      (0): Linear(in_features=1280, out_features=512, bias=True)
      (1): Tanh()
      (2): Linear(in_features=512, out_features=256, bias=True)
      (3): Tanh()
      (4): Linear(in_features=256, out_features=128, bias=True)
      (5): Tanh()
      (6): Linear(in_features=128, out_features=200, bias=True)
    )
    (1): Sequential(
      (0): Linear(in_features=1280, out_features=512, bias=True)
      (1): Tanh()
      (2): Linear(in_features=512, out_features=256, bias=True)
      (3): Tanh()
      (4): Linear(in_features=256, out_features=128, bias=True)
      (5): Tanh()
      (6): Linear(in_features=128, out_features=29, bias=True)
    )
  )

How can I skip classifier[0] or classifier[1] while training the model?

You don’t “skip” them; you freeze those layers via a custom splitter, similar to how we freeze the ResNet weights during the initial training phase of transfer learning.
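In plain PyTorch terms, freezing just means turning off gradients for the branch you don’t want to train, for example (a minimal sketch, assuming model.classifier is the Sequential you posted):

    import torch

    # Freeze the first branch so only classifier[1] receives updates
    for p in model.classifier[0].parameters():
        p.requires_grad = False

    # Hand the optimizer only the parameters that still require gradients
    optimizer = torch.optim.Adam(
        (p for p in model.parameters() if p.requires_grad), lr=1e-3
    )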


You mean the custom splitter from this tutorial?
Link

If you’re using v2, yes; see the sketch below. For v1, I don’t remember exactly how to do it.
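From memory, a v2 splitter just returns the parameter groups you want fastai to handle separately, roughly like this (a sketch, not tested; it assumes your backbone lives in model.features and that dls and model already exist):

    from fastai.vision.all import *

    # One parameter group per part we want to freeze/unfreeze independently
    def branch_splitter(model):
        return [params(model.features),       # backbone
                params(model.classifier[0]),  # L1 branch
                params(model.classifier[1])]  # L2 branch

    learn = Learner(dls, model, splitter=branch_splitter)
    learn.freeze_to(2)  # freeze the first two groups; train only the L2 branch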

Thank you. I am using v1.