Train a few neurons in the last fully connected layers

Is it possible to train only a few neurons in the last fully connected layers of a model? For example, if the last two layers have 256 and 128 neurons and the output has 10 nodes, is it possible to train the last 128 neurons of the 256-neuron layer, the last 64 neurons of the 128-neuron layer, and all 10 output neurons?

Yes, you can! This idea is called dropout. You set a dropout rate; a rate of 0.5 means that on each training step a random 50% of the nodes are dropped, so only the remaining nodes are updated at a time.
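In PyTorch terms it looks something like this (a minimal sketch; the layer sizes follow the example above, and `nn.Dropout` is the standard module):

```python
import torch.nn as nn

# Dropout randomly zeroes activations during training; with p=0.5,
# each neuron has a 50% chance of being dropped on any given step.
head = nn.Sequential(
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # active in train() mode, disabled in eval()
    nn.Linear(128, 10),
)
```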
Hope this helps!

But dropout drops random neurons, not specific ones, right?

Yes. But why would you need to train specific neurons? How would you know which neurons should be trained and which shouldn't? The way I understand it, individual neurons represent fairly arbitrary features that don't make much sense on their own; it's the collective aggregate of neurons that gives a layer its meaning. So how would you decide which neurons to train?

I want to see the effects of training a few specific neurons in the model.

You can't do that in fastai. If you want to see the effect of training a model with only half its capacity, I'd suggest simply creating a new model with half the hidden neurons.

See this discussion here:

It's PyTorch, but you can still use it in fastai (probably with some effort, since it's nitty-gritty). @PalaashAgrawal it gets tricky because simply setting a full layer to non-trainable (which is what fastai does) isn't quite what he's describing :slight_smile: (nor is simply splitting the layer in half, though that is a good idea!)
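For reference, one common technique from threads like that is to mask the gradients of the neurons you want frozen using a hook. This is a rough sketch, not necessarily the exact approach in the linked discussion; the 512 input features are an arbitrary assumption, and the slicing follows the 256/128/10 example from the question:

```python
import torch.nn as nn

# Head matching the example in the question: ...-256-128-10
head = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),  # 512 input features assumed for illustration
    nn.Linear(256, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

def freeze_first_rows(param, n_frozen):
    """Zero the gradient for the first `n_frozen` output neurons,
    so only the remaining neurons of this layer get updated."""
    def hook(grad):
        grad = grad.clone()   # avoid modifying the gradient in place
        grad[:n_frozen] = 0
        return grad
    param.register_hook(hook)

# Train only the last 128 neurons of the 256-unit layer...
freeze_first_rows(head[0].weight, 128)
freeze_first_rows(head[0].bias, 128)
# ...and the last 64 neurons of the 128-unit layer.
freeze_first_rows(head[2].weight, 64)
freeze_first_rows(head[2].bias, 64)
# All 10 output neurons stay trainable.
```

One caveat: optimizers with weight decay or momentum can still move the masked weights even when their gradients are zero, so for a strict experiment you may want to snapshot those weights and restore them after each step.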


Oh yes, I got that.
I meant that creating a model with half the neurons per layer (so instead of a …-256-128-10 model, a …-128-64-10 model) might be a solution, if that serves the idea @awais980 was looking for. Though yes, the link you provided gives a better, more generalized solution.
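Something like this, as a rough sketch (the 512 input features are an arbitrary assumption):

```python
import torch.nn as nn

# Half-capacity head: ...-128-64-10 instead of ...-256-128-10
half_head = nn.Sequential(
    nn.Linear(512, 128), nn.ReLU(),  # 512 input features assumed for illustration
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),
)
```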
Sorry for any misunderstanding. Cheers!
