Hi,

is it possible to get the number of weights used by the classifier?

I tried using learn.model which gives me something like this:

SequentialRNN(
  (0): MultiBatchRNN(
    (encoder): Embedding(80002, 400, padding_idx=1)
    (encoder_with_dropout): EmbeddingDropout(
      (embed): Embedding(80002, 400, padding_idx=1)
    )
    (rnns): ModuleList(
      (0): WeightDrop(
        (module): LSTM(400, 1150)
      )
      (1): WeightDrop(
        (module): LSTM(1150, 1150)
      )
      (2): WeightDrop(
        (module): LSTM(1150, 400)
      )
    )
    (dropouti): LockedDropout()
    (dropouths): ModuleList(
      (0): LockedDropout()
      (1): LockedDropout()
      (2): LockedDropout()
    )
  )
  (1): PoolingLinearClassifier(
    (layers): ModuleList(
      (0): LinearBlock(
        (lin): Linear(in_features=1200, out_features=50, bias=True)
        (drop): Dropout(p=0.2)
        (bn): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True)
      )
      (1): LinearBlock(
        (lin): Linear(in_features=50, out_features=39, bias=True)
        (drop): Dropout(p=0.1)
        (bn): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True)
      )
    )
  )
)

From this I can read off the dimensions of the individual weight matrices and sum them to get the total number of weights. Is this approach correct, or is there a better one?
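(For anyone landing here with the same question: you don't have to read the dimensions off the printout by hand. Since learn.model is an ordinary PyTorch nn.Module, you can iterate over its parameters and sum their element counts. A minimal sketch, using a toy model in place of learn.model:)

```python
import torch.nn as nn

# Toy model standing in for learn.model; the same two lines
# work unchanged on the full SequentialRNN.
model = nn.Sequential(
    nn.Embedding(100, 16),   # 100 * 16        = 1600 weights
    nn.Linear(16, 8),        # 16 * 8 + 8 bias =  136 weights
)

# Total number of parameters (weights + biases)
total = sum(p.numel() for p in model.parameters())

# Only the trainable ones (useful when some layers are frozen,
# as they are during staged ULMFiT fine-tuning)
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)

print(total, trainable)  # 1736 1736 for this toy model
```

Note that `total` counts biases and BatchNorm parameters as well, so it will be slightly larger than a hand count of the weight matrices alone.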

Best,

René