Lesson 4 IMDB text classifier not returning softmax results?

Hey Guys,

I’m trying to use the lesson 4 IMDB notebook as a template for the Spooky Author Kaggle comp.

At the end, I need my predictions to be softmax probabilities.

The last module in lesson 4 IMDB uses a PoolingLinearClassifier as its final layer, which is defined as follows:

(1): PoolingLinearClassifier(
  (layers): ModuleList(
    (0): LinearBlock(
      (lin): Linear(in_features=600, out_features=3)
      (drop): Dropout(p=0.1)
      (bn): BatchNorm1d(600, eps=1e-05, momentum=0.1, affine=True)
    )
  )
)

Does this mean the final prediction is the output of the batchNorm sublayer?

In order to get a softmax result, should I just add an nn.LogSoftmax layer after the pooling classifier?
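For context, here’s roughly what LinearBlock looks like in the fastai source (paraphrased from memory, so double-check against your version). Note that the printed order above is just the order the submodules were registered in, not the order they run in forward:

import torch.nn as nn

class LinearBlock(nn.Module):
    def __init__(self, ni, nf, drop):
        super().__init__()
        self.lin = nn.Linear(ni, nf)
        self.drop = nn.Dropout(drop)
        self.bn = nn.BatchNorm1d(ni)

    def forward(self, x):
        # forward order is bn -> drop -> lin, so the block's output
        # comes from the Linear layer (raw logits), not from BatchNorm
        return self.lin(self.drop(self.bn(x)))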


I think I successfully added softmax to the model:

def get_rnn_classifer_softmax(  bptt, max_seq, n_class, n_tok, emb_sz, n_hid, n_layers, pad_token, layers, drops, bidir=False,
                                dropouth=0.3, dropouti=0.5, dropoute=0.1, wdrop=0.5):
    # same as fastai's get_rnn_classifer, but with a log-softmax module appended
    rnn_enc = MultiBatchRNN(bptt, max_seq, n_tok, emb_sz, n_hid, n_layers, pad_token=pad_token, bidir=bidir,
                            dropouth=dropouth, dropouti=dropouti, dropoute=dropoute, wdrop=wdrop)
    return SequentialRNN(rnn_enc, PoolingLinearClassifier(layers, drops), MySoftmax(1))

class MySoftmax(nn.LogSoftmax):
    def forward(self, input):
        # PoolingLinearClassifier returns a (logits, raw_outputs, outputs) tuple;
        # run log_softmax on the logits only and pass the other two through
        return F.log_softmax(input[0], self.dim, _stacklevel=5), input[1], input[2]

I was running into issues because log_softmax expects a Variable, while PoolingLinearClassifier returns a tuple. So I created my own softmax module that sends the first element of the tuple (the logits) through log_softmax and preserves the other two elements.
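One thing to keep in mind for the comp: log_softmax returns log-probabilities, so you have to exponentiate to get actual softmax probabilities for the submission file. A quick sanity check along these lines (learn here is just the RNN_Learner from my notebook; adjust to your setup):

import numpy as np

log_probs = learn.predict()   # runs the model over the validation set
probs = np.exp(log_probs)     # log_softmax gives log-probs, so exp() recovers probabilities
print(probs.sum(axis=1))      # each row should sum to ~1 if the softmax is wired up right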


The other strange thing is that n_class is not used.
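As far as I can tell, that’s because the output size actually comes from the layers argument: PoolingLinearClassifier just stacks a LinearBlock over each consecutive pair of entries in that list (again paraphrasing the fastai source from memory):

self.layers = nn.ModuleList([
    LinearBlock(layers[i], layers[i + 1], drops[i])
    for i in range(len(layers) - 1)])

So the last entry of layers is what sets the number of output classes, and n_class in the signature looks vestigial.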
