Using pack_padded_sequence when padding LSTM inputs

Hey,

I saw some recommendations to use pack_padded_sequence when padding LSTM inputs to make sure the padding won’t affect the LSTM output.
Is it necessary? Is anyone using it?

Update: I built a text classifier similar to the IMDB classifier. The predictions were different when padding was added to the text. This makes the results inconsistent, e.g. predicting a batch of texts together will give different results than predicting each one separately.
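
Here's a minimal way to see the effect (layer sizes here are arbitrary): with a plain nn.LSTM, the zero padding steps keep updating the hidden state, so the final state depends on how much padding was added.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(1, 5, 8)                                # one sequence of length 5
x_padded = torch.cat([x, torch.zeros(1, 3, 8)], dim=1)  # same sequence + 3 pad steps

_, (h, _) = lstm(x)
_, (h_padded, _) = lstm(x_padded)

# The final hidden states differ: the zero padding steps still update the state.
print(torch.allclose(h, h_padded))  # False
```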

Thanks!

Can you share your experience?
I used pack_padded_sequence and it seems to work well.
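Roughly like this (just a sketch, not my exact code; the sizes are made up):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

# batch of 2 padded sequences; true lengths are 6 and 4
batch = torch.randn(2, 6, 8)
batch[1, 4:] = 0                 # pretend these steps are padding
lengths = torch.tensor([6, 4])   # must be sorted descending by default

packed = pack_padded_sequence(batch, lengths, batch_first=True)
packed_out, (h, c) = lstm(packed)

# h is the state at each sequence's *true* last step, so padding never
# leaks into it; unpack if you need the per-step outputs back
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
```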

I’m still working on it. Did you use MultiBatchRNN?
It seems a bit tricky to combine the two.

Nope, I wrote a custom RNN module to work with credit card transactions in the Home Credit challenge.
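Roughly this shape (just a sketch of the general idea, not the actual code; the feature and class counts are made up):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence

class TransactionRNN(nn.Module):
    """Encode variable-length transaction sequences; padding is packed away."""
    def __init__(self, n_features=12, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x, lengths):
        # enforce_sorted=False lets the batch arrive in any order
        packed = pack_padded_sequence(x, lengths, batch_first=True,
                                      enforce_sorted=False)
        _, (h, _) = self.lstm(packed)
        return self.head(h[-1])  # final hidden state of each true sequence
```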


Thanks. I’m not really sure how @jeremy ignored the padding. It affects the average pooling, for example.
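
If you pool the LSTM outputs with a plain mean over the time dimension, the pad steps get averaged in. One workaround is to mask by the true lengths, something like (just a sketch):

```python
import torch

def masked_mean_pool(out, lengths):
    """Mean over time that ignores padded positions.
    out: (batch, seq_len, hidden) padded outputs; lengths: (batch,) true lengths."""
    mask = (torch.arange(out.size(1), device=out.device)[None, :]
            < lengths[:, None])                     # (batch, seq_len) bool
    summed = (out * mask.unsqueeze(-1)).sum(dim=1)  # zero out the pad steps
    return summed / lengths.unsqueeze(-1).to(out.dtype)
```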