Using pack_padded_sequence when padding LSTM inputs


#1

Hey,

I saw some recommendations to use pack_padded_sequence when padding LSTM inputs to make sure the padding won’t affect the LSTM output.
Is it necessary? Is anyone using it?

Update: I built a text classifier similar to the IMDB classifier. The predictions were different when padding was added to the text, which makes the results inconsistent: e.g. predicting a batch of texts together gives different results than predicting each one separately.
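
For anyone landing here, this is roughly the pattern I mean (a minimal sketch; the dimensions, tensors, and model sizes are made up for illustration):

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Hypothetical sizes, just for the example
vocab_size, emb_dim, hidden_dim = 100, 32, 64
embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

# A batch of two sequences padded with 0 to the same length
batch = torch.tensor([[4, 7, 9, 2, 5],
                      [3, 8, 0, 0, 0]])  # second sequence has 3 real tokens
lengths = torch.tensor([5, 3])

packed = pack_padded_sequence(embedding(batch), lengths,
                              batch_first=True, enforce_sorted=False)
packed_out, (h_n, c_n) = lstm(packed)

# h_n[-1] is the last layer's hidden state at each sequence's true final
# timestep, so the trailing pads never influence it
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
```

Without packing, the LSTM keeps stepping over the pad tokens, so the final hidden state (and anything pooled from the outputs) depends on how much padding the batch happens to have.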

Thanks!


(Kien Vu) #2

Can you share your experience?
I used pack_padded_sequence and it seems to work well.


#3

I’m still working on it. Did you use MultiBatchRNN?
It seems a bit tricky to combine the two.


(Kien Vu) #4

Nope, I wrote a custom RNN module to work with credit card transactions in the Home Credit challenge.


#5

Thanks. I’m not really sure how @jeremy ignored the padding; it affects the average pooling, for example.
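
One way I can think of to keep the pads out of the average is to mask them out explicitly. A sketch of that idea (not what @jeremy actually does, just an assumption about how it could work):

```python
import torch

def masked_mean_pool(out, lengths):
    """Average RNN outputs over the true timesteps only.

    out:     (batch, max_len, hidden) padded output tensor
    lengths: (batch,) true sequence lengths
    """
    max_len = out.size(1)
    # mask[i, t] is 1 while t is a real timestep of sequence i, else 0
    mask = (torch.arange(max_len, device=out.device)[None, :]
            < lengths[:, None]).unsqueeze(-1).float()
    # Zero out the pad positions, then divide by each true length
    return (out * mask).sum(dim=1) / lengths[:, None].float()
```

With a plain `out.mean(dim=1)` the pad timesteps are averaged in, so the same text pools to a different vector depending on how much padding its batch has.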