Overfitting in RNN

I am trying to fit an RNN on a dataset to predict whether a user is going to buy a premium membership (or some similar objective) given the customer’s sequence of activity.


It looks like my model starts overfitting quickly, after a couple of epochs. I have tried a few of the following things:

  1. Increased batch_size
  2. Decreased the size of the embedding and RNN layers (to reduce complexity)
  3. Changed the learning rate
  4. Removed bidirectionality in the RNN layer
  5. Added a Dropout layer [edited in later]

What else can I try?

PS: I am using very basic keras code to build this model.
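For reference, items 1 and 3 from the list above are set at compile/fit time rather than in the layer stack. A minimal sketch of what that looks like in Keras (the vocabulary size, layer sizes, learning rate, and batch size here are illustrative assumptions, not the actual values):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
from tensorflow.keras.optimizers import Adam

# tiny stand-in model; sizes are placeholders for the sketch
model = Sequential([
    Embedding(input_dim=1000, output_dim=16),
    LSTM(16),
    Dense(1, activation='sigmoid'),
])

# item 3: an explicit (lowered) learning rate instead of Adam's default 1e-3
model.compile(optimizer=Adam(learning_rate=1e-4), loss='binary_crossentropy')

# toy data standing in for the padded activity sequences and labels
X = np.random.randint(0, 1000, size=(64, 20))
y = np.random.randint(0, 2, size=(64,))

# item 1: a larger batch_size is passed to fit()
history = model.fit(X, y, batch_size=64, epochs=1, verbose=0)
```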

How do you split your data into training and validation sets? Any chance it’s not random?


I have Dropout(0.5) at the end. Sorry, forgot to mention it!

Well, I tried different splits. I also swapped the training and test sets.

How big is your dataset?
And how many hidden layers do you have? You can try reducing the number of layers or their size. Not sure it’s going to help, but worth a try.

  • The training set has 3168 examples and the test set has 1344. The mean sequence length is 100, with a max of 700; most sequences are shorter than 100.

  • I have reduced the complexity of the model. All of my variations use one hidden layer. One of my runs uses the following code to declare the model in Keras:

          from keras.models import Sequential
          from keras.layers import Embedding, LSTM, Dropout, Dense

          max_features = 24518
          # i will do sequence padding with maxlen
          n_epochs = 10
          model = Sequential()
          # sizes below are placeholders; the original Embedding line was cut off
          model.add(Embedding(input_dim=max_features, output_dim=32))
          model.add(LSTM(32))      # single hidden recurrent layer
          model.add(Dropout(0.5))  # the Dropout(0.5) mentioned above
          model.add(Dense(1, activation='sigmoid'))
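Given the length stats above (mean ≈ 100, max 700), the sequence padding the code comment refers to would look something like this. Choosing maxlen = 100 is an assumption here; it matches the mean length but truncates the long tail of sequences:

```python
from tensorflow.keras.preprocessing.sequence import pad_sequences

# toy activity sequences of varying length standing in for the real data
sequences = [[5, 12, 7], list(range(150)), [3]]

# pads short sequences with zeros and truncates long ones; maxlen=100 is
# an assumed value based on the mean length mentioned above
padded = pad_sequences(sequences, maxlen=100)
print(padded.shape)  # (3, 100)
```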