Calling language_model_learner with the config parameter kills the notebook by running out of memory

Has anyone noticed that calling language_model_learner with the config parameter kills the notebook by allocating a huge amount of memory?

# I used default values except emb_sz
config = dict(emb_sz=300, 
              n_hid=1150,
              n_layers=1,
              pad_token=1, 
              qrnn=False, 
              bidir=False, 
              output_p=0.1,
              hidden_p=0.1,
              input_p=0.2, 
              embed_p=0.02, 
              weight_p=0.15, 
              tie_weights=True,
              out_bias=True
             )
learn = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.5, pretrained=False, config=config)
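
For what it's worth, a quick way to see where a hand-written dict drifts from the library defaults is to diff it against the default LM config. This is only a sketch and assumes a fastai v1 install where the default is exposed as awd_lstm_lm_config:

from fastai.text import *   # fastai v1; should bring awd_lstm_lm_config into scope

# print every key whose value differs between my dict and the library default
for key, default in awd_lstm_lm_config.items():
    if config.get(key) != default:
        print(key, '-> mine:', config.get(key), '| default:', default)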

But without the config parameter it works:

learn = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.5, pretrained=False)
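
For completeness, here is a minimal sketch of an alternative, assuming the same fastai v1 setup (awd_lstm_lm_config available and data_lm being the existing DataBunch): copy the library default and override only emb_sz, so every other value is guaranteed to match the defaults of the installed version.

from fastai.text import *              # fastai v1; data_lm is the existing DataBunch

config = awd_lstm_lm_config.copy()     # copy so the library default dict stays untouched
config['emb_sz'] = 300                 # the only value I actually want to change

learn = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.5,
                               pretrained=False, config=config)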

Has anyone faced a similar issue?

Late getting to this. If you have not solved it already, have a look at the end of this: