Implementation of Transformer LSTM

Hello, I need some help implementing QRNN, Transformer, and TransformerXL to create language and classification models for a text classification problem.
While implementing the Transformer, I am getting the following error:
learn = language_model_learner(data_lm, Transformer, drop_mult=0.5, pretrained=False)

KeyError                                  Traceback (most recent call last)
in ()
----> 2 learn = language_model_learner(data_lm, Transformer, drop_mult=0.5, pretrained=False)
      3 learn.fit_one_cycle(1, 1e-2)

1 frames
/usr/local/lib/python3.6/dist-packages/fastai/text/ in get_language_model(arch, vocab_sz, config, drop_mult)
    191     for k in config.keys():
    192         if k.endswith('_p'): config[k] *= drop_mult
--> 193     tie_weights,output_p,out_bias = map(config.pop, ['tie_weights', 'output_p', 'out_bias'])
    194     init = config.pop('init') if 'init' in config else None
    195     encoder = arch(vocab_sz, **config)

KeyError: 'tie_weights'

The same thing is happening with TransformerXL as well.

Also, if anyone has pre-trained language model weights for these three, please share them if you can. I am new to this forum, so please don't mind if my question is trivial.

Did you ever figure out the solution? I am having the same problem, but with AWD_LSTM.

Actually, it worked fine when I restarted the kernel.
I changed flags based on the discussion here for QRNN and pre-trained settings.
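For anyone hitting this later: restarting the kernel probably works because, judging from the traceback, get_language_model pops 'tie_weights', 'output_p', and 'out_bias' out of the config dict in place. The first run of the cell mutates the shared default config, so the second run in the same kernel raises the KeyError. A minimal pure-Python sketch of the mechanism (no fastai required; pop_lm_keys is just a stand-in for line 193 of the traceback):

```python
def pop_lm_keys(config):
    # Mimics traceback line 193: map(config.pop, [...]) removes the
    # keys from the dict as a side effect of reading them.
    return list(map(config.pop, ['tie_weights', 'output_p', 'out_bias']))

# Stand-in for a shared default config dict that lives across cell re-runs.
default_config = {'tie_weights': True, 'output_p': 0.1, 'out_bias': True}

pop_lm_keys(default_config)        # first run of the cell: works fine
try:
    pop_lm_keys(default_config)    # second run, same kernel: KeyError
except KeyError as e:
    print('KeyError:', e)

# Workaround without restarting: hand over a fresh copy on every call,
# so the pops never touch the shared default.
fresh = {'tie_weights': True, 'output_p': 0.1, 'out_bias': True}
pop_lm_keys(fresh.copy())
pop_lm_keys(fresh.copy())          # still fine on the second call
```

If your fastai version exposes the default config dicts (e.g. tfmer_lm_config in v1), passing config=tfmer_lm_config.copy() to language_model_learner should have the same effect as the kernel restart.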
