Pretrain_lm error


(Amitabha) #1

when i run pretrain_lm.py
dir_path data/en_data; cuda_id 0; cl 12; bs 64; backwards False; lr 0.001; sampled True; pretrain_id
Traceback (most recent call last):
File "pretrain_lm.py", line 53, in <module>
if __name__ == '__main__': fire.Fire(train_lm)
File "/home/yhl/anaconda3/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 127, in Fire
component_trace = _Fire(component, args, context, name)
File "/home/yhl/anaconda3/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 366, in _Fire
component, remaining_args)
File "/home/yhl/anaconda3/envs/fastai/lib/python3.6/site-packages/fire/core.py", line 542, in _CallCallable
result = fn(*varargs, **kwargs)
File "pretrain_lm.py", line 42, in train_lm
learner,crit = get_learner(drops, 15000, sampled, md, em_sz, nh, nl, opt_fn, tprs)
File "/home/yhl/fastai/courses/dl2/imdb_scripts/sampled_sm.py", line 85, in get_learner
m = to_gpu(get_language_model(md.n_tok, em_sz, nhid, nl, md.pad_idx, decode_train=False, dropouts=drops))
File "/home/yhl/fastai/courses/dl2/imdb_scripts/sampled_sm.py", line 46, in get_language_model
rnn_enc = RNN_Encoder(n_tok, em_sz, n_hid=nhid, n_layers=nlayers, pad_token=pad_token, dropouti=dropouts[0], wdrop=dropouts[2], dropoute=dropouts[3], dropouth=dropouts[4])
TypeError: __init__() got an unexpected keyword argument 'n_hid'


(David Ebbevi) #2

I had the same issue.

The keyword argument in RNN_Encoder was renamed from nhid to n_hid in a recent bugfix (about two weeks ago). That change has not been released as a pip package yet, so "pip install" still gives you the old version without the rename. I would suggest installing the package from source using the setup.py in the git repo; that worked for me.
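In case it helps, an install-from-source sketch along the lines described above (assuming the standard fastai GitHub repo; adjust the clone path to wherever you keep your checkout):

```shell
# Clone the fastai repo (or pull the latest commits in an existing checkout)
git clone https://github.com/fastai/fastai.git
cd fastai

# Install from source into the active conda/virtualenv environment.
# "-e" (editable mode) runs setup.py so the installed package tracks
# the git checkout, picking up the nhid -> n_hid rename.
pip install -e .
```

After this, the import in sampled_sm.py should resolve to the git version of RNN_Encoder rather than the stale PyPI release.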