Error when fitting language model


(Bruce Yang) #1

Hi guys, I’m getting the error below when fitting a language model for a mini subset of IMDB:

The code:
data = text_data_from_folder('.', train='train', valid='valid')
learn = RNNLearner.language_model(data, drop_mult=0.5)
learn.fit_one_cycle(1, 1e-2)

Error stack trace:

TypeError Traceback (most recent call last)
in
1 learn = RNNLearner.language_model(data, drop_mult=0.5)
----> 2 learn.fit_one_cycle(1, 1e-2)

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/train.py in fit_one_cycle(learn, cyc_len, max_lr, moms, div_factor, pct_start, wd, **kwargs)
16 cbs = [OneCycleScheduler(learn, max_lr, moms=moms, div_factor=div_factor,
17 pct_start=pct_start, **kwargs)]
---> 18 learn.fit(cyc_len, max_lr, wd=wd, callbacks=cbs)
19
20 def lr_find(learn:Learner, start_lr:float=1e-5, end_lr:float=10, num_it:int=100, **kwargs:Any):

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/basic_train.py in fit(self, epochs, lr, wd, callbacks)
131 callbacks = [cb(self) for cb in self.callback_fns] + listify(callbacks)
132 fit(epochs, self.model, self.loss_fn, opt=self.opt, data=self.data, metrics=self.metrics,
--> 133 callbacks=self.callbacks+callbacks)
134
135 def create_opt(self, lr:Floats, wd:Floats=0.)->None:

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/basic_train.py in fit(epochs, model, loss_fn, opt, data, callbacks, metrics)
84 except Exception as e:
85 exception = e
---> 86 raise e
87 finally: cb_handler.on_train_end(exception)
88

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/basic_train.py in fit(epochs, model, loss_fn, opt, data, callbacks, metrics)
70 for xb,yb in progress_bar(data.train_dl, parent=pbar):
71 xb, yb = cb_handler.on_batch_begin(xb, yb)
---> 72 loss,_ = loss_batch(model, xb, yb, loss_fn, opt, cb_handler)
73 if cb_handler.on_batch_end(loss): break
74

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/fastai/basic_train.py in loss_batch(model, xb, yb, loss_fn, opt, cb_handler, metrics)
20 if not is_listy(xb): xb = [xb]
21 if not is_listy(yb): yb = [yb]
—> 22 out = model(*xb)
23 out = cb_handler.on_loss_begin(out)
24 if not loss_fn: return out.detach(),yb[0].detach()

/data/brucey/anaconda3/envs/fastai/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
475 result = self._slow_forward(*input, **kwargs)
476 else:
--> 477 result = self.forward(*input, **kwargs)
478 for hook in self._forward_hooks.values():
479 hook_result = hook(self, input, result)

TypeError: forward() takes 2 positional arguments but 69 were given

The error message suggests the input tensors got unpacked before being passed to nn.Module.forward(), but I'm not sure which module it comes from, since RNNCore contains many submodules.
Is there a workaround or quick fix? Thanks.
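For context, the "2 positional arguments but 69 were given" shape of the error can be reproduced with plain Python, independent of fastai or PyTorch. This is a hypothetical toy class (`Toy` is not from either library) showing how star-unpacking a 68-element sequence into a one-argument forward() yields exactly this TypeError (68 unpacked values plus self makes 69):

```python
class Toy:
    def forward(self, x):  # expects exactly one input, e.g. one batch tensor
        return x

batch = list(range(68))  # imagine 68 separate tensors instead of one batch
try:
    Toy().forward(*batch)  # unpacks 68 positional args; +self makes 69
except TypeError as e:
    # message contains: "takes 2 positional arguments but 69 were given"
    print(e)
```

So the count in the error is a hint that a whole batch (or list of texts) was splatted into forward() instead of being passed as a single tensor.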


#2

That’s because you didn’t specify a method for text_data_from_folder to group the texts. For a language model, you should pass data_func=lm_data in the arguments there (as in the examples in the docs).
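With that change, the data call from the question would look something like this (a sketch based on the answer above; `text_data_from_folder` and `lm_data` are from the pre-1.0 fastai API, so the exact signature may differ in your installed version):

```python
# group the texts for language modeling instead of classification
data = text_data_from_folder('.', data_func=lm_data, train='train', valid='valid')
learn = RNNLearner.language_model(data, drop_mult=0.5)
learn.fit_one_cycle(1, 1e-2)
```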


(Bruce Yang) #3

Oh, doh! Thanks. I should have read the docs more carefully…