TypeError: parameters() missing 1 required positional argument: 'self'

In chapter 12 I tried to replace the self-developed LMModel7 with the AWD_LSTM model and ran into a TypeError while running fit_one_cycle().
Here is the code snippet:

learn = TextLearner(dls, AWD_LSTM, loss_func=CrossEntropyLossFlat(), metrics=accuracy)

The dls is still unchanged.

Is there any hint that could help me? This is the full error message I get:

----> 3 learn.fit_one_cycle(15)

2 frames

/usr/local/lib/python3.7/dist-packages/fastai/callback/schedule.py in fit_one_cycle(self, n_epoch, lr_max, div, div_final, pct_start, wd, moms, cbs, reset_opt)
    105     moms=None, cbs=None, reset_opt=False):
    106     "Fit self.model for n_epoch using the 1cycle policy."
--> 107     if self.opt is None: self.create_opt()
    108     self.opt.set_hyper('lr', self.lr if lr_max is None else lr_max)
    109     lr_max = np.array([h['lr'] for h in self.opt.hypers])

/usr/local/lib/python3.7/dist-packages/fastai/learner.py in create_opt(self)
    147     def _bn_bias_state(self, with_bias): return norm_bias_params(self.model, with_bias).map(self.opt.state)
    148     def create_opt(self):
--> 149         self.opt = self.opt_func(self.splitter(self.model), lr=self.lr)
    150         if not self.wd_bn_bias:
    151             for p in self._bn_bias_state(True): p['do_wd'] = False

/usr/local/lib/python3.7/dist-packages/fastai/torch_core.py in trainable_params(m)
    602 def trainable_params(m):
    603     "Return all trainable parameters of m"
--> 604     return [p for p in m.parameters() if p.requires_grad]
    606 # Cell

TypeError: parameters() missing 1 required positional argument: 'self'


You need to pass TextLearner a model instance. AWD_LSTM is a class you call to create a model, but it isn't a model by itself.

You should be able to do something like this (with your real values):

model = AWD_LSTM(vocab_sz=10, emb_sz=10, n_hid=10, n_layers=10)
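The root cause is visible in the last frame of the traceback: trainable_params calls m.parameters(), but here m is the AWD_LSTM class itself rather than an instance, so Python reports that self is missing. A minimal plain-Python sketch (with a hypothetical TinyModel, not fastai code) reproduces the same error:

```python
class TinyModel:
    def parameters(self):
        return [1.0, 2.0]

def trainable_params(m):
    # mirrors what fastai's trainable_params does with the model it receives
    return [p for p in m.parameters()]

print(trainable_params(TinyModel()))  # works: called on an instance

try:
    trainable_params(TinyModel)  # the class itself was passed, as in the error above
except TypeError as e:
    print(e)  # parameters() missing 1 required positional argument: 'self'
```

Instantiating the model before handing it to the learner is what makes m.parameters() a bound method call.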

It's working. Thanks!
I still had to change TextLearner to Learner, because I didn't use a pretrained model (data from chapter 12, Human numbers).