Hello,
in Chapter 12 I am trying to replace the self-developed LMModel7 with the AWD_LSTM model, and I run into a TypeError when calling fit_one_cycle().
Here is the code snippet:
learn = TextLearner(dls, AWD_LSTM, loss_func=CrossEntropyLossFlat(), metrics=accuracy)
learn.fit_one_cycle(15)
The dls is still unchanged. Do you have any hint that could help me?
This is the full error message I get:
----> 3 learn.fit_one_cycle(15)

2 frames
/usr/local/lib/python3.7/dist-packages/fastai/callback/schedule.py in fit_one_cycle(self, n_epoch, lr_max, div, div_final, pct_start, wd, moms, cbs, reset_opt)
    105     moms=None, cbs=None, reset_opt=False):
    106     "Fit `self.model` for `n_epoch` using the 1cycle policy."
--> 107     if self.opt is None: self.create_opt()
    108     self.opt.set_hyper('lr', self.lr if lr_max is None else lr_max)
    109     lr_max = np.array([h['lr'] for h in self.opt.hypers])

/usr/local/lib/python3.7/dist-packages/fastai/learner.py in create_opt(self)
    147     def _bn_bias_state(self, with_bias): return norm_bias_params(self.model, with_bias).map(self.opt.state)
    148     def create_opt(self):
--> 149         self.opt = self.opt_func(self.splitter(self.model), lr=self.lr)
    150         if not self.wd_bn_bias:
    151             for p in self._bn_bias_state(True): p['do_wd'] = False

/usr/local/lib/python3.7/dist-packages/fastai/torch_core.py in trainable_params(m)
    602 def trainable_params(m):
    603     "Return all trainable parameters of `m`"
--> 604     return [p for p in m.parameters() if p.requires_grad]
    605
    606 # Cell

TypeError: parameters() missing 1 required positional argument: 'self'
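My suspicion is that the problem is not fit_one_cycle() itself but that I pass AWD_LSTM as a bare class, so trainable_params() ends up calling parameters() on the class instead of on an instance. Here is a minimal toy reproduction of the same TypeError outside fastai (ToyModel is just a stand-in I made up, not fastai code):

```python
# Toy stand-in for a model class; only illustrates the class-vs-instance issue
class ToyModel:
    def parameters(self):
        return ["weight", "bias"]

# Passing the CLASS: parameters() has no instance bound to it,
# so Python raises a TypeError about the missing 'self' argument
arch = ToyModel
try:
    [p for p in arch.parameters()]
except TypeError as e:
    print("class:", e)

# Passing an INSTANCE works the way trainable_params() expects
model = ToyModel()
print("instance:", model.parameters())
```

If that is really the cause, I would guess the class has to be instantiated first (something like AWD_LSTM(len(dls.vocab), emb_sz, n_hid, n_layers) with suitable sizes), or that language_model_learner(dls, AWD_LSTM, ...) should be used instead of TextLearner, since I believe that convenience function constructs the model from the architecture class for you. Is that the right direction?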