Error after git pull

I just pulled the latest version of fastai, and my code, which worked before, now shows:

 __init__() got an unexpected keyword argument 'fp16'

when I call:

 learn = ConvLearner.pretrained(arch, data[i], precompute=False)
 learn.fit(0.015, 30)

Any idea why? I will try checking out an older version, but the problem is that I don’t know how old my local copy was.
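In the meantime, the git reflog should still know where HEAD was before the pull. A minimal sketch (plain git called through subprocess, run from inside the fastai checkout; HEAD@{1} assumes the pull was the last thing that moved HEAD):

    import subprocess

    # HEAD@{1} is the reflog entry for where HEAD pointed before its last move,
    # i.e. the commit the working copy was on before the pull (assuming nothing
    # else has moved HEAD since then).
    old_commit = subprocess.run(
        ["git", "rev-parse", "HEAD@{1}"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(old_commit)

Running git reflog directly also lists the recent HEAD positions, and git log -1 <commit> shows the date of a given commit.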


ERROR

TypeError                                 Traceback (most recent call last)
<ipython-input-69-a41d6b953eb0> in <module>()
      1 
----> 2 learn.fit(0.015, 30)

~/fastai/courses/dl1/fastai/learner.py in fit(self, lrs, n_cycle, wds, **kwargs)
    285         self.sched = None
    286         layer_opt = self.get_layer_opt(lrs, wds)
--> 287         return self.fit_gen(self.model, self.data, layer_opt, n_cycle, **kwargs)
    288 
    289     def warm_up(self, lr, wds=None):

~/fastai/courses/dl1/fastai/learner.py in fit_gen(self, model, data, layer_opt, n_cycle, cycle_len, cycle_mult, cycle_save_name, best_save_name, use_clr, use_clr_beta, metrics, callbacks, use_wd_sched, norm_wds, wds_sched_mult, use_swa, swa_start, swa_eval_freq, **kwargs)
    232             metrics=metrics, callbacks=callbacks, reg_fn=self.reg_fn, clip=self.clip, fp16=self.fp16,
    233             swa_model=self.swa_model if use_swa else None, swa_start=swa_start,
--> 234             swa_eval_freq=swa_eval_freq, **kwargs)
    235 
    236     def get_layer_groups(self): return self.models.get_layer_groups()

~/fastai/courses/dl1/fastai/model.py in fit(model, data, epochs, opt, crit, metrics, callbacks, **kwargs)
     74         return torch_item(raw_loss.data)
     75 
---> 76     def evaluate(self, xs, y):
     77         preds = self.m(*xs)
     78         if isinstance(preds,tuple): preds=preds[0]

TypeError: __init__() got an unexpected keyword argument 'fp16'
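For what it’s worth, the error itself is the standard Python complaint when a caller passes a keyword that the receiving __init__ does not accept. Judging from the traceback, the freshly pulled learner.py forwards an fp16 flag down to a Stepper that was apparently defined before the flag existed. A toy sketch of the same mismatch (hypothetical, heavily simplified signature, not the real fastai code):

    class Stepper:  # stand-in for an old definition still loaded in the kernel
        def __init__(self, model, opt, crit, clip=0, reg_fn=None):
            self.model, self.opt, self.crit = model, opt, crit

    # A newer caller passes the fp16 flag through **kwargs:
    Stepper(None, None, None, fp16=False)
    # TypeError: __init__() got an unexpected keyword argument 'fp16'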

I went back to 7e08c799618aaf9a84987fae14603e58ea9a2fb2, restarted the kernel, and now it is working.

But since I hadn’t restarted the kernel after the pull, I changed two things at once and am not sure which one fixed it; maybe it was just stale state in the kernel all along. I will go back to master soon.
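My current theory: a running kernel keeps whatever module versions were imported at startup, so files changed by git pull only take effect after a restart (or an explicit reload). A minimal sketch of that behaviour, assuming the fastai package is importable as in the course notebooks; note that reload refreshes only the named module, not the modules it imported, which is how a half-updated mix of old and new code can arise, and why a full kernel restart is the safer fix:

    import importlib
    import fastai.learner

    # Re-executes learner.py from disk, but leaves fastai.model (and everything
    # else already imported) at the version loaded when the kernel started;
    # exactly the kind of mixed state that can raise the error above.
    importlib.reload(fastai.learner)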