Mixup learner prediction throws reduction argument error


(Bharadwaj Srigiriraju) #1

I am on fastai master (development install), and I am hitting an error when I use mixup with a learner and then try to predict from it.

from fastai.vision import *  # assumes `data` is an ImageDataBunch built earlier

learn = create_cnn(data, models.resnet18, metrics=error_rate)
learn.fit_one_cycle(1)
learn_mx = mixup(learn=learn)  # either this or learn.mixup() - both result in the same error
learn_mx.fit_one_cycle(1, max_lr=slice(1e-2))

interp = ClassificationInterpretation.from_learner(learn)

The from_learner call throws an error:

TypeError                                 Traceback (most recent call last)
<ipython-input-22-aa7f7b70a42b> in <module>
----> 1 interp = ClassificationInterpretation.from_learner(learn)

~/projects/fastai/fastai/vision/learner.py in from_learner(cls, learn, ds_type, sigmoid, tta)
    113     def from_learner(cls, learn:Learner, ds_type:DatasetType=DatasetType.Valid, sigmoid:bool=None, tta=False):
    114         "Create an instance of `ClassificationInterpretation`. `tta` indicates if we want to use Test Time Augmentation."
--> 115         preds = learn.TTA(with_loss=True) if tta else learn.get_preds(ds_type=ds_type, with_loss=True)
    116         return cls(learn.data, *preds, sigmoid=sigmoid)
    117 

~/projects/fastai/fastai/basic_train.py in get_preds(self, ds_type, with_loss, n_batch, pbar)
    209         lf = self.loss_func if with_loss else None
    210         return get_preds(self.model, self.dl(ds_type), cb_handler=CallbackHandler(self.callbacks),
--> 211                          activ=_loss_func2activ(self.loss_func), loss_func=lf, n_batch=n_batch, pbar=pbar)
    212 
    213     def pred_batch(self, ds_type:DatasetType=DatasetType.Valid, pbar:Optional[PBar]=None) -> List[Tensor]:

~/projects/fastai/fastai/basic_train.py in get_preds(model, dl, pbar, cb_handler, activ, loss_func, n_batch)
     37     res = [torch.cat(o).cpu() for o in
     38            zip(*validate(model, dl, cb_handler=cb_handler, pbar=pbar, average=False, n_batch=n_batch))]
---> 39     if loss_func is not None: res.append(calc_loss(res[0], res[1], loss_func))
     40     if activ is not None: res[0] = activ(res[0])
     41     return res

~/projects/fastai/fastai/torch_core.py in calc_loss(y_pred, y_true, loss_func)
    205         setattr(loss_func, 'reduction', old_red)
    206         return l
--> 207     else: return loss_func(y_pred, y_true, reduction='none')
    208 
    209 def model_type(dtype):

~/anaconda3/envs/fastai/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    475             result = self._slow_forward(*input, **kwargs)
    476         else:
--> 477             result = self.forward(*input, **kwargs)
    478         for hook in self._forward_hooks.values():
    479             hook_result = hook(self, input, result)

TypeError: forward() got an unexpected keyword argument 'reduction'
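
The traceback makes the failure mode visible: calc_loss first checks for a reduction attribute on the loss function and toggles it via setattr; the loss object mixup installs has no such attribute, so execution falls to the else branch, which passes reduction='none' as a keyword argument to a forward() that does not accept it. Here is a minimal standalone reproduction of that mismatch (WrappedLoss is a hypothetical stand-in for mixup's loss wrapper):

import torch
import torch.nn as nn

class WrappedLoss(nn.Module):
    "Stand-in for a wrapper whose forward() takes no `reduction` kwarg."
    def __init__(self, crit):
        super().__init__()
        self.crit = crit
    def forward(self, output, target):
        return self.crit(output, target)

loss_func = WrappedLoss(nn.CrossEntropyLoss())
y_pred, y_true = torch.randn(4, 3), torch.tensor([0, 1, 2, 0])
loss_func(y_pred, y_true)                    # works
loss_func(y_pred, y_true, reduction='none')  # TypeError: forward() got an unexpected keyword argument 'reduction'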

Is this a bug, or am I supposed to define a custom loss function here that wouldn't throw this error? I have a couple-of-lines fix for this in calc_loss in torch_core (which passes the tests) that gets it working for me.
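
The gist of such a fix (sketched here as a try/except fallback; the details may differ from the actual patch) is to stop assuming that every loss function accepts reduction as a keyword:

def calc_loss(y_pred, y_true, loss_func):
    "Per-item losses; falls back when `loss_func` cannot take `reduction` as a kwarg."
    if hasattr(loss_func, 'reduction'):
        # nn.Module-style losses: toggle the attribute instead of passing a kwarg
        old_red = loss_func.reduction
        loss_func.reduction = 'none'
        l = loss_func(y_pred, y_true)
        loss_func.reduction = old_red
        return l
    try:
        # functional losses such as F.cross_entropy accept the kwarg directly
        return loss_func(y_pred, y_true, reduction='none')
    except TypeError:
        # wrappers whose forward() has no `reduction` kwarg: call them plainly
        return loss_func(y_pred, y_true)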

I wanted to make sure this is a bug before filing an issue and a PR on GitHub.


(jaideep v) #2

I overcame this error by adding an extra parameter to my custom loss function…
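
Concretely, a sketch of that workaround (xent_loss is a hypothetical name): a plain function that accepts and forwards the reduction keyword, so the loss_func(y_pred, y_true, reduction='none') call in calc_loss succeeds:

import torch.nn.functional as F

def xent_loss(input, target, reduction='mean'):
    # accepts the extra `reduction` kwarg that fastai's calc_loss passes
    return F.cross_entropy(input, target, reduction=reduction)

learn.loss_func = xent_loss  # set on the learner before calling from_learner/get_preds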