TypeError: activation() missing 1 required positional argument: 'x'

I’m trying to train a model on a custom dataset whose input tensors have 5 channels.
After creating my dataloaders, I do the following

after which I call learn.summary(),

and get the following error:
TypeError Traceback (most recent call last)

----> 1 learn.summary()


/usr/local/lib/python3.6/dist-packages/fastai/callback/hook.py in summary(self)
    189     "Print a summary of the model, optimizer and loss function."
    190     xb = self.dls.train.one_batch()[:self.dls.train.n_inp]
--> 191     res = module_summary(self, *xb)
    192     res += f"Optimizer used: {self.opt_func}\nLoss function: {self.loss_func}\n\n"
    193     if self.opt is not None:

/usr/local/lib/python3.6/dist-packages/fastai/callback/hook.py in module_summary(learn, *xb)
    164     #  thus are not counted inside the summary
    165     #TODO: find a way to have them counted in param number somehow
--> 166     infos = layer_info(learn, *xb)
    167     n,bs = 64,find_bs(xb)
    168     inp_sz = _print_shapes(apply(lambda x:x.shape, xb), bs)

/usr/local/lib/python3.6/dist-packages/fastai/callback/hook.py in layer_info(learn, *xb)
    150         train_only_cbs = [cb for cb in learn.cbs if hasattr(cb, '_only_train_loop')]
    151         with learn.removed_cbs(train_only_cbs) as l:
--> 152             with l: r = l.get_preds(dl=[batch], inner=True, reorder=False)
    153         return h.stored

/usr/local/lib/python3.6/dist-packages/fastai/learner.py in get_preds(self, ds_idx, dl, with_input, with_decoded, with_loss, act, inner, reorder, cbs, **kwargs)
    236             if act is None: act = getattr(self.loss_func, 'activation', noop)
    237             res = cb.all_tensors()
    238             pred_i = 1 if with_input else 0
    239             if res[pred_i] is not None:
--> 240                 res[pred_i] = act(res[pred_i])
    241                 if with_decoded: res.insert(pred_i+2, getattr(self.loss_func, 'decodes', noop)(res[pred_i]))
    242             if reorder and hasattr(dl, 'get_idxs'): res = nested_reorder(res, tensor(idxs).argsort())

TypeError: activation() missing 1 required positional argument: 'x'

learn.loss_func.activation is defined as follows:
Signature: CrossEntropyLossFlat.activation(x)
Source: def activation(self, x): return F.softmax(x, dim=self.axis)
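That signature is the clue: `activation` is being looked up on the *class* `CrossEntropyLossFlat` rather than on an instance, so `getattr(self.loss_func, 'activation', noop)` returns a plain function that still expects `self`. Calling it with a single tensor fills `self` and leaves `x` missing. A minimal, fastai-free sketch of the same mistake (the `FakeLoss` class here is illustrative, not fastai's real class):

```python
class FakeLoss:
    """Stand-in for a fastai loss class (illustrative only)."""
    def activation(self, x):
        return x  # the real version applies softmax

# Looking the method up on the *class* yields an unbound function...
act = getattr(FakeLoss, 'activation')
try:
    act([1.0, 2.0])  # the one argument fills `self`, so `x` is missing
except TypeError as e:
    print(e)  # activation() missing 1 required positional argument: 'x'

# ...while looking it up on an *instance* yields a bound method, which works:
act = getattr(FakeLoss(), 'activation')
print(act([1.0, 2.0]))
```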


I believe the loss function needs to be instantiated rather than passed as a class.
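A sketch of what that looks like, assuming a standard `Learner` construction (`dls` and `model` stand in for your own dataloaders and model):

```python
from fastai.vision.all import *

# Pass an *instance* of the loss (note the parentheses), not the class itself:
learn = Learner(dls, model, loss_func=CrossEntropyLossFlat())
```

With the instance, `getattr(learn.loss_func, 'activation', noop)` returns a bound method, so `act(preds)` no longer trips over the missing `x` argument.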

Could you please let us know if that solves the problem? :slight_smile:


You were right! Thanks a lot for your help!
