Focal Loss error: no implementation found for 'torch.nn.functional.cross_entropy'

Hi,
I would like to know why when I run FocalLoss for the instance segmentation task:

import torch
import torch.nn as nn
import torch.nn.functional as F

class FocalLoss(nn.Module):
    def __init__(self, gamma=GAMMA):  # GAMMA is a constant defined elsewhere
        super().__init__()
        self.gamma = gamma

    def forward(self, input, target):
        ce_loss = F.cross_entropy(input, target)
        p_t = torch.exp(-ce_loss)
        loss = (1 - p_t) ** self.gamma * ce_loss
        return loss.mean()

I get this error:
TypeError: no implementation found for 'torch.nn.functional.cross_entropy' on types that implement __torch_function__: [<class 'fastai.torch_core.TensorImage'>, <class 'fastai.torch_core.TensorMask'>]

I have read many different topics about this issue, but I still don't have a clear understanding of it. I suppose the problem lies in the F.cross_entropy() function. Has a solution been found for this problem?


Update:
I have found this useful implementation of FocalLoss (plus DiceLoss and CombinedLoss), but it works only with unet_learner() and not with the generic Learner(). How could I adapt the code to Learner()? I am also trying to replace the __call__ method with forward().

Link: https://colab.research.google.com/github/fastai/fastai/blob/master/nbs/01a_losses.ipynb#scrollTo=Yt6xQZFD5rX8


I have the same problem …


Consider using FocalLossFlat()

That works for me


Thanks!
Unfortunately, I still get the same error. I suppose the issue lies in fastai's implementation.

The error says that the inputs to the cross entropy are a fastai TensorImage and a TensorMask. To use torch's cross entropy, I think you would need to convert both of them to plain torch tensors.


Exactly. You can do that, for instance, by wrapping your tensor(s) in fastai's own TensorBase().


Thanks
I have solved it using torch.as_tensor().