Making metrics work for NaNs

I ran into a case recently where I wanted to print precision and F-beta values on a highly imbalanced dataset with a large number of classes. Unlike the sklearn equivalents, the fastai metrics don't seem to have a way of handling cases where these might be NaN, e.g. a precision score of 0/0. I was able to work around this myself by overriding the classes, but it seems like a simple thing to support directly. One way to handle it would be to insert the line `prec[prec != prec] = 0` in the `FBeta` and `Precision` classes to coerce NaN values to 0. If needed, a warning could be emitted when this happens, as sklearn does, e.g. `UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no true samples.` Here's how I did it:

    def _precision(self):
        # Per-class precision: true positives (diagonal) over predicted
        # positives (column sums of the confusion matrix).
        sums = self.cm.sum(dim=0)
        prec = torch.diag(self.cm) / sums
        # Coerce NaNs (from 0/0 when a class is never predicted) to 0;
        # NaN is the only value that compares unequal to itself.
        prec[prec != prec] = 0
        if self.average is None:
            return prec
        else:
            weights = self._weights(avg=self.average)
            return (prec * weights).sum()
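
For anyone who also wants the warning, here is a minimal self-contained sketch of the same idea outside the fastai classes; `precision_from_cm` is just an illustrative helper name, and the warning wording is adapted from sklearn's:

    import warnings
    import torch

    def precision_from_cm(cm: torch.Tensor) -> torch.Tensor:
        """Per-class precision from a confusion matrix, coercing NaNs to 0."""
        prec = torch.diag(cm) / cm.sum(dim=0)
        nan_mask = prec != prec  # NaN never equals itself
        if nan_mask.any():
            warnings.warn("Precision is ill-defined and being set to 0.0 "
                          "in labels with no predicted samples.")
            prec[nan_mask] = 0
        return prec

    # Class 2 is never predicted, so its column sums to 0 and 0/0 gives NaN
    cm = torch.tensor([[5., 1., 0.],
                       [2., 3., 0.],
                       [1., 0., 0.]])
    print(precision_from_cm(cm))  # tensor([0.6250, 0.7500, 0.0000])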