How to use mean average accuracy as a metric in image classification?

Hi all,

I am trying to use mean average accuracy as the metric instead of accuracy, and there is a metric class named Precision that seems to do the job. But when I pass

metrics=Precision

to create_cnn, I get the following error:

AttributeError: 'Precision' object has no attribute 'detach'

What else should I do if I wish to use Precision?

I also tried writing my own mean_accuracy below, but the value is always nan. I do not know the reason yet.

import torch
from torch import Tensor
from fastai.torch_core import Rank0Tensor

def mean_accuracy(input:Tensor, targs:Tensor)->Rank0Tensor:
    "Compute mean accuracy with targs when input is bs * n_classes."
    preds = input.argmax(-1).view(-1).cpu()
    targs = targs.cpu()

    # Confusion matrix: cm[i, j] counts samples with target class i
    # that were predicted as class j.
    n_classes = input.shape[-1]
    x = torch.arange(0, n_classes)
    cm = ((preds == x[:, None]) & (targs == x[:, None, None])).sum(dim=2, dtype=torch.float32)

    # Per-class precision: correct predictions over total predictions of each class.
    prec = torch.diag(cm) / cm.sum(dim=0)
    return prec.mean()

Thanks,
Regards

I think this works: metrics=Precision(average='macro')

You have to instantiate a class to get an object of that class, so here you need to pass Precision() (or, as @AlisonDavey pointed out, Precision(average='macro')) to get the average you want.
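
For reference, a minimal sketch of the fixed call (fastai v1; data is assumed to be an existing ImageDataBunch, and resnet34 is just an example architecture):

from fastai.vision import *

# Precision is a class, so pass an instance, not the class itself.
learn = create_cnn(data, models.resnet34, metrics=[Precision(average='macro')])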

Thanks @AlisonDavey and @sgugger for your advice, it works.

By the way, regarding the mean_accuracy metric I wrote, why does it not work? Am I doing something wrong?

Regards,
liwei

Have you read https://docs.fast.ai/metrics.html#Creating-your-own-metric? The metrics are calculated on each batch, so whenever a batch contains no images of some class you divide by 0 and therefore get nan.
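
To see where the nan comes from, here is a minimal standalone sketch (plain PyTorch, labels made up for illustration) using the same confusion-matrix computation as mean_accuracy. Class 2 appears in the targets but is never predicted, so its column of the confusion matrix sums to 0, and 0/0 gives nan:

import torch

# A batch of 4 samples over 3 classes; class 2 is never predicted.
preds = torch.tensor([0, 1, 0, 1])
targs = torch.tensor([0, 1, 2, 1])
n_classes = 3

x = torch.arange(0, n_classes)
cm = ((preds == x[:, None]) & (targs == x[:, None, None])).sum(dim=2, dtype=torch.float32)
prec = torch.diag(cm) / cm.sum(dim=0)

print(prec)         # tensor([0.5000, 1.0000, nan])
print(prec.mean())  # tensor(nan) -- a single nan poisons the mean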

Precision is a little more complicated because you shouldn't simply take the average over the batches. Fortunately, we have Precision(), so we don't need to rewrite this.
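
For intuition on why batch averaging goes wrong (numbers invented for illustration): suppose a class gets TP=1, FP=0 in one batch (precision 1.0) and TP=1, FP=3 in the next (precision 0.25). Averaging the per-batch values gives (1.0 + 0.25) / 2 = 0.625, whereas the precision over the whole epoch is (1 + 1) / (1 + 0 + 1 + 3) = 0.4. That is why a proper precision metric accumulates the counts across the epoch and divides once at the end.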

Also worth reading is this great post on metrics.

Thank you so much @AlisonDavey. It is much clearer now. By the way, there is another metric class named KappaScore, which seems to do something similar to Precision. What are the pros/cons of KappaScore compared with Precision?

Regards,
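
For what it's worth, KappaScore is used the same way as Precision: instantiate it and pass the instance in metrics. A minimal sketch (fastai v1; data again assumed to be an existing DataBunch), including the weights attribute for the weighted variant:

from fastai.vision import *

# Cohen's kappa between predictions and targets; set weights
# (e.g. "quadratic") for the weighted variant used with ordinal labels.
kappa = KappaScore()
kappa.weights = "quadratic"
learn = create_cnn(data, models.resnet34, metrics=[kappa])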

I tried using the KappaScore metric, but I got this at the end of training one epoch. Is there something obvious that I am missing?

You're not using the latest version of fastai. This bug has been fixed since then.

Thank you! That was indeed the issue.