I am not sure I understand how to implement a fully custom metric in fastai2.
Here is the context.
I am using this Kaggle competition as a playground, and the evaluation metric is
mean column-wise ROC AUC. In other words, the score is the average of the individual AUCs of each predicted column.
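To be concrete, this is how I understand the metric, computed by hand with sklearn on a tiny made-up example (the numbers are arbitrary, just for illustration):

import numpy as np
from sklearn.metrics import roc_auc_score

# 4 samples, 3 classes; one-hot targets and predicted probabilities (made up)
targets = np.array([[1, 0, 0],
                    [0, 1, 0],
                    [0, 0, 1],
                    [1, 0, 0]])
probas = np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.5, 0.2],
                   [0.2, 0.2, 0.6],
                   [0.6, 0.3, 0.1]])

# mean column-wise ROC AUC = average of the AUC of each column taken separately
col_aucs = [roc_auc_score(targets[:, i], probas[:, i]) for i in range(targets.shape[1])]
print(np.mean(col_aucs))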
I implemented the metric using the magic of callbacks, like this:
import torch
import torch.nn.functional as F
from torch import Tensor
from sklearn.metrics import roc_auc_score
# assumes fastai2 is already imported, e.g. from fastai2.vision.all import *

class ColumnWiseRocAuc(Callback):
    def begin_epoch(self):
        # fresh accumulators at the start of each epoch
        self.targets, self.probas = Tensor([]), Tensor([])

    def after_batch(self):
        # turn the raw model outputs into per-class probabilities
        preds = F.softmax(self.pred, dim=1).detach().cpu()
        # one-hot encode the integer targets
        y = self.y[:, None].cpu()
        y_onehot = torch.FloatTensor(len(y), self.dls.c)
        y_onehot.zero_()
        y_onehot.scatter_(1, y, 1)
        # accumulate probabilities and targets across batches
        self.probas = torch.cat((self.probas, preds))
        self.targets = torch.cat((self.targets, y_onehot))
        print(preds[0], y_onehot.shape, self.probas.shape, self.targets.shape)

    def after_epoch(self):
        # average the per-column AUCs
        auc = 0.0
        for i in range(self.dls.c):
            auc += roc_auc_score(self.targets[:, i], self.probas[:, i])
        print(auc / self.dls.c)
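I attach it to the learner like this (learn, dls and the architecture are just from my notebook, nothing special about them):

learn = cnn_learner(dls, resnet34, metrics=accuracy, cbs=ColumnWiseRocAuc())
learn.fit_one_cycle(1)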
which yields the following:
How do I turn the above callback into a proper metric, e.g. have it show up in the progress bar alongside accuracy?
I am looking into AccumMetric, but I am not sure where to go next.
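Here is roughly what I have in mind so far; just a sketch, since I am not sure the to_np/flatten arguments are what I need, and mean_colwise_roc_auc is simply a helper name I made up:

import numpy as np
from sklearn.metrics import roc_auc_score
# assumes fastai2 is already imported, so AccumMetric, cnn_learner etc. are in scope

def mean_colwise_roc_auc(preds, targs):
    # preds: raw model outputs of shape (n_samples, n_classes), targs: integer labels
    # apply softmax here myself so I do not depend on AccumMetric's activation handling
    exps = np.exp(preds - preds.max(axis=1, keepdims=True))
    probas = exps / exps.sum(axis=1, keepdims=True)
    onehot = np.eye(preds.shape[1])[targs]  # one-hot encode the targets
    return np.mean([roc_auc_score(onehot[:, i], probas[:, i])
                    for i in range(preds.shape[1])])

# accumulate preds/targs over the epoch, hand them to the function as numpy
# arrays (to_np=True) and keep the 2D shape (flatten=False) for the column loop
colwise_auc = AccumMetric(mean_colwise_roc_auc, to_np=True, flatten=False)

learn = cnn_learner(dls, resnet34, metrics=[accuracy, colwise_auc])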
Can you guys provide any guidance, please?