How do I add my custom callback-based Metric to the ordinary Metrics list?

Hello people,

I created my own metric (F1) and want to get it into the ordinary metrics list. Is this possible? I want to see my metric in the ordinary progress bar and also use other callbacks that rely on metrics. For instance, I struggled to combine my callback with the save-best callback. I came up with this hacky solution. Is there a proper way to do it?

Thank you,
Johannes

from dataclasses import dataclass

import torch
import sklearn.metrics
from fastai.basic_train import Learner
from fastai.callback import Callback

# currently, I am saving the F1 scores in a global array
f1s = []

@dataclass
class MyCallback(Callback):
    learn:Learner
    name:str='bestmodel'
    best = None

    def on_epoch_begin(self, **kwargs):
        # reset the accumulators, otherwise predictions pile up across epochs
        self.y_pred, self.y_true = [], []

    def on_batch_end(self, last_output, last_target, **kwargs):
        # collect the predicted classes and the targets of this batch
        _, idxs = torch.max(last_output, 1)
        self.y_pred += idxs.tolist()
        self.y_true += last_target.tolist()

    def on_epoch_end(self, **kwargs):
        # note: sklearn expects (y_true, y_pred), in that order
        f1_macro = sklearn.metrics.f1_score(self.y_true, self.y_pred, average='macro')
        f1s.append(f1_macro)

        # save best
        if self.best is None or f1_macro > self.best:
            self.best = f1_macro
            self.learn.save(f'{self.name}')

        # Should I manipulate the learner directly?
        if len(self.learn.recorder.metrics) == 0:
            self.learn.recorder.metrics.append([])
        self.learn.recorder.metrics[0].append(f1_macro)
        if 'f1' not in self.learn.recorder.names:
            self.learn.recorder.names.append('f1')

    def on_train_end(self, **kwargs):
        self.learn.load(f'{self.name}')


learn.callbacks += [MyCallback(learn)]

Hi there! You should look at the docs on metrics, because every metric is a callback behind the scenes. You just have to reset your inner variables in on_epoch_begin, update them in every on_batch_end, then store the final result in self.metric.
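For instance, the F1 metric from the first post could follow that pattern like this (a minimal sketch, assuming the same fastai v1 API; the class name F1Macro is just for illustration):

import sklearn.metrics

class F1Macro(Callback):

    def on_epoch_begin(self, **kwargs):
        # reset the accumulators at the start of each epoch
        self.y_pred, self.y_true = [], []

    def on_batch_end(self, last_output, last_target, **kwargs):
        # accumulate predicted classes and targets
        self.y_pred += last_output.argmax(1).tolist()
        self.y_true += last_target.tolist()

    def on_epoch_end(self, **kwargs):
        # store the final result where fastai picks it up
        self.metric = sklearn.metrics.f1_score(self.y_true, self.y_pred, average='macro')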


Thanks. I should have mentioned that I tried it with self.metric, but to no avail.

Let’s go with the Precision example from the docs. So this code should add Precision to the progress bar? It’s not working for me.

class Precision(Callback):

    def on_epoch_begin(self, **kwargs):
        # reset the counts at the start of each epoch
        self.correct, self.total = 0, 0

    def on_batch_end(self, last_output, last_target, **kwargs):
        # accumulate true positives and predicted positives for class 0
        preds = last_output.argmax(1)
        self.correct += ((preds==0) * (last_target==0)).float().sum()
        self.total += (preds==0).float().sum()

    def on_epoch_end(self, **kwargs):
        # store the final value where fastai expects it
        self.metric = self.correct/self.total

learn.callbacks += [Precision()]

learn.fit(1, 1e-4)

You should pass Precision() in your list of metrics, not in callbacks.
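For example, on an existing Learner, something like this (a minimal sketch):

# pass the metric in the `metrics` list, not in `callbacks`
learn.metrics = [Precision()]
learn.fit(1, 1e-4)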


This makes sense and it works now. Thank you for your help.


Hi! May I ask what the last_output argument in the on_batch_end callback is for a classifier? I know it’s the prediction results, but I notice they don’t add up to 1.

I tried to print(last_output[0].sigmoid()) and the results still don’t add up to 1. They were slightly off.

tensor([0.9866, 0.0042], device='cuda:0')
tensor([0.6430, 0.2726], device='cuda:0')
tensor([0.8823, 0.0764], device='cuda:0')
tensor([0.6292, 0.3144], device='cuda:0')
tensor([0.8878, 0.0808], device='cuda:0')

Thank you!

last_output is the output of the model, but don’t forget that the softmax (not sigmoid) is inside the loss function. If you apply softmax, you should have things that add up to 1 (when n=2, sigmoid and softmax are very similar, but not equal).
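To illustrate with made-up logits (a small sketch, not tied to any particular model):

import torch

logits = torch.tensor([[2.3, -1.1]])  # raw model output for one sample
print(logits.sigmoid())               # element-wise, no reason to sum to 1
print(logits.softmax(dim=1))          # normalized over the classes
print(logits.softmax(dim=1).sum())    # tensor(1.)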

Thanks for the explanation! I’ve been debugging my custom metric for hours…you just saved my day. :laughing:

But now this got me wondering: why doesn’t softmax appear in the model as a non-linear layer, but is instead part of the loss function? I assume that’s generally how PyTorch models work? What’s the thinking behind this design?

It’s just more efficient to do both at the same time, I believe.
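Concretely, PyTorch’s F.cross_entropy fuses log-softmax and the negative log-likelihood into one call, so the model only has to emit raw logits (a small sketch):

import torch
import torch.nn.functional as F

logits = torch.randn(4, 2)           # raw model output, no softmax layer
target = torch.tensor([0, 1, 1, 0])

# the fused call and the two-step version give the same loss
a = F.cross_entropy(logits, target)
b = F.nll_loss(F.log_softmax(logits, dim=1), target)
assert torch.allclose(a, b)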


Got it, thanks! :grin:

Sylvain, this excellent piece of documentation, starting at the bottom of the page you reference:

https://docs.fast.ai/metrics.html#Creating-your-own-metric

is so good that I would love to see it as its own section in the “Tutorials”, named something like “Create your own Callback”. I’m hoping that this will break through my barrier to understanding how to make my own callback, to tell me the last batch number before my GPU crashes, destroying all data with it :smile:

We’re happy to accept any PR adding tutorials :wink:


Hey @fitler, I was also trying to use F1 as a metric, but as a newbie I am having a hard time following the explanation: could you provide the solution line? Maybe then I will understand what you mean. Thanks a lot in advance!

The documentation on metrics no longer has a section for creating custom metrics based on the Metric class. Can you please describe how we can create custom metrics by sub-classing the Metric class?
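In case it helps while the docs are missing that section, here is a minimal sketch of sub-classing fastai v2’s Metric; the import path and the F1Macro name are my assumptions, so please check them against your version:

import sklearn.metrics
from fastai.learner import Metric

class F1Macro(Metric):

    def reset(self):
        # called at the start of validation
        self.preds, self.targs = [], []

    def accumulate(self, learn):
        # called after every batch; learn.pred and learn.y hold the batch results
        self.preds += learn.pred.argmax(dim=-1).tolist()
        self.targs += learn.y.tolist()

    @property
    def value(self):
        # the number shown in the metrics table
        return sklearn.metrics.f1_score(self.targs, self.preds, average='macro')

It is then passed like any built-in metric, e.g. Learner(dls, model, metrics=[F1Macro()]).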