Fastai metrics: is cross entropy implemented?

I was looking at the fastai predefined metrics here and I am missing cross entropy (i.e., log loss for multiclass classification problems).
Has somebody implemented it? I have seen there is a way to implement custom metrics, but I do not understand the instructions very well. Any help would be appreciated.

As you mentioned, cross entropy is usually used as a loss function. It can be viewed as a metric too, but fundamentally it is a loss function. I think the cross entropy function comes from PyTorch; you can find it here: https://pytorch.org/docs/stable/nn.html
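For reference, here is a minimal sketch of how PyTorch's `nn.CrossEntropyLoss` is used on its own (the tensor shapes are just illustrative):

```python
import torch
import torch.nn as nn

loss_fn = nn.CrossEntropyLoss()

preds = torch.randn(4, 3)             # raw logits: batch of 4, 3 classes
targets = torch.tensor([0, 2, 1, 0])  # integer class labels

loss = loss_fn(preds, targets)        # log loss averaged over the batch
print(loss.item())
```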

Hope that helps,


@dhoa do you know, however, how I could add it as a custom metric (or any other sklearn metric)? I am trying to figure out how to do it, but from the documentation it is not clear to me. Thanks a lot in advance.

If you choose cross entropy as your loss function, you can see it directly in fastai during training. If not, you can add it as a metric. I haven’t mastered how to do that yet, but you can look in the docs and see in the forum how people do it. I will try when I have time :D Good luck!
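A minimal sketch of that first option, assuming fastai v1 and a `cnn_learner`; `data` here is a placeholder for an existing DataBunch:

```python
import torch.nn as nn
from fastai.vision import *

# `data` stands in for an existing DataBunch for a classification task.
# With CrossEntropyLoss as the loss function, the train/valid loss printed
# after each epoch is already the cross entropy, so no extra metric is needed.
learn = cnn_learner(data, models.resnet34, loss_func=nn.CrossEntropyLoss())
learn.fit_one_cycle(1)
```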

I think any PyTorch module can be used as a metric, or any function for that matter. So whatever metric you are trying to use, just write it in the form of a PyTorch module, initialize it, and then pass it to the learner. An example of custom metrics is given in lesson 7 of part one, in the Super Resolution notebook. Check it out.
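A hedged sketch of that idea, assuming fastai calls each metric as `metric(preds, targs)` on the validation batches; depending on the fastai version, the metric may need a `__name__` for display, so the initialized module is wrapped in a small named function here (`data` is again a placeholder):

```python
import torch.nn as nn
from fastai.vision import *

# Initialize the PyTorch module once, then expose it as a named function
loss_module = nn.CrossEntropyLoss()

def cross_entropy(preds, targs):
    "Apply the initialized module to each (preds, targs) validation batch."
    return loss_module(preds, targs)

learn = cnn_learner(data, models.resnet34, metrics=[cross_entropy])
```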

Hey @navidpanchi, could you mention the name of the notebook? I cannot find the code lines you are referring to. In my case, I would pass an sklearn function. Thanks!
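For the sklearn case, a hedged sketch of what a wrapper might look like, assuming the metric receives raw logit tensors that have to be converted to numpy probabilities before sklearn's `log_loss` can be called (the function name is just illustrative):

```python
import torch
from sklearn.metrics import log_loss

def sklearn_log_loss(preds, targs):
    "Wrap sklearn's log_loss so it accepts the (preds, targs) tensors fastai passes."
    probs = torch.softmax(preds, dim=1).detach().cpu().numpy()
    y_true = targs.detach().cpu().numpy()
    # `labels` makes sklearn robust to batches that miss some classes
    return torch.tensor(log_loss(y_true, probs, labels=range(probs.shape[1])))
```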