Thank you - for a single mask with labels (0…4), the standard dice coefficient gives values greater than 1. Have you come across this? I assume it is only designed for binary classification.
By standard I presume you mean the one in fastai?
Looking at the code (which does say it's for binary targets), it looks like it's expecting a 2-channel input, so something of shape `[B, 2, H, W]`, B being batch. It takes an argmax of dim 1, which would work with any number of labels, but then it's using
`intersect = (input * targs)` and
`union = (input + targs)`, which will only work if the values are 0/1; any other values produce per-pixel intersect and union terms greater than 1.
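To see it concretely, here's a toy example (plain numpy, with values I picked myself) showing how non-binary labels inflate the score:

```python
import numpy as np

# Toy example with integer class labels instead of 0/1 masks.
# In the binary case each pixel contributes at most 1 to the
# intersect, but a label of 4 contributes 4 * 4 = 16 to the product.
input_ = np.array([0, 1, 2, 4])
targs = np.array([0, 1, 2, 4])

intersect = (input_ * targs).sum()  # 0 + 1 + 4 + 16 = 21
union = (input_ + targs).sum()      # 0 + 2 + 4 + 8 = 14
dice = 2.0 * intersect / union
print(dice)  # 3.0 -- greater than 1, even for a perfect "match"
```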
Thinking about it, dice is only really defined properly for the binary case. What is union supposed to mean when values are multi-class? But you can of course treat multiple labels as multiple binary classification problems and then calculate the mean, which is how dice seems to be defined in such cases. So you'd need to convert your inputs/targets to a sequence of binary values for each class (i.e. one-hot encode them).
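Something like this sketch of the per-class-then-mean approach (a numpy illustration under those assumptions; `multiclass_dice` and `eps` are my own names, not fastai's):

```python
import numpy as np

def multiclass_dice(pred, targ, n_classes, eps=1e-8):
    """Mean of per-class binary dice scores.

    pred/targ are integer label maps of the same shape; each class
    is turned into a 0/1 mask and scored as its own binary problem.
    """
    scores = []
    for c in range(n_classes):
        p = (pred == c).astype(float)
        t = (targ == c).astype(float)
        inter = (p * t).sum()
        scores.append((2 * inter + eps) / (p.sum() + t.sum() + eps))
    return float(np.mean(scores))
```

With identical `pred` and `targ`, every per-class dice is 1, so the mean stays at 1 rather than blowing past it.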
Thank you for the reply. The Severstal Kaggle competition discussed in this thread uses dice for its evaluation, hence I am trying to understand it. You've shared some important points… I'll do a bit more homework to understand the code a bit more.
Note that in Severstal they actually treat each class separately, i.e. given the four classes for each input image, there are 4 rows in the training CSV and you generate 4 separate predictions. So the dice is only defined for a single image/class pair.
This doesn't mean you have to generate predictions like that. If you use a single network for all classes then you're better off creating all 4 predictions at once, but conceptually it is 4 different binary predictions.
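As a sketch of that per image/class scoring (`dice_single` is my own helper name; note the competition metric defines dice as 1 when both the target mask and the prediction are empty):

```python
import numpy as np

def dice_single(pred, targ):
    """Dice for one image/class pair of binary masks.

    Severstal-style convention: if both masks are empty,
    the score is defined as 1.
    """
    p = pred.astype(bool)
    t = targ.astype(bool)
    if not p.any() and not t.any():
        return 1.0
    return 2.0 * (p & t).sum() / (p.sum() + t.sum())

# A network that outputs all 4 classes at once is still scored as
# 4 independent binary masks, one dice per image/class pair.
```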