I need help choosing the right loss for the multi-class logarithmic loss metric in this older Kaggle competition: Microsoft Malware Classification Challenge (BIG 2015) | Kaggle
I have a tabular model set up using BCEWithLogitsLossFlat, but I struggle to tell whether it is the best fit among these options: fastai - Loss Functions
TL;DR: I am still bad at loss functions and picking the right one.
CrossEntropyLossFlat is what you are looking for: it is equivalent to the logarithmic loss listed on Kaggle. BCE would have been fine if the task were binary, i.e., only two possible labels, but that is not the case for this dataset. Cross-entropy loss is simply the generalization of BCE to multi-class classification.
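To make the connection concrete, here is a quick pure-Python sketch of the metric Kaggle computes (the function name and example numbers are my own); this average negative log-probability of the true class is exactly what CrossEntropyLossFlat minimizes:

```python
import math

def multiclass_log_loss(y_true, probs, eps=1e-15):
    """Kaggle-style multi-class logarithmic loss.

    y_true: list of true class indices
    probs:  list of per-sample probability lists (each row sums to 1)
    """
    total = 0.0
    for yi, row in zip(y_true, probs):
        p = min(max(row[yi], eps), 1 - eps)  # clip to avoid log(0)
        total += -math.log(p)
    return total / len(y_true)

# Three samples, three classes (stand-ins for malware families)
y_true = [0, 2, 1]
probs = [
    [0.7, 0.2, 0.1],
    [0.1, 0.2, 0.7],
    [0.2, 0.6, 0.2],
]
print(multiclass_log_loss(y_true, probs))  # ≈ 0.408
```

Confident predictions on the correct class drive the loss toward 0; confident mistakes blow it up, which is why the metric rewards well-calibrated softmax outputs.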
Please let me know if you have more questions.
There is a difference between multi-class and multi-label classification: in the multi-label case there can be more than one true label for each input. BCE is useful for multi-label classification, but that is not the situation here, since each sample belongs to exactly one class.
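A small numeric check of the "generalization" claim (helper names are my own): for two classes, cross-entropy on the probability vector [1 - p, p] gives the same value as BCE on p.

```python
import math

def bce(y, p):
    # binary cross-entropy for one example with true label y in {0, 1}
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def ce(y_idx, probs):
    # multi-class cross-entropy: negative log-probability of the true class
    return -math.log(probs[y_idx])

p = 0.8  # predicted probability of the positive class
print(bce(1, p))          # BCE with true label 1
print(ce(1, [1 - p, p]))  # identical as 2-class cross-entropy
```

This is why switching from BCEWithLogitsLossFlat to CrossEntropyLossFlat loses nothing in the binary case and handles any number of classes.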