I have a tabular model set up using BCEWithLogitsLossFlat but struggle to understand whether it is the best fit among these options: fastai - Loss Functions
TL;DR- I am still horrible at loss functions and choosing the best one.
CrossEntropyLossFlat is what you are looking for - it is equivalent to the loss listed on Kaggle. BCE would have been fine if the task were binary, i.e., only two possible labels, but that is not the case for this dataset. Cross-entropy loss is simply a generalization of BCE to multi-class classification.
There is a difference between multi-class and multi-label classification. In the multi-label case there can be more than one true class or label for each input. BCE is useful for multi-label classification, but that is not the case here.
So the FlattenedLoss of CrossEntropyLoss(), as pointed out by BobMcDear, is what you need. It is the default in fastai for categorical labels.
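To make the distinction concrete, here is a minimal sketch with dummy tensors (the shapes and targets are made up purely for illustration): CrossEntropyLossFlat expects one integer class index per row, while BCEWithLogitsLossFlat expects a float multi-hot matrix with one column per label.

```python
import torch
from fastai.losses import CrossEntropyLossFlat, BCEWithLogitsLossFlat

logits = torch.randn(4, 3)  # batch of 4 rows, 3 classes/labels

# Multi-class (your case): exactly one true class per row -> integer class indices
y_multiclass = torch.tensor([0, 2, 1, 2])
print(CrossEntropyLossFlat()(logits, y_multiclass))

# Multi-label: any number of true labels per row -> float multi-hot targets
y_multilabel = torch.tensor([[1., 0., 1.],
                             [0., 1., 0.],
                             [1., 1., 0.],
                             [0., 0., 1.]])
print(BCEWithLogitsLossFlat()(logits, y_multilabel))
```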
You don't need to do one-hot encoding manually. Just add y_block=CategoryBlock in the TabularPandas block (and y_names="Class"). Then the learner can be just learn = tabular_learner(dls, metrics=accuracy). You can check the selected loss function by running learn.loss_func in a cell. It will be interesting to see if you get similar results.
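A minimal sketch of that setup, assuming your DataFrame is df, your target column is "Class", and cat_names / cont_names hold your feature columns (all of these names are placeholders):

```python
from fastai.tabular.all import *

# df, cat_names, and cont_names are placeholders for your own data
splits = RandomSplitter(seed=42)(range_of(df))
to = TabularPandas(df,
                   procs=[Categorify, FillMissing, Normalize],
                   cat_names=cat_names,
                   cont_names=cont_names,
                   y_names="Class",
                   y_block=CategoryBlock(),   # treat the target as a category
                   splits=splits)
dls = to.dataloaders(bs=64)

learn = tabular_learner(dls, metrics=accuracy)
learn.loss_func   # should show FlattenedLoss of CrossEntropyLoss()
```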
You can also try MSE loss by not treating the label as a category but as a number (don't add y_block=CategoryBlock, and MSE will be the default loss).
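The regression variant, using the same placeholder names as above and assuming the target column is numeric (otherwise fastai will still treat it as categorical):

```python
# No y_block: a numeric target is picked up as a regression target
to_reg = TabularPandas(df,
                       procs=[Categorify, FillMissing, Normalize],
                       cat_names=cat_names,
                       cont_names=cont_names,
                       y_names="Class",
                       splits=splits)
dls_reg = to_reg.dataloaders(bs=64)

learn_reg = tabular_learner(dls_reg, metrics=rmse)
learn_reg.loss_func   # should show FlattenedLoss of MSELoss()
```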