If you look at the definition of get_preds in the basic_train.py file: https://github.com/fastai/fastai/blob/master/fastai/basic_train.py#L333, you’ll see that the final activation function is chosen either from the function you pass as the activ parameter or via the helper function _loss_func2activ. The helper has no mapping from your focal loss to an activation function, so noop (the identity function) is returned.
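To illustrate the fallback behaviour described above, here is a pure-Python sketch (not the actual fastai code; the names LOSS2ACTIV and loss_func2activ are illustrative stand-ins for the real _loss_func2activ helper):

```python
import math

def noop(x):
    # identity: what gets returned when no activation maps to the loss
    return x

def softmax(xs):
    # numerically stable softmax over a list of raw scores
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical, simplified mapping from loss-class name to activation;
# fastai keeps a similar table inside _loss_func2activ.
LOSS2ACTIV = {
    "CrossEntropyLoss": softmax,
}

def loss_func2activ(loss_func):
    # a custom loss such as FocalLoss is absent from the table,
    # so the identity (noop) falls out and raw logits are returned
    return LOSS2ACTIV.get(type(loss_func).__name__, noop)

class FocalLoss:
    """Stand-in for a user-defined focal loss class."""

print(loss_func2activ(FocalLoss()) is noop)  # True: predictions stay as raw logits
```

This is why get_preds hands back raw logits instead of probabilities when a custom focal loss is used.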
thank you sir!
As far as I can see, a PR has been posted on GitHub to deal with this situation. Thank you for your help!
PS: the PR's motivation comes from this URL.
I've read your reply five times, and I have one final question:
If I have to pass a softmax function to get_preds to get probabilities, are the validation focal loss and metric reported during training still correct, the same as when I use the default loss and metric?
Because when I changed the loss from BCE to focal loss, the validation F1 score went from 0.4 to 0.6. That's a big improvement! ... I somewhat doubt the result ...
You need to compare the definitions of the two losses to see what inputs they expect.
Your focal loss expects the raw activations from the last layer as inputs (before any softmax is applied). You can see this because F.log_softmax(input, dim=1) is called on the inputs in the forward function of your loss.
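As an illustration (a pure-Python sketch under the assumption above, not the actual fastai or PyTorch code), a focal loss whose forward applies log-softmax itself must be fed raw logits, so an explicit softmax is needed afterwards to turn those logits into probabilities:

```python
import math

def log_softmax(logits):
    # numerically stable log-softmax, mirroring F.log_softmax(input, dim=1)
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(v - m) for v in logits))
    return [v - log_sum for v in logits]

def focal_loss(logits, target, gamma=2.0):
    # expects RAW logits: the log-softmax happens inside the loss,
    # just like the forward() of the focal loss discussed above
    logp = log_softmax(logits)[target]
    p = math.exp(logp)
    return -((1.0 - p) ** gamma) * logp

def softmax(logits):
    # the activation you would pass to get_preds to recover probabilities
    return [math.exp(v) for v in log_softmax(logits)]

raw = [2.0, -1.0, 0.5]           # model output: raw last-layer activations
loss = focal_loss(raw, target=0) # fine: the loss applies log-softmax itself
probs = softmax(raw)             # what metrics like F1 should be computed on
print(round(sum(probs), 6))      # 1.0
```

Since the softmax lives inside the loss, the model itself never applies one, which is exactly why get_preds returns unnormalized logits unless you supply the activation yourself.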