As you can see, the last number at the top of the image is not a number between 0 and 1. I know this worked for me a few weeks ago, but I'm not sure what has changed since. Does anyone know how to get the probabilities back to being between 0 and 1?
Which loss function are you using? I've noticed that LabelSmoothingCrossEntropy shows the raw final activations there instead of softmax probabilities.
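To illustrate the difference: raw final activations (logits) can be any real numbers, while softmax rescales them into probabilities in [0, 1] that sum to 1. A quick plain-Python sketch (the logit values here are made up for illustration):

```python
import math

# Raw final-layer activations (logits) -- can be any real numbers
logits = [2.0, -1.0, 0.5]

# Softmax: exponentiate, then normalize so the values sum to 1
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Every entry of probs is now in [0, 1], and sum(probs) == 1
```

In PyTorch this is exactly what nn.Softmax (or torch.softmax) does along the chosen dimension.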
That’s it. I am using that function. Does anyone know how to get it to return softmax probabilities? I’m wondering if I can just swap out the loss function before doing the plotting?
You can do

import torch.nn as nn

# Get predictions, targets, and per-item losses from the learner
preds, y, losses = learn.get_preds(with_loss=True)

# Apply softmax manually, since the loss function didn't
softmax = nn.Softmax(dim=-1)
preds = softmax(preds)

# Build the interpretation object from the softmaxed predictions
interp = ClassificationInterpretation(learn, preds, y, losses)
to get softmaxed predictions for now. I think the next release will add an option to specify the activation function in get_preds, so maybe ClassificationInterpretation will get it some day too?
Thanks! I should have thought of that.