Why are the predictions I get from get_preds() and TTA() positive when the final layer in my model is a LogSoftmax?

I am currently doing the part 1 deep learning course from FastAI and tried to replicate the dog breeds classifier on fastai v1.
The first thing I noticed is that create_cnn doesn't automatically add the final softmax layer needed for multiclass classification. Secondly, the default loss function is FlatLoss.

So I edited the model to add an extra LogSoftmax layer and set cross-entropy (CE) loss as the loss function.

Now when I inspect the raw output from the model, it is indeed negative, but the log_preds I get from TTA are positive.


How can the output of LogSoftmax be positive? Has the way TTA works changed in fastai v1? This may be a real noob question, but I would be thankful to be pointed in the right direction.

Fastai v1 is absolutely not compatible with v0.7 code, as it was completely rewritten.
Here your model doesn't have the softmax because it's folded into the loss function (as is often the case with pytorch), but get_preds still returns the predictions (as its name indicates), which is why you see positive numbers: it applies the softmax to the output of the model for you.
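To make the distinction concrete, here is a standalone PyTorch sketch (not fastai's internal code) showing why raw LogSoftmax outputs are always negative, while exponentiating them recovers the positive softmax probabilities that sum to 1:

```python
import torch
import torch.nn.functional as F

# Raw model outputs (logits) for a batch of 2 samples, 3 classes.
logits = torch.tensor([[1.0, 2.0, 0.5],
                       [0.2, -1.0, 3.0]])

# LogSoftmax output: always <= 0, since it's the log of a probability.
log_preds = F.log_softmax(logits, dim=1)
assert (log_preds <= 0).all()

# Exponentiating recovers the softmax probabilities, which are positive
# and sum to 1 per row -- like what get_preds/TTA hand back to you.
preds = log_preds.exp()
assert torch.allclose(preds, F.softmax(logits, dim=1))
assert torch.allclose(preds.sum(dim=1), torch.ones(2))
```

So if your predictions are positive and each row sums to 1, you are looking at probabilities, not log-probabilities.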

Thanks, I get it now. It seems I failed to notice that the outputs of get_preds add up to 1.