Why do confusion matrices calculated in two different ways give different numbers?

I realized that depending on which method you use, you get a different confusion matrix.

For example, if I calculate the confusion matrix via the confusion_matrix function from the sklearn package, I get this:

However, if I use interp.confusion_matrix from fastai, I get this:

Any clue why that is?

Hey @Shahinfar,
I think it's related to the use of TTA (Test Time Augmentation). You used it for the sklearn computation, but in interp the default is tta = False.
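To make the point concrete: a confusion matrix is a deterministic function of (targets, predictions), so if the two matrices disagree, the predictions themselves must differ. Below is a minimal pure-Python sketch (no fastai or sklearn needed) showing how TTA flipping even a couple of borderline predictions changes the matrix; the prediction values and the comments naming learn.get_preds() / learn.tta() are illustrative assumptions, not your actual outputs.

```python
def confusion_matrix(targets, preds, n_classes):
    """Count how often true class i was predicted as class j."""
    m = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(targets, preds):
        m[t][p] += 1
    return m

targets        = [0, 0, 1, 1, 2, 2]
preds_plain    = [0, 1, 1, 1, 2, 2]  # hypothetical preds, e.g. from plain get_preds (no TTA)
preds_with_tta = [0, 0, 1, 1, 2, 1]  # hypothetical preds after TTA averaging flips two samples

print(confusion_matrix(targets, preds_plain, 3))     # [[1, 1, 0], [0, 2, 0], [0, 0, 2]]
print(confusion_matrix(targets, preds_with_tta, 3))  # [[2, 0, 0], [0, 2, 0], [0, 1, 1]]
```

So to get matching matrices, both computations need to be fed the exact same set of predictions (either both with TTA or both without).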


@mhanan Thanks, it's clear now!