I noticed that when I plot the confusion matrix (on my validation set) and sum the numbers that it does not correspond to the number of items in my validation set.
The confusion matrix (array):
[ 0, 65]])
Adding the numbers above: 0 + 50 + 13 + 65 = 128
The number of items in the validation set:
LabelList (140 items)
x: ImageList ...
I would expect these two values to be the same, or am I missing something?
Some fastai functions drop the last partial batch.
This is just a guess about what is happening - you would need to trace the code to be sure.
This seems plausible, as 128 is a multiple of typical batch sizes (128 = 2 × 64 = 4 × 32).
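The arithmetic behind that guess can be sketched in plain Python. This is not the fastai code itself, just a toy `batch_sizes` helper mimicking how a loader with a drop-last option would split 140 items:

```python
def batch_sizes(n_items, batch_size, drop_last=True):
    """Return the per-batch sizes a loader with drop_last would produce."""
    full, remainder = divmod(n_items, batch_size)
    sizes = [batch_size] * full
    if remainder and not drop_last:
        sizes.append(remainder)
    return sizes

# 140 validation items with a batch size of 64:
print(sum(batch_sizes(140, 64)))                   # 128: the partial batch of 12 is dropped
print(sum(batch_sizes(140, 64, drop_last=False)))  # 140: every item is evaluated
```

With a batch size of 64, the 140 items split into two full batches of 64 plus a leftover of 12; dropping that leftover gives exactly the 128 entries seen in the confusion matrix.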
Thanks for your input! Apologies for the late reply. Yes, it does seem the last partial batch gets dropped. All I did was change my batch size to 70, and now my confusion matrix sum matches my validation set size.
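A quick check of why 70 works where 64 did not: the fix relies on the batch size dividing the validation set size evenly, so there is no partial batch to drop. A minimal sketch, assuming the drop-last behavior guessed at above:

```python
n_items = 140  # validation set size from the LabelList above
for bs in (64, 70):
    leftover = n_items % bs
    kept = n_items - leftover  # items evaluated if the partial batch is dropped
    print(f"bs={bs}: {leftover} items dropped, {kept} kept")
# bs=64: 12 items dropped, 128 kept
# bs=70: 0 items dropped, 140 kept
```

Any batch size that divides 140 (e.g. 70, 35, 28, 20) would give the same result.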