I haven’t used the fastai library myself, but the PyTorch DataLoader that fastai uses has a drop_last option. If this is set to True, the loader discards the final mini-batch whenever it is smaller than the configured batch size. That seems to be what is happening here. Perhaps there is an option to turn this off in fastai?
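As a quick sketch of the behaviour in plain PyTorch (not fastai): with 10 samples and a batch size of 3, the last batch holds only 1 sample, so drop_last=True yields 3 batches while the default drop_last=False yields 4.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# A toy dataset of 10 samples; batch size 3 leaves a final batch of 1 sample.
dataset = TensorDataset(torch.arange(10).float())

# drop_last=True: the incomplete final batch is discarded -> 3 batches.
loader_drop = DataLoader(dataset, batch_size=3, drop_last=True)

# drop_last=False (the default): the partial final batch is kept -> 4 batches.
loader_keep = DataLoader(dataset, batch_size=3, drop_last=False)

print(len(loader_drop))  # 3
print(len(loader_keep))  # 4
```

If fastai exposes the underlying DataLoader arguments, passing drop_last=False (or adjusting the batch size so it divides the dataset evenly) should make the missing batch reappear.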