Loss of data using get_preds()

I have noticed that when I use the method get_preds() to run inference on a set of unseen data, some predictions go missing.
This is the code:
test_data = ImageList.from_df(test_df, PATH)
preds, y = learn.get_preds(test_data)

Here test_data contains 1821 examples, but preds only has 1366 rows, so 455 predictions are lost.
I would like to know:
Is it correct to pass test_data as a parameter to get_preds()?
Could this issue be related to DatasetType.Fix, as discussed in this topic: Get_preds returning less results than length of original dataset?
(I am using a batch size of 512)
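For reference, my current understanding (which may be wrong, and is part of my question) is that in fastai v1 get_preds() expects a DatasetType rather than an ItemList, so the test set would first have to be attached to the DataBunch. A sketch of that pattern, reusing the names from my snippet above (test_df, PATH, learn):

```python
from fastai.vision import ImageList
from fastai.basic_data import DatasetType

# Same test set as in my snippet above
test_data = ImageList.from_df(test_df, PATH)

# Attach the test set to the learner's DataBunch instead of
# passing it directly to get_preds()
learn.data.add_test(test_data)

# Ask explicitly for predictions on the test split
preds, y = learn.get_preds(ds_type=DatasetType.Test)

# Sanity check: one prediction per test example, nothing lost
assert len(preds) == len(test_df)
```

Is this the intended usage, and would it explain why passing test_data positionally silently produced predictions for a different (smaller) dataset?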